2026-03-10T06:12:57.265 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T06:12:57.273 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T06:12:57.294 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '926'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.0 '
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm04.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCYGeZ4F2KRHSofCojYJkCYVklcFF+sjjJjzikf2T9nme8wKd3TMzAjSfZasdpqy7KMY0WKz9TBMVId4I2lUGWM=
  vm06.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOtxpIQi7fJACBgXk6G0jr/ZCtVffPAKFL6Hh9ItEkGygL7rhPtt5pZLx2W8pbbbohreIVPwSOTPVVoT5iV6PM4=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.0
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.0
    roleless: true
- print: '**** done end installing v18.2.0 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 1
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay true
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data true --yes-i-really-really-mean-it
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find \"-all$\" then\n log.debug(\"removing default kernel specification: %s\", kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb', nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb', nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task', nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1', nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n"
  variables:
    fail_fs: true
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done
      - ceph versions | jq -e '.mgr | length == 1'
      - ceph versions | jq -e '.mgr | keys' | grep $sha1
      - ceph versions | jq -e '.overall | length == 2'
      - ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '.up_to_date | length == 2'
      - ceph orch ps
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-10T06:12:57.294 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T06:12:57.295 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T06:12:57.295 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T06:12:57.295 INFO:teuthology.task.internal:Checking packages...
2026-03-10T06:12:57.295 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T06:12:57.295 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T06:12:57.295 INFO:teuthology.packaging:ref: None
2026-03-10T06:12:57.295 INFO:teuthology.packaging:tag: None
2026-03-10T06:12:57.295 INFO:teuthology.packaging:branch: squid
2026-03-10T06:12:57.295 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:12:57.295 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-10T06:12:58.024 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-10T06:12:58.025 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T06:12:58.036 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T06:12:58.036 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T06:12:58.038 INFO:teuthology.task.internal:Saving configuration
2026-03-10T06:12:58.046 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T06:12:58.056 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T06:12:58.063 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm04.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 06:11:49.666112', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:04', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCYGeZ4F2KRHSofCojYJkCYVklcFF+sjjJjzikf2T9nme8wKd3TMzAjSfZasdpqy7KMY0WKz9TBMVId4I2lUGWM='}
2026-03-10T06:12:58.067 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm06.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 06:11:49.665583', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:06', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOtxpIQi7fJACBgXk6G0jr/ZCtVffPAKFL6Hh9ItEkGygL7rhPtt5pZLx2W8pbbbohreIVPwSOTPVVoT5iV6PM4='}
2026-03-10T06:12:58.067 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T06:12:58.091 INFO:teuthology.task.internal:roles: ubuntu@vm04.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-10T06:12:58.091 INFO:teuthology.task.internal:roles: ubuntu@vm06.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-10T06:12:58.091 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T06:12:58.096 DEBUG:teuthology.task.console_log:vm04 does not support IPMI; excluding
2026-03-10T06:12:58.101 DEBUG:teuthology.task.console_log:vm06 does not support IPMI; excluding
2026-03-10T06:12:58.101 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f481127a170>, signals=[15])
2026-03-10T06:12:58.101 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T06:12:58.103 INFO:teuthology.task.internal:Opening connections...
2026-03-10T06:12:58.103 DEBUG:teuthology.task.internal:connecting to ubuntu@vm04.local
2026-03-10T06:12:58.103 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T06:12:58.162 DEBUG:teuthology.task.internal:connecting to ubuntu@vm06.local
2026-03-10T06:12:58.163 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T06:12:58.220 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T06:12:58.221 DEBUG:teuthology.orchestra.run.vm04:> uname -m
2026-03-10T06:12:58.267 INFO:teuthology.orchestra.run.vm04.stdout:x86_64
2026-03-10T06:12:58.267 DEBUG:teuthology.orchestra.run.vm04:> cat /etc/os-release
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:NAME="CentOS Stream"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:VERSION="9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:ID="centos"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:ID_LIKE="rhel fedora"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_ID="9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:PLATFORM_ID="platform:el9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:ANSI_COLOR="0;31"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:LOGO="fedora-logo-icon"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:HOME_URL="https://centos.org/"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T06:12:58.321 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T06:12:58.322 INFO:teuthology.lock.ops:Updating vm04.local on lock server
2026-03-10T06:12:58.326 DEBUG:teuthology.orchestra.run.vm06:> uname -m
2026-03-10T06:12:58.340 INFO:teuthology.orchestra.run.vm06.stdout:x86_64
2026-03-10T06:12:58.340 DEBUG:teuthology.orchestra.run.vm06:> cat /etc/os-release
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:NAME="CentOS Stream"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:VERSION="9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:ID="centos"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:ID_LIKE="rhel fedora"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:VERSION_ID="9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:PLATFORM_ID="platform:el9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:ANSI_COLOR="0;31"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:LOGO="fedora-logo-icon"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:HOME_URL="https://centos.org/"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T06:12:58.394 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T06:12:58.394 INFO:teuthology.lock.ops:Updating vm06.local on lock server
2026-03-10T06:12:58.399 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T06:12:58.401 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T06:12:58.401 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T06:12:58.402 DEBUG:teuthology.orchestra.run.vm04:> test '!' -e /home/ubuntu/cephtest
2026-03-10T06:12:58.403 DEBUG:teuthology.orchestra.run.vm06:> test '!' -e /home/ubuntu/cephtest
2026-03-10T06:12:58.449 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T06:12:58.450 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T06:12:58.450 DEBUG:teuthology.orchestra.run.vm04:> test -z $(ls -A /var/lib/ceph)
2026-03-10T06:12:58.457 DEBUG:teuthology.orchestra.run.vm06:> test -z $(ls -A /var/lib/ceph)
2026-03-10T06:12:58.469 INFO:teuthology.orchestra.run.vm04.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T06:12:58.507 INFO:teuthology.orchestra.run.vm06.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T06:12:58.507 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T06:12:58.514 DEBUG:teuthology.orchestra.run.vm04:> test -e /ceph-qa-ready
2026-03-10T06:12:58.528 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:12:58.716 DEBUG:teuthology.orchestra.run.vm06:> test -e /ceph-qa-ready
2026-03-10T06:12:58.729 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:12:58.913 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T06:12:58.914 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T06:12:58.914 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T06:12:58.916 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T06:12:58.930 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T06:12:58.932 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T06:12:58.933 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T06:12:58.933 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T06:12:58.972 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T06:12:58.988 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T06:12:58.990 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-10T06:12:58.990 DEBUG:teuthology.orchestra.run.vm04:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T06:12:59.043 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:12:59.043 DEBUG:teuthology.orchestra.run.vm06:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T06:12:59.057 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:12:59.058 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T06:12:59.085 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T06:12:59.109 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T06:12:59.119 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T06:12:59.128 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T06:12:59.139 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T06:12:59.140 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T06:12:59.142 INFO:teuthology.task.internal:Configuring sudo...
2026-03-10T06:12:59.142 DEBUG:teuthology.orchestra.run.vm04:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T06:12:59.163 DEBUG:teuthology.orchestra.run.vm06:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T06:12:59.208 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T06:12:59.211 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T06:12:59.211 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T06:12:59.228 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T06:12:59.263 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T06:12:59.303 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T06:12:59.359 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:12:59.359 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T06:12:59.416 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T06:12:59.437 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T06:12:59.493 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:12:59.493 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T06:12:59.550 DEBUG:teuthology.orchestra.run.vm04:> sudo service rsyslog restart
2026-03-10T06:12:59.552 DEBUG:teuthology.orchestra.run.vm06:> sudo service rsyslog restart
2026-03-10T06:12:59.575 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T06:12:59.617 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T06:12:59.894 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T06:12:59.896 INFO:teuthology.task.internal:Starting timer...
2026-03-10T06:12:59.896 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T06:12:59.898 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T06:12:59.900 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-10T06:12:59.900 INFO:teuthology.task.selinux:Excluding vm04: VMs are not yet supported
2026-03-10T06:12:59.900 INFO:teuthology.task.selinux:Excluding vm06: VMs are not yet supported
2026-03-10T06:12:59.900 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T06:12:59.900 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T06:12:59.900 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T06:12:59.900 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T06:12:59.902 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T06:12:59.902 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T06:12:59.903 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T06:13:00.382 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T06:13:00.387 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T06:13:00.387 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventory2zuj3ll9 --limit vm04.local,vm06.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T06:14:55.719 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm04.local'), Remote(name='ubuntu@vm06.local')]
2026-03-10T06:14:55.719 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm04.local'
2026-03-10T06:14:55.720 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T06:14:55.784 DEBUG:teuthology.orchestra.run.vm04:> true
2026-03-10T06:14:55.858 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm04.local'
2026-03-10T06:14:55.858 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm06.local'
2026-03-10T06:14:55.858 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T06:14:55.928 DEBUG:teuthology.orchestra.run.vm06:> true
2026-03-10T06:14:56.008 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm06.local'
2026-03-10T06:14:56.008 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T06:14:56.010 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T06:14:56.010 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T06:14:56.010 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T06:14:56.012 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T06:14:56.012 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T06:14:56.054 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T06:14:56.073 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T06:14:56.088 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T06:14:56.104 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T06:14:56.109 INFO:teuthology.orchestra.run.vm04.stderr:sudo: ntpd: command not found
2026-03-10T06:14:56.124 INFO:teuthology.orchestra.run.vm04.stdout:506 Cannot talk to daemon
2026-03-10T06:14:56.137 INFO:teuthology.orchestra.run.vm06.stderr:sudo: ntpd: command not found
2026-03-10T06:14:56.146 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T06:14:56.152 INFO:teuthology.orchestra.run.vm06.stdout:506 Cannot talk to daemon
2026-03-10T06:14:56.162 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T06:14:56.166 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T06:14:56.180 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T06:14:56.212 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-10T06:14:56.217 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T06:14:56.217 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-10T06:14:56.228 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found
2026-03-10T06:14:56.230 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T06:14:56.230 INFO:teuthology.orchestra.run.vm06.stdout:===============================================================================
2026-03-10T06:14:56.231 INFO:teuthology.run_tasks:Running task install...
2026-03-10T06:14:56.232 DEBUG:teuthology.task.install:project ceph
2026-03-10T06:14:56.232 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T06:14:56.233 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.0', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T06:14:56.233 INFO:teuthology.task.install:Using flavor: default
2026-03-10T06:14:56.235 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T06:14:56.235 INFO:teuthology.task.install:extra packages: []
2026-03-10T06:14:56.235 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T06:14:56.235 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:14:56.235 INFO:teuthology.packaging:ref: None
2026-03-10T06:14:56.235 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:14:56.235 INFO:teuthology.packaging:branch: None
2026-03-10T06:14:56.235 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:14:56.816 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.0^{} -> 5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T06:14:56.816 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T06:14:56.817 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T06:14:56.817 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:14:56.817 INFO:teuthology.packaging:ref: None
2026-03-10T06:14:56.817 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:14:56.817 INFO:teuthology.packaging:branch: None
2026-03-10T06:14:56.817 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:14:56.817 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T06:14:57.404 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T06:14:57.404 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T06:14:57.437 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T06:14:57.437 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T06:14:57.763 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch]
name=ceph noarch packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-source]
name=ceph source packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-10T06:14:57.763 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:14:57.763 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-10T06:14:57.797 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-10T06:14:57.797 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:14:57.797 INFO:teuthology.packaging:ref: None
2026-03-10T06:14:57.797 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:14:57.797 INFO:teuthology.packaging:branch: None
2026-03-10T06:14:57.798 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:14:57.798 DEBUG:teuthology.orchestra.run.vm06:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-10T06:14:57.812 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch]
name=ceph noarch packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-source]
name=ceph source packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-10T06:14:57.812 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:14:57.813 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-10T06:14:57.857 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-10T06:14:57.857 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:14:57.857 INFO:teuthology.packaging:ref: None
2026-03-10T06:14:57.857 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:14:57.857 INFO:teuthology.packaging:branch: None
2026-03-10T06:14:57.857 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:14:57.857 DEBUG:teuthology.orchestra.run.vm04:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-10T06:14:57.874 DEBUG:teuthology.orchestra.run.vm06:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-10T06:14:57.940 DEBUG:teuthology.orchestra.run.vm04:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-10T06:14:57.968 DEBUG:teuthology.orchestra.run.vm06:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-10T06:14:58.003 INFO:teuthology.orchestra.run.vm06.stdout:check_obsoletes = 1
2026-03-10T06:14:58.004 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all
2026-03-10T06:14:58.038 DEBUG:teuthology.orchestra.run.vm04:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-10T06:14:58.074 INFO:teuthology.orchestra.run.vm04.stdout:check_obsoletes = 1
2026-03-10T06:14:58.076 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-10T06:14:58.229 INFO:teuthology.orchestra.run.vm06.stdout:41 files removed
2026-03-10T06:14:58.248 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-10T06:14:58.294 INFO:teuthology.orchestra.run.vm04.stdout:41 files removed
2026-03-10T06:14:58.326 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-10T06:14:59.297 INFO:teuthology.orchestra.run.vm06.stdout:ceph packages for x86_64 94 kB/s | 76 kB 00:00
2026-03-10T06:14:59.363 INFO:teuthology.orchestra.run.vm04.stdout:ceph packages for x86_64 94 kB/s | 76 kB 00:00
2026-03-10T06:14:59.950 INFO:teuthology.orchestra.run.vm06.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00
2026-03-10T06:15:00.003 INFO:teuthology.orchestra.run.vm04.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00
2026-03-10T06:15:00.588 INFO:teuthology.orchestra.run.vm06.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00
2026-03-10T06:15:00.632 INFO:teuthology.orchestra.run.vm04.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00
2026-03-10T06:15:01.215 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - BaseOS 15 MB/s | 8.9 MB 00:00
2026-03-10T06:15:02.155 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - BaseOS 5.9 MB/s | 8.9 MB 00:01
2026-03-10T06:15:03.786 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - AppStream 16 MB/s | 27 MB 00:01
2026-03-10T06:15:04.907 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - AppStream 14 MB/s | 27 MB 00:01
2026-03-10T06:15:08.353 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - CRB 7.3 MB/s | 8.0 MB 00:01
2026-03-10T06:15:09.832 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - Extras packages 45 kB/s | 20 kB 00:00
2026-03-10T06:15:09.988 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - CRB 4.9 MB/s | 8.0 MB 00:01
2026-03-10T06:15:11.284 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - Extras packages 90 kB/s | 20 kB 00:00
2026-03-10T06:15:11.419 INFO:teuthology.orchestra.run.vm06.stdout:Extra Packages for Enterprise Linux 14 MB/s | 20 MB 00:01
2026-03-10T06:15:12.522 INFO:teuthology.orchestra.run.vm04.stdout:Extra Packages for Enterprise Linux 18 MB/s | 20 MB 00:01
2026-03-10T06:15:16.230 INFO:teuthology.orchestra.run.vm06.stdout:lab-extras 65 kB/s | 50 kB 00:00
2026-03-10T06:15:17.649 INFO:teuthology.orchestra.run.vm06.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T06:15:17.649 INFO:teuthology.orchestra.run.vm06.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T06:15:17.653 INFO:teuthology.orchestra.run.vm06.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-10T06:15:17.654 INFO:teuthology.orchestra.run.vm06.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-10T06:15:17.683 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout:Installing:
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k
2026-03-10T06:15:17.687 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout:Upgrading:
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout:Installing dependencies:
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T06:15:17.688 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T06:15:17.689 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Installing weak dependencies:
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Install 117 Packages
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Upgrade 2 Packages
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Total download size: 182 M
2026-03-10T06:15:17.690 INFO:teuthology.orchestra.run.vm06.stdout:Downloading Packages:
2026-03-10T06:15:18.106 INFO:teuthology.orchestra.run.vm04.stdout:lab-extras 65 kB/s | 50 kB 00:00
2026-03-10T06:15:19.335 INFO:teuthology.orchestra.run.vm06.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-10T06:15:19.557 INFO:teuthology.orchestra.run.vm04.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T06:15:19.558 INFO:teuthology.orchestra.run.vm04.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T06:15:19.563 INFO:teuthology.orchestra.run.vm04.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-10T06:15:19.563 INFO:teuthology.orchestra.run.vm04.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-10T06:15:19.591 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout:Installing:
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k
2026-03-10T06:15:19.595 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout:Upgrading:
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout:Installing dependencies:
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T06:15:19.596 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T06:15:19.597
INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-10T06:15:19.597 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-10T06:15:19.598 
INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-10T06:15:19.598 
INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout:Installing weak dependencies: 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout:Install 117 Packages 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout:Upgrade 2 Packages 2026-03-10T06:15:19.598 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:19.599 INFO:teuthology.orchestra.run.vm04.stdout:Total download size: 182 M 2026-03-10T06:15:19.599 INFO:teuthology.orchestra.run.vm04.stdout:Downloading Packages: 2026-03-10T06:15:19.971 INFO:teuthology.orchestra.run.vm06.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 1.3 MB/s | 835 kB 00:00 2026-03-10T06:15:20.072 INFO:teuthology.orchestra.run.vm06.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.4 MB/s | 142 kB 00:00 2026-03-10T06:15:20.223 INFO:teuthology.orchestra.run.vm04.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00 2026-03-10T06:15:20.249 INFO:teuthology.orchestra.run.vm06.stdout:(4/119): ceph-base-18.2.0-0.el9.x86_64.rpm 4.3 MB/s | 5.2 MB 00:01 2026-03-10T06:15:20.474 INFO:teuthology.orchestra.run.vm06.stdout:(5/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 5.2 MB/s | 2.1 MB 
00:00 2026-03-10T06:15:20.848 INFO:teuthology.orchestra.run.vm06.stdout:(6/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 2.4 MB/s | 1.4 MB 00:00 2026-03-10T06:15:21.080 INFO:teuthology.orchestra.run.vm06.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 7.3 MB/s | 4.4 MB 00:00 2026-03-10T06:15:21.248 INFO:teuthology.orchestra.run.vm04.stdout:(2/119): ceph-base-18.2.0-0.el9.x86_64.rpm 3.9 MB/s | 5.2 MB 00:01 2026-03-10T06:15:21.364 INFO:teuthology.orchestra.run.vm04.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.2 MB/s | 142 kB 00:00 2026-03-10T06:15:21.770 INFO:teuthology.orchestra.run.vm04.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 5.2 MB/s | 2.1 MB 00:00 2026-03-10T06:15:22.074 INFO:teuthology.orchestra.run.vm04.stdout:(5/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 4.8 MB/s | 1.4 MB 00:00 2026-03-10T06:15:22.192 INFO:teuthology.orchestra.run.vm06.stdout:(8/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 6.9 MB/s | 7.6 MB 00:01 2026-03-10T06:15:22.293 INFO:teuthology.orchestra.run.vm06.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 241 kB/s | 24 kB 00:00 2026-03-10T06:15:22.686 INFO:teuthology.orchestra.run.vm04.stdout:(6/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 7.2 MB/s | 4.4 MB 00:00 2026-03-10T06:15:22.964 INFO:teuthology.orchestra.run.vm06.stdout:(10/119): ceph-common-18.2.0-0.el9.x86_64.rpm 4.6 MB/s | 18 MB 00:03 2026-03-10T06:15:22.983 INFO:teuthology.orchestra.run.vm04.stdout:(7/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 302 kB/s | 835 kB 00:02 2026-03-10T06:15:23.063 INFO:teuthology.orchestra.run.vm06.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 307 kB/s | 30 kB 00:00 2026-03-10T06:15:23.264 INFO:teuthology.orchestra.run.vm06.stdout:(12/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 653 kB 00:00 2026-03-10T06:15:23.404 INFO:teuthology.orchestra.run.vm06.stdout:(13/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 6.9 MB/s | 18 MB 00:02 2026-03-10T06:15:23.405 INFO:teuthology.orchestra.run.vm06.stdout:(14/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.1 
MB/s | 161 kB 00:00 2026-03-10T06:15:23.507 INFO:teuthology.orchestra.run.vm06.stdout:(15/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00 2026-03-10T06:15:23.513 INFO:teuthology.orchestra.run.vm06.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 4.3 MB/s | 474 kB 00:00 2026-03-10T06:15:23.615 INFO:teuthology.orchestra.run.vm06.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 437 kB/s | 45 kB 00:00 2026-03-10T06:15:23.716 INFO:teuthology.orchestra.run.vm06.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00 2026-03-10T06:15:23.842 INFO:teuthology.orchestra.run.vm06.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 155 kB 00:00 2026-03-10T06:15:23.843 INFO:teuthology.orchestra.run.vm04.stdout:(8/119): ceph-common-18.2.0-0.el9.x86_64.rpm 4.7 MB/s | 18 MB 00:03 2026-03-10T06:15:23.944 INFO:teuthology.orchestra.run.vm04.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 237 kB/s | 24 kB 00:00 2026-03-10T06:15:23.945 INFO:teuthology.orchestra.run.vm06.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 321 kB 00:00 2026-03-10T06:15:24.050 INFO:teuthology.orchestra.run.vm06.stdout:(21/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00 2026-03-10T06:15:24.150 INFO:teuthology.orchestra.run.vm06.stdout:(22/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 995 kB/s | 99 kB 00:00 2026-03-10T06:15:24.250 INFO:teuthology.orchestra.run.vm06.stdout:(23/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 860 kB/s | 86 kB 00:00 2026-03-10T06:15:24.311 INFO:teuthology.orchestra.run.vm06.stdout:(24/119): librgw2-18.2.0-0.el9.x86_64.rpm 5.5 MB/s | 4.4 MB 00:00 2026-03-10T06:15:24.412 INFO:teuthology.orchestra.run.vm06.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00 2026-03-10T06:15:24.512 INFO:teuthology.orchestra.run.vm06.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 
233 kB/s | 23 kB 00:00 2026-03-10T06:15:24.613 INFO:teuthology.orchestra.run.vm06.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 1.2 MB/s | 127 kB 00:00 2026-03-10T06:15:24.853 INFO:teuthology.orchestra.run.vm06.stdout:(28/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 4.9 MB/s | 3.0 MB 00:00 2026-03-10T06:15:25.012 INFO:teuthology.orchestra.run.vm06.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 4.3 MB/s | 1.7 MB 00:00 2026-03-10T06:15:25.114 INFO:teuthology.orchestra.run.vm06.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00 2026-03-10T06:15:25.215 INFO:teuthology.orchestra.run.vm06.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 475 kB/s | 47 kB 00:00 2026-03-10T06:15:25.315 INFO:teuthology.orchestra.run.vm06.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 146 kB/s | 15 kB 00:00 2026-03-10T06:15:25.417 INFO:teuthology.orchestra.run.vm06.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00 2026-03-10T06:15:25.612 INFO:teuthology.orchestra.run.vm06.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 207 kB/s | 40 kB 00:00 2026-03-10T06:15:25.698 INFO:teuthology.orchestra.run.vm04.stdout:(10/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 5.8 MB/s | 18 MB 00:03 2026-03-10T06:15:25.752 INFO:teuthology.orchestra.run.vm06.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 514 kB/s | 72 kB 00:00 2026-03-10T06:15:25.798 INFO:teuthology.orchestra.run.vm04.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 306 kB/s | 30 kB 00:00 2026-03-10T06:15:26.014 INFO:teuthology.orchestra.run.vm06.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 3.0 MB/s | 794 kB 00:00 2026-03-10T06:15:26.107 INFO:teuthology.orchestra.run.vm06.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 1.9 MB/s | 184 kB 00:00 2026-03-10T06:15:26.172 INFO:teuthology.orchestra.run.vm06.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 515 kB/s | 33 kB 00:00 2026-03-10T06:15:26.196 INFO:teuthology.orchestra.run.vm04.stdout:(12/119): 
libcephfs2-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 653 kB 00:00 2026-03-10T06:15:26.255 INFO:teuthology.orchestra.run.vm06.stdout:(39/119): ceph-mgr-diskprediction-local-18.2.0- 5.3 MB/s | 7.4 MB 00:01 2026-03-10T06:15:26.297 INFO:teuthology.orchestra.run.vm04.stdout:(13/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 161 kB 00:00 2026-03-10T06:15:26.306 INFO:teuthology.orchestra.run.vm06.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 1.8 MB/s | 253 kB 00:00 2026-03-10T06:15:26.394 INFO:teuthology.orchestra.run.vm06.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 1.2 MB/s | 106 kB 00:00 2026-03-10T06:15:26.398 INFO:teuthology.orchestra.run.vm04.stdout:(14/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00 2026-03-10T06:15:26.483 INFO:teuthology.orchestra.run.vm06.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 1.5 MB/s | 135 kB 00:00 2026-03-10T06:15:26.577 INFO:teuthology.orchestra.run.vm06.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 1.3 MB/s | 126 kB 00:00 2026-03-10T06:15:26.597 INFO:teuthology.orchestra.run.vm04.stdout:(15/119): libradosstriper1-18.2.0-0.el9.x86_64. 
2.3 MB/s | 474 kB 00:00 2026-03-10T06:15:26.681 INFO:teuthology.orchestra.run.vm06.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 2.0 MB/s | 218 kB 00:00 2026-03-10T06:15:26.739 INFO:teuthology.orchestra.run.vm06.stdout:(45/119): python3-cryptography-36.0.1-5.el9.x86 2.6 MB/s | 1.2 MB 00:00 2026-03-10T06:15:26.850 INFO:teuthology.orchestra.run.vm06.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 268 kB/s | 30 kB 00:00 2026-03-10T06:15:26.894 INFO:teuthology.orchestra.run.vm06.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 491 kB/s | 104 kB 00:00 2026-03-10T06:15:26.935 INFO:teuthology.orchestra.run.vm06.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 361 kB/s | 15 kB 00:00 2026-03-10T06:15:27.030 INFO:teuthology.orchestra.run.vm06.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.7 MB/s | 160 kB 00:00 2026-03-10T06:15:27.062 INFO:teuthology.orchestra.run.vm06.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.4 MB/s | 45 kB 00:00 2026-03-10T06:15:27.224 INFO:teuthology.orchestra.run.vm04.stdout:(16/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 1.8 MB/s | 7.6 MB 00:04 2026-03-10T06:15:27.324 INFO:teuthology.orchestra.run.vm04.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 451 kB/s | 45 kB 00:00 2026-03-10T06:15:27.424 INFO:teuthology.orchestra.run.vm04.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00 2026-03-10T06:15:27.453 INFO:teuthology.orchestra.run.vm06.stdout:(51/119): librdkafka-1.6.1-102.el9.x86_64.rpm 1.7 MB/s | 662 kB 00:00 2026-03-10T06:15:27.525 INFO:teuthology.orchestra.run.vm04.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00 2026-03-10T06:15:27.599 INFO:teuthology.orchestra.run.vm04.stdout:(20/119): librgw2-18.2.0-0.el9.x86_64.rpm 4.4 MB/s | 4.4 MB 00:01 2026-03-10T06:15:27.609 INFO:teuthology.orchestra.run.vm06.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 1.6 MB/s | 246 kB 00:00 2026-03-10T06:15:27.627 
INFO:teuthology.orchestra.run.vm04.stdout:(21/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00 2026-03-10T06:15:27.706 INFO:teuthology.orchestra.run.vm04.stdout:(22/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.7 MB/s | 297 kB 00:00 2026-03-10T06:15:27.727 INFO:teuthology.orchestra.run.vm04.stdout:(23/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 995 kB/s | 99 kB 00:00 2026-03-10T06:15:27.768 INFO:teuthology.orchestra.run.vm06.stdout:(53/119): libxslt-1.1.34-12.el9.x86_64.rpm 1.4 MB/s | 233 kB 00:00 2026-03-10T06:15:27.806 INFO:teuthology.orchestra.run.vm04.stdout:(24/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 859 kB/s | 86 kB 00:00 2026-03-10T06:15:27.908 INFO:teuthology.orchestra.run.vm04.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00 2026-03-10T06:15:27.939 INFO:teuthology.orchestra.run.vm06.stdout:(54/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 1.7 MB/s | 292 kB 00:00 2026-03-10T06:15:27.972 INFO:teuthology.orchestra.run.vm06.stdout:(55/119): openblas-0.3.29-1.el9.x86_64.rpm 1.3 MB/s | 42 kB 00:00 2026-03-10T06:15:28.008 INFO:teuthology.orchestra.run.vm04.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 231 kB/s | 23 kB 00:00 2026-03-10T06:15:28.109 INFO:teuthology.orchestra.run.vm04.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 
1.2 MB/s | 127 kB 00:00 2026-03-10T06:15:28.365 INFO:teuthology.orchestra.run.vm06.stdout:(56/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 2.0 MB/s | 3.0 MB 00:01 2026-03-10T06:15:28.421 INFO:teuthology.orchestra.run.vm04.stdout:(28/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 4.3 MB/s | 3.0 MB 00:00 2026-03-10T06:15:28.511 INFO:teuthology.orchestra.run.vm04.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 4.2 MB/s | 1.7 MB 00:00 2026-03-10T06:15:28.613 INFO:teuthology.orchestra.run.vm04.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00 2026-03-10T06:15:28.714 INFO:teuthology.orchestra.run.vm04.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 473 kB/s | 47 kB 00:00 2026-03-10T06:15:28.814 INFO:teuthology.orchestra.run.vm04.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 146 kB/s | 15 kB 00:00 2026-03-10T06:15:28.918 INFO:teuthology.orchestra.run.vm04.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00 2026-03-10T06:15:29.060 INFO:teuthology.orchestra.run.vm04.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 285 kB/s | 40 kB 00:00 2026-03-10T06:15:29.165 INFO:teuthology.orchestra.run.vm04.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 686 kB/s | 72 kB 00:00 2026-03-10T06:15:29.304 INFO:teuthology.orchestra.run.vm06.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 4.0 MB/s | 5.3 MB 00:01 2026-03-10T06:15:29.348 INFO:teuthology.orchestra.run.vm04.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 4.2 MB/s | 794 kB 00:00 2026-03-10T06:15:29.368 INFO:teuthology.orchestra.run.vm06.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 3.8 MB/s | 244 kB 00:00 2026-03-10T06:15:29.431 INFO:teuthology.orchestra.run.vm06.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 3.8 MB/s | 249 kB 00:00 2026-03-10T06:15:29.463 INFO:teuthology.orchestra.run.vm04.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 1.6 MB/s | 184 kB 00:00 2026-03-10T06:15:29.464 
INFO:teuthology.orchestra.run.vm06.stdout:(60/119): python3-jmespath-1.0.1-1.el9.noarch.r 1.4 MB/s | 48 kB 00:00 2026-03-10T06:15:29.491 INFO:teuthology.orchestra.run.vm04.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 1.2 MB/s | 33 kB 00:00 2026-03-10T06:15:29.497 INFO:teuthology.orchestra.run.vm06.stdout:(61/119): python3-libstoragemgmt-1.10.1-1.el9.x 5.2 MB/s | 177 kB 00:00 2026-03-10T06:15:29.525 INFO:teuthology.orchestra.run.vm04.stdout:(39/119): ceph-mgr-diskprediction-local-18.2.0- 6.7 MB/s | 7.4 MB 00:01 2026-03-10T06:15:29.531 INFO:teuthology.orchestra.run.vm06.stdout:(62/119): python3-mako-1.1.4-6.el9.noarch.rpm 5.1 MB/s | 172 kB 00:00 2026-03-10T06:15:29.575 INFO:teuthology.orchestra.run.vm04.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 2.9 MB/s | 253 kB 00:00 2026-03-10T06:15:29.579 INFO:teuthology.orchestra.run.vm06.stdout:(63/119): python3-markupsafe-1.1.1-12.el9.x86_6 725 kB/s | 35 kB 00:00 2026-03-10T06:15:29.743 INFO:teuthology.orchestra.run.vm04.stdout:(41/119): python3-cryptography-36.0.1-5.el9.x86 5.7 MB/s | 1.2 MB 00:00 2026-03-10T06:15:29.787 INFO:teuthology.orchestra.run.vm04.stdout:(42/119): python3-ply-3.11-14.el9.noarch.rpm 505 kB/s | 106 kB 00:00 2026-03-10T06:15:29.845 INFO:teuthology.orchestra.run.vm04.stdout:(43/119): python3-pycparser-2.20-6.el9.noarch.r 1.3 MB/s | 135 kB 00:00 2026-03-10T06:15:30.026 INFO:teuthology.orchestra.run.vm04.stdout:(44/119): python3-requests-2.25.1-10.el9.noarch 529 kB/s | 126 kB 00:00 2026-03-10T06:15:30.096 INFO:teuthology.orchestra.run.vm04.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 869 kB/s | 218 kB 00:00 2026-03-10T06:15:30.256 INFO:teuthology.orchestra.run.vm04.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 187 kB/s | 30 kB 00:00 2026-03-10T06:15:30.349 INFO:teuthology.orchestra.run.vm04.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 323 kB/s | 104 kB 00:00 2026-03-10T06:15:30.436 INFO:teuthology.orchestra.run.vm04.stdout:(48/119): 
flexiblas-openblas-openmp-3.0.4-9.el9 171 kB/s | 15 kB 00:00 2026-03-10T06:15:30.573 INFO:teuthology.orchestra.run.vm04.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.2 MB/s | 160 kB 00:00 2026-03-10T06:15:30.633 INFO:teuthology.orchestra.run.vm04.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 733 kB/s | 45 kB 00:00 2026-03-10T06:15:30.780 INFO:teuthology.orchestra.run.vm06.stdout:(64/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 5.1 MB/s | 6.1 MB 00:01 2026-03-10T06:15:30.783 INFO:teuthology.orchestra.run.vm04.stdout:(51/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 5.7 MB/s | 3.0 MB 00:00 2026-03-10T06:15:30.807 INFO:teuthology.orchestra.run.vm04.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 3.7 MB/s | 662 kB 00:00 2026-03-10T06:15:30.878 INFO:teuthology.orchestra.run.vm04.stdout:(53/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 2.6 MB/s | 246 kB 00:00 2026-03-10T06:15:30.880 INFO:teuthology.orchestra.run.vm04.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 3.1 MB/s | 233 kB 00:00 2026-03-10T06:15:30.964 INFO:teuthology.orchestra.run.vm04.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 3.3 MB/s | 292 kB 00:00 2026-03-10T06:15:31.049 INFO:teuthology.orchestra.run.vm04.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 250 kB/s | 42 kB 00:00 2026-03-10T06:15:31.087 INFO:teuthology.orchestra.run.vm06.stdout:(65/119): ceph-test-18.2.0-0.el9.x86_64.rpm 4.5 MB/s | 40 MB 00:08 2026-03-10T06:15:31.125 INFO:teuthology.orchestra.run.vm06.stdout:(66/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 1.3 MB/s | 442 kB 00:00 2026-03-10T06:15:31.367 INFO:teuthology.orchestra.run.vm06.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 564 kB/s | 157 kB 00:00 2026-03-10T06:15:31.368 INFO:teuthology.orchestra.run.vm04.stdout:(57/119): ceph-test-18.2.0-0.el9.x86_64.rpm 5.3 MB/s | 40 MB 00:07 2026-03-10T06:15:31.412 INFO:teuthology.orchestra.run.vm04.stdout:(58/119): openblas-openmp-0.3.29-1.el9.x86_64.r 12 MB/s | 5.3 MB 00:00 2026-03-10T06:15:31.420 
INFO:teuthology.orchestra.run.vm06.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 942 kB/s | 277 kB 00:00 2026-03-10T06:15:31.447 INFO:teuthology.orchestra.run.vm06.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 668 kB/s | 54 kB 00:00 2026-03-10T06:15:31.479 INFO:teuthology.orchestra.run.vm04.stdout:(59/119): python3-babel-2.9.1-2.el9.noarch.rpm 14 MB/s | 6.0 MB 00:00 2026-03-10T06:15:31.482 INFO:teuthology.orchestra.run.vm04.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 3.5 MB/s | 249 kB 00:00 2026-03-10T06:15:31.484 INFO:teuthology.orchestra.run.vm06.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 1.1 MB/s | 42 kB 00:00 2026-03-10T06:15:31.537 INFO:teuthology.orchestra.run.vm04.stdout:(61/119): python3-libstoragemgmt-1.10.1-1.el9.x 3.1 MB/s | 177 kB 00:00 2026-03-10T06:15:31.538 INFO:teuthology.orchestra.run.vm04.stdout:(62/119): python3-jmespath-1.0.1-1.el9.noarch.r 810 kB/s | 48 kB 00:00 2026-03-10T06:15:31.576 INFO:teuthology.orchestra.run.vm06.stdout:(71/119): python3-babel-2.9.1-2.el9.noarch.rpm 1.9 MB/s | 6.0 MB 00:03 2026-03-10T06:15:31.593 INFO:teuthology.orchestra.run.vm04.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 3.1 MB/s | 172 kB 00:00 2026-03-10T06:15:31.594 INFO:teuthology.orchestra.run.vm04.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 618 kB/s | 35 kB 00:00 2026-03-10T06:15:31.609 INFO:teuthology.orchestra.run.vm06.stdout:(72/119): socat-1.7.4.1-8.el9.x86_64.rpm 2.4 MB/s | 303 kB 00:00 2026-03-10T06:15:31.624 INFO:teuthology.orchestra.run.vm06.stdout:(73/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.3 MB/s | 64 kB 00:00 2026-03-10T06:15:31.684 INFO:teuthology.orchestra.run.vm04.stdout:(65/119): python3-devel-3.9.25-3.el9.x86_64.rpm 772 kB/s | 244 kB 00:00 2026-03-10T06:15:31.740 INFO:teuthology.orchestra.run.vm04.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.8 MB/s | 157 kB 00:00 2026-03-10T06:15:31.796 INFO:teuthology.orchestra.run.vm04.stdout:(67/119): 
python3-pyasn1-modules-0.4.8-7.el9.no 4.8 MB/s | 277 kB 00:00 2026-03-10T06:15:31.850 INFO:teuthology.orchestra.run.vm04.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 996 kB/s | 54 kB 00:00 2026-03-10T06:15:31.855 INFO:teuthology.orchestra.run.vm06.stdout:(74/119): fmt-8.1.1-5.el9.x86_64.rpm 451 kB/s | 111 kB 00:00 2026-03-10T06:15:31.896 INFO:teuthology.orchestra.run.vm06.stdout:(75/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 1.1 MB/s | 308 kB 00:00 2026-03-10T06:15:31.952 INFO:teuthology.orchestra.run.vm06.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 439 kB/s | 25 kB 00:00 2026-03-10T06:15:31.988 INFO:teuthology.orchestra.run.vm04.stdout:(69/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 16 MB/s | 6.1 MB 00:00 2026-03-10T06:15:32.007 INFO:teuthology.orchestra.run.vm06.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 907 kB/s | 49 kB 00:00 2026-03-10T06:15:32.042 INFO:teuthology.orchestra.run.vm04.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 774 kB/s | 42 kB 00:00 2026-03-10T06:15:32.061 INFO:teuthology.orchestra.run.vm06.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 1.2 MB/s | 67 kB 00:00 2026-03-10T06:15:32.102 INFO:teuthology.orchestra.run.vm04.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 5.0 MB/s | 303 kB 00:00 2026-03-10T06:15:32.136 INFO:teuthology.orchestra.run.vm04.stdout:(72/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 817 kB/s | 442 kB 00:00 2026-03-10T06:15:32.156 INFO:teuthology.orchestra.run.vm04.stdout:(73/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.1 MB/s | 64 kB 00:00 2026-03-10T06:15:32.205 INFO:teuthology.orchestra.run.vm06.stdout:(79/119): libarrow-9.0.0-15.el9.x86_64.rpm 13 MB/s | 4.4 MB 00:00 2026-03-10T06:15:32.217 INFO:teuthology.orchestra.run.vm06.stdout:(80/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 5.2 MB/s | 838 kB 00:00 2026-03-10T06:15:32.276 INFO:teuthology.orchestra.run.vm06.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 496 kB/s | 29 kB 00:00 2026-03-10T06:15:32.332 
INFO:teuthology.orchestra.run.vm06.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 1.1 MB/s | 60 kB 00:00 2026-03-10T06:15:32.341 INFO:teuthology.orchestra.run.vm06.stdout:(83/119): python3-asyncssh-2.13.2-5.el9.noarch. 3.9 MB/s | 548 kB 00:00 2026-03-10T06:15:32.393 INFO:teuthology.orchestra.run.vm06.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 706 kB/s | 43 kB 00:00 2026-03-10T06:15:32.395 INFO:teuthology.orchestra.run.vm06.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 594 kB/s | 32 kB 00:00 2026-03-10T06:15:32.422 INFO:teuthology.orchestra.run.vm04.stdout:(74/119): fmt-8.1.1-5.el9.x86_64.rpm 388 kB/s | 111 kB 00:00 2026-03-10T06:15:32.444 INFO:teuthology.orchestra.run.vm06.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 283 kB/s | 14 kB 00:00 2026-03-10T06:15:32.456 INFO:teuthology.orchestra.run.vm06.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 2.8 MB/s | 173 kB 00:00 2026-03-10T06:15:32.512 INFO:teuthology.orchestra.run.vm04.stdout:(75/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 865 kB/s | 308 kB 00:00 2026-03-10T06:15:32.530 INFO:teuthology.orchestra.run.vm06.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 3.4 MB/s | 254 kB 00:00 2026-03-10T06:15:32.534 INFO:teuthology.orchestra.run.vm06.stdout:(89/119): python3-cherrypy-18.6.1-2.el9.noarch. 
3.9 MB/s | 358 kB 00:00 2026-03-10T06:15:32.560 INFO:teuthology.orchestra.run.vm04.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 515 kB/s | 25 kB 00:00 2026-03-10T06:15:32.584 INFO:teuthology.orchestra.run.vm06.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 198 kB/s | 11 kB 00:00 2026-03-10T06:15:32.586 INFO:teuthology.orchestra.run.vm06.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 337 kB/s | 18 kB 00:00 2026-03-10T06:15:32.611 INFO:teuthology.orchestra.run.vm04.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 963 kB/s | 49 kB 00:00 2026-03-10T06:15:32.631 INFO:teuthology.orchestra.run.vm06.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 494 kB/s | 23 kB 00:00 2026-03-10T06:15:32.632 INFO:teuthology.orchestra.run.vm06.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 430 kB/s | 20 kB 00:00 2026-03-10T06:15:32.664 INFO:teuthology.orchestra.run.vm04.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 1.2 MB/s | 67 kB 00:00 2026-03-10T06:15:32.685 INFO:teuthology.orchestra.run.vm06.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 365 kB/s | 19 kB 00:00 2026-03-10T06:15:32.687 INFO:teuthology.orchestra.run.vm06.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 482 kB/s | 26 kB 00:00 2026-03-10T06:15:32.740 INFO:teuthology.orchestra.run.vm04.stdout:(79/119): libarrow-9.0.0-15.el9.x86_64.rpm 14 MB/s | 4.4 MB 00:00 2026-03-10T06:15:32.754 INFO:teuthology.orchestra.run.vm06.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 131 kB/s | 9.0 kB 00:00 2026-03-10T06:15:32.754 INFO:teuthology.orchestra.run.vm06.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 608 kB/s | 41 kB 00:00 2026-03-10T06:15:32.793 INFO:teuthology.orchestra.run.vm04.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 
10 MB/s | 548 kB 00:00 2026-03-10T06:15:32.808 INFO:teuthology.orchestra.run.vm04.stdout:(81/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 5.7 MB/s | 838 kB 00:00 2026-03-10T06:15:32.808 INFO:teuthology.orchestra.run.vm06.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 867 kB/s | 46 kB 00:00 2026-03-10T06:15:32.845 INFO:teuthology.orchestra.run.vm04.stdout:(82/119): python3-autocommand-2.2.2-8.el9.noarc 573 kB/s | 29 kB 00:00 2026-03-10T06:15:32.846 INFO:teuthology.orchestra.run.vm06.stdout:(99/119): python3-kubernetes-26.1.0-3.el9.noarc 11 MB/s | 1.0 MB 00:00 2026-03-10T06:15:32.858 INFO:teuthology.orchestra.run.vm04.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 1.2 MB/s | 60 kB 00:00 2026-03-10T06:15:32.866 INFO:teuthology.orchestra.run.vm06.stdout:(100/119): python3-more-itertools-8.12.0-2.el9. 1.3 MB/s | 79 kB 00:00 2026-03-10T06:15:32.893 INFO:teuthology.orchestra.run.vm04.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 918 kB/s | 43 kB 00:00 2026-03-10T06:15:32.899 INFO:teuthology.orchestra.run.vm06.stdout:(101/119): python3-natsort-7.1.1-5.el9.noarch.r 1.1 MB/s | 58 kB 00:00 2026-03-10T06:15:32.906 INFO:teuthology.orchestra.run.vm04.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 686 kB/s | 32 kB 00:00 2026-03-10T06:15:32.928 INFO:teuthology.orchestra.run.vm06.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 4.4 MB/s | 272 kB 00:00 2026-03-10T06:15:32.970 INFO:teuthology.orchestra.run.vm06.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 231 kB/s | 16 kB 00:00 2026-03-10T06:15:32.971 INFO:teuthology.orchestra.run.vm04.stdout:(86/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 17 MB/s | 19 MB 00:01 2026-03-10T06:15:32.972 INFO:teuthology.orchestra.run.vm04.stdout:(87/119): python3-certifi-2023.05.07-4.el9.noar 179 kB/s | 14 kB 00:00 2026-03-10T06:15:32.974 INFO:teuthology.orchestra.run.vm04.stdout:(88/119): python3-cheroot-10.0.1-4.el9.noarch.r 2.5 MB/s | 173 kB 00:00 2026-03-10T06:15:32.990 
INFO:teuthology.orchestra.run.vm06.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.4 MB/s | 90 kB 00:00 2026-03-10T06:15:33.021 INFO:teuthology.orchestra.run.vm06.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 611 kB/s | 31 kB 00:00 2026-03-10T06:15:33.027 INFO:teuthology.orchestra.run.vm04.stdout:(89/119): python3-cherrypy-18.6.1-2.el9.noarch. 6.2 MB/s | 358 kB 00:00 2026-03-10T06:15:33.028 INFO:teuthology.orchestra.run.vm04.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 196 kB/s | 11 kB 00:00 2026-03-10T06:15:33.054 INFO:teuthology.orchestra.run.vm06.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 2.9 MB/s | 188 kB 00:00 2026-03-10T06:15:33.068 INFO:teuthology.orchestra.run.vm06.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 1.2 MB/s | 59 kB 00:00 2026-03-10T06:15:33.075 INFO:teuthology.orchestra.run.vm04.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 377 kB/s | 18 kB 00:00 2026-03-10T06:15:33.076 INFO:teuthology.orchestra.run.vm04.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 486 kB/s | 23 kB 00:00 2026-03-10T06:15:33.108 INFO:teuthology.orchestra.run.vm06.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 669 kB/s | 36 kB 00:00 2026-03-10T06:15:33.114 INFO:teuthology.orchestra.run.vm06.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 1.8 MB/s | 86 kB 00:00 2026-03-10T06:15:33.126 INFO:teuthology.orchestra.run.vm04.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 390 kB/s | 20 kB 00:00 2026-03-10T06:15:33.128 INFO:teuthology.orchestra.run.vm04.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 
372 kB/s | 19 kB 00:00 2026-03-10T06:15:33.167 INFO:teuthology.orchestra.run.vm06.stdout:(110/119): python3-websocket-client-1.2.3-2.el9 1.7 MB/s | 90 kB 00:00 2026-03-10T06:15:33.175 INFO:teuthology.orchestra.run.vm06.stdout:(111/119): python3-webob-1.8.8-2.el9.noarch.rpm 3.3 MB/s | 230 kB 00:00 2026-03-10T06:15:33.177 INFO:teuthology.orchestra.run.vm04.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 519 kB/s | 26 kB 00:00 2026-03-10T06:15:33.179 INFO:teuthology.orchestra.run.vm04.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 180 kB/s | 9.0 kB 00:00 2026-03-10T06:15:33.222 INFO:teuthology.orchestra.run.vm06.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 482 kB/s | 22 kB 00:00 2026-03-10T06:15:33.225 INFO:teuthology.orchestra.run.vm04.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 859 kB/s | 41 kB 00:00 2026-03-10T06:15:33.258 INFO:teuthology.orchestra.run.vm06.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 4.6 MB/s | 427 kB 00:00 2026-03-10T06:15:33.269 INFO:teuthology.orchestra.run.vm04.stdout:(98/119): python3-kubernetes-26.1.0-3.el9.noarc 11 MB/s | 1.0 MB 00:00 2026-03-10T06:15:33.273 INFO:teuthology.orchestra.run.vm04.stdout:(99/119): python3-logutils-0.3.5-21.el9.noarch. 963 kB/s | 46 kB 00:00 2026-03-10T06:15:33.276 INFO:teuthology.orchestra.run.vm06.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 371 kB/s | 20 kB 00:00 2026-03-10T06:15:33.318 INFO:teuthology.orchestra.run.vm04.stdout:(100/119): python3-google-auth-2.45.0-1.el9.noa 735 kB/s | 254 kB 00:00 2026-03-10T06:15:33.318 INFO:teuthology.orchestra.run.vm04.stdout:(101/119): python3-more-itertools-8.12.0-2.el9. 
1.6 MB/s | 79 kB 00:00 2026-03-10T06:15:33.320 INFO:teuthology.orchestra.run.vm04.stdout:(102/119): python3-natsort-7.1.1-5.el9.noarch.r 1.2 MB/s | 58 kB 00:00 2026-03-10T06:15:33.324 INFO:teuthology.orchestra.run.vm06.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 2.9 MB/s | 191 kB 00:00 2026-03-10T06:15:33.368 INFO:teuthology.orchestra.run.vm04.stdout:(103/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.8 MB/s | 90 kB 00:00 2026-03-10T06:15:33.368 INFO:teuthology.orchestra.run.vm04.stdout:(104/119): python3-portend-3.1.0-2.el9.noarch.r 331 kB/s | 16 kB 00:00 2026-03-10T06:15:33.373 INFO:teuthology.orchestra.run.vm04.stdout:(105/119): python3-pecan-1.4.2-3.el9.noarch.rpm 5.0 MB/s | 272 kB 00:00 2026-03-10T06:15:33.414 INFO:teuthology.orchestra.run.vm04.stdout:(106/119): python3-repoze-lru-0.7-16.el9.noarch 667 kB/s | 31 kB 00:00 2026-03-10T06:15:33.421 INFO:teuthology.orchestra.run.vm04.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 1.2 MB/s | 59 kB 00:00 2026-03-10T06:15:33.439 INFO:teuthology.orchestra.run.vm04.stdout:(108/119): python3-routes-2.5.1-5.el9.noarch.rp 2.6 MB/s | 188 kB 00:00 2026-03-10T06:15:33.465 INFO:teuthology.orchestra.run.vm04.stdout:(109/119): python3-tempora-5.0.0-2.el9.noarch.r 705 kB/s | 36 kB 00:00 2026-03-10T06:15:33.476 INFO:teuthology.orchestra.run.vm04.stdout:(110/119): python3-typing-extensions-4.15.0-1.e 1.5 MB/s | 86 kB 00:00 2026-03-10T06:15:33.489 INFO:teuthology.orchestra.run.vm06.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 7.4 MB/s | 1.6 MB 00:00 2026-03-10T06:15:33.514 INFO:teuthology.orchestra.run.vm04.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 1.8 MB/s | 90 kB 00:00 2026-03-10T06:15:33.516 INFO:teuthology.orchestra.run.vm04.stdout:(112/119): python3-webob-1.8.8-2.el9.noarch.rpm 2.9 MB/s | 230 kB 00:00 2026-03-10T06:15:33.539 INFO:teuthology.orchestra.run.vm04.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 6.8 MB/s | 427 kB 00:00 2026-03-10T06:15:33.562 
INFO:teuthology.orchestra.run.vm04.stdout:(114/119): python3-xmltodict-0.12.0-15.el9.noar 472 kB/s | 22 kB 00:00 2026-03-10T06:15:33.565 INFO:teuthology.orchestra.run.vm04.stdout:(115/119): python3-zc-lockfile-2.0-10.el9.noarc 409 kB/s | 20 kB 00:00 2026-03-10T06:15:33.624 INFO:teuthology.orchestra.run.vm04.stdout:(116/119): re2-20211101-20.el9.x86_64.rpm 2.2 MB/s | 191 kB 00:00 2026-03-10T06:15:33.631 INFO:teuthology.orchestra.run.vm04.stdout:(117/119): thrift-0.15.0-4.el9.x86_64.rpm 23 MB/s | 1.6 MB 00:00 2026-03-10T06:15:34.480 INFO:teuthology.orchestra.run.vm06.stdout:(117/119): librados2-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 3.3 MB 00:01 2026-03-10T06:15:34.485 INFO:teuthology.orchestra.run.vm06.stdout:(118/119): librbd1-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 3.0 MB 00:00 2026-03-10T06:15:34.567 INFO:teuthology.orchestra.run.vm04.stdout:(118/119): librados2-18.2.0-0.el9.x86_64.rpm 3.3 MB/s | 3.3 MB 00:01 2026-03-10T06:15:34.719 INFO:teuthology.orchestra.run.vm04.stdout:(119/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.7 MB/s | 3.0 MB 00:01 2026-03-10T06:15:34.724 INFO:teuthology.orchestra.run.vm04.stdout:-------------------------------------------------------------------------------- 2026-03-10T06:15:34.724 INFO:teuthology.orchestra.run.vm04.stdout:Total 12 MB/s | 182 MB 00:15 2026-03-10T06:15:35.245 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:15:35.294 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:15:35.294 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:15:36.086 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-10T06:15:36.086 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:15:36.932 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:15:36.942 INFO:teuthology.orchestra.run.vm04.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121 2026-03-10T06:15:36.957 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121 2026-03-10T06:15:37.134 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121 2026-03-10T06:15:37.137 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:37.184 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:37.186 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T06:15:37.218 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T06:15:37.228 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T06:15:37.232 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121 2026-03-10T06:15:37.235 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121 2026-03-10T06:15:37.245 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121 2026-03-10T06:15:37.246 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:37.285 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:37.287 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T06:15:37.346 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 
2026-03-10T06:15:37.353 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121 2026-03-10T06:15:37.383 INFO:teuthology.orchestra.run.vm04.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121 2026-03-10T06:15:37.392 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121 2026-03-10T06:15:37.397 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121 2026-03-10T06:15:37.427 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121 2026-03-10T06:15:37.446 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121 2026-03-10T06:15:37.454 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121 2026-03-10T06:15:37.463 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121 2026-03-10T06:15:37.466 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121 2026-03-10T06:15:37.472 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121 2026-03-10T06:15:37.483 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T06:15:37.500 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T06:15:37.533 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121 2026-03-10T06:15:37.600 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121 2026-03-10T06:15:37.621 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121 2026-03-10T06:15:37.630 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121 2026-03-10T06:15:37.639 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121 2026-03-10T06:15:37.645 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121 2026-03-10T06:15:37.689 INFO:teuthology.orchestra.run.vm04.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121 2026-03-10T06:15:37.697 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121 2026-03-10T06:15:37.719 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121 2026-03-10T06:15:37.750 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121 2026-03-10T06:15:37.760 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121 2026-03-10T06:15:37.771 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121 2026-03-10T06:15:37.788 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121 2026-03-10T06:15:37.805 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121 2026-03-10T06:15:37.819 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121 2026-03-10T06:15:37.896 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121 2026-03-10T06:15:37.906 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121 2026-03-10T06:15:37.916 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121 2026-03-10T06:15:37.970 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121 2026-03-10T06:15:38.074 INFO:teuthology.orchestra.run.vm06.stdout:(119/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 2.9 MB/s | 19 MB 00:06 2026-03-10T06:15:38.076 
INFO:teuthology.orchestra.run.vm06.stdout:-------------------------------------------------------------------------------- 2026-03-10T06:15:38.076 INFO:teuthology.orchestra.run.vm06.stdout:Total 8.9 MB/s | 182 MB 00:20 2026-03-10T06:15:38.389 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121 2026-03-10T06:15:38.408 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121 2026-03-10T06:15:38.415 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121 2026-03-10T06:15:38.424 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121 2026-03-10T06:15:38.430 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121 2026-03-10T06:15:38.439 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121 2026-03-10T06:15:38.445 INFO:teuthology.orchestra.run.vm04.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121 2026-03-10T06:15:38.450 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121 2026-03-10T06:15:38.467 INFO:teuthology.orchestra.run.vm04.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121 2026-03-10T06:15:38.497 INFO:teuthology.orchestra.run.vm04.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121 2026-03-10T06:15:38.502 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121 2026-03-10T06:15:38.512 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121 2026-03-10T06:15:38.517 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121 2026-03-10T06:15:38.527 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121 2026-03-10T06:15:38.532 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121 2026-03-10T06:15:38.581 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121 2026-03-10T06:15:38.643 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:15:38.697 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:15:38.697 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:15:38.878 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121 2026-03-10T06:15:38.919 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121 2026-03-10T06:15:38.926 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T06:15:38.993 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121 2026-03-10T06:15:38.998 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121 2026-03-10T06:15:39.026 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121 2026-03-10T06:15:39.445 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121 2026-03-10T06:15:39.499 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-10T06:15:39.499 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:15:39.539 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T06:15:40.381 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:15:40.391 INFO:teuthology.orchestra.run.vm06.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121 2026-03-10T06:15:40.396 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T06:15:40.404 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121 2026-03-10T06:15:40.424 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121 2026-03-10T06:15:40.432 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121 2026-03-10T06:15:40.439 INFO:teuthology.orchestra.run.vm04.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121 2026-03-10T06:15:40.582 INFO:teuthology.orchestra.run.vm06.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121 2026-03-10T06:15:40.584 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:40.606 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121 2026-03-10T06:15:40.609 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T06:15:40.632 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:40.633 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T06:15:40.642 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T06:15:40.646 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121 2026-03-10T06:15:40.654 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121 2026-03-10T06:15:40.668 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T06:15:40.679 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T06:15:40.683 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121 2026-03-10T06:15:40.686 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121 2026-03-10T06:15:40.697 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121 2026-03-10T06:15:40.698 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:40.811 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:40.813 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T06:15:40.864 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T06:15:40.869 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121 2026-03-10T06:15:40.876 INFO:teuthology.orchestra.run.vm04.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121 2026-03-10T06:15:40.879 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T06:15:40.898 INFO:teuthology.orchestra.run.vm06.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121 2026-03-10T06:15:40.901 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T06:15:40.910 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121 2026-03-10T06:15:40.911 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121 2026-03-10T06:15:40.916 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121 2026-03-10T06:15:40.929 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121 2026-03-10T06:15:40.951 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121 2026-03-10T06:15:40.951 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121 2026-03-10T06:15:40.971 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121 2026-03-10T06:15:40.976 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121 2026-03-10T06:15:40.983 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121 2026-03-10T06:15:40.985 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121 2026-03-10T06:15:40.991 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121 2026-03-10T06:15:41.002 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T06:15:41.016 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T06:15:41.047 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121 2026-03-10T06:15:41.049 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T06:15:41.068 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T06:15:41.101 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T06:15:41.112 
INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121 2026-03-10T06:15:41.129 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121 2026-03-10T06:15:41.137 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121 2026-03-10T06:15:41.143 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T06:15:41.146 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121 2026-03-10T06:15:41.151 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121 2026-03-10T06:15:41.191 INFO:teuthology.orchestra.run.vm06.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121 2026-03-10T06:15:41.197 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121 2026-03-10T06:15:41.217 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T06:15:41.217 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121 2026-03-10T06:15:41.233 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T06:15:41.237 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T06:15:41.245 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T06:15:41.250 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121 2026-03-10T06:15:41.251 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T06:15:41.256 INFO:teuthology.orchestra.run.vm04.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T06:15:41.257 INFO:teuthology.orchestra.run.vm06.stdout: 
Installing : python3-webob-1.8.8-2.el9.noarch 34/121 2026-03-10T06:15:41.260 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T06:15:41.265 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121 2026-03-10T06:15:41.282 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121 2026-03-10T06:15:41.284 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:41.284 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T06:15:41.284 INFO:teuthology.orchestra.run.vm04.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-10T06:15:41.284 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:41.295 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121 2026-03-10T06:15:41.297 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:41.308 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121 2026-03-10T06:15:41.335 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:41.335 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 
2026-03-10T06:15:41.335 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:41.352 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T06:15:41.379 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121 2026-03-10T06:15:41.387 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121 2026-03-10T06:15:41.398 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121 2026-03-10T06:15:41.416 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T06:15:41.420 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T06:15:41.425 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121 2026-03-10T06:15:41.448 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121 2026-03-10T06:15:41.452 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121 2026-03-10T06:15:41.457 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121 2026-03-10T06:15:41.866 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121 2026-03-10T06:15:41.882 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121 2026-03-10T06:15:41.887 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121 2026-03-10T06:15:41.896 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121 2026-03-10T06:15:41.901 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121 2026-03-10T06:15:41.909 
INFO:teuthology.orchestra.run.vm06.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121 2026-03-10T06:15:41.914 INFO:teuthology.orchestra.run.vm06.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121 2026-03-10T06:15:41.919 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121 2026-03-10T06:15:41.936 INFO:teuthology.orchestra.run.vm06.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121 2026-03-10T06:15:41.944 INFO:teuthology.orchestra.run.vm06.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121 2026-03-10T06:15:41.950 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121 2026-03-10T06:15:41.958 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121 2026-03-10T06:15:41.963 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121 2026-03-10T06:15:41.973 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121 2026-03-10T06:15:41.984 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121 2026-03-10T06:15:42.038 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121 2026-03-10T06:15:42.340 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121 2026-03-10T06:15:42.375 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121 2026-03-10T06:15:42.383 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T06:15:42.450 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121 2026-03-10T06:15:42.454 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121 2026-03-10T06:15:42.456 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: 
ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:42.462 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:42.483 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121 2026-03-10T06:15:42.788 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:42.851 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121 2026-03-10T06:15:42.898 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121 2026-03-10T06:15:42.899 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-10T06:15:42.899 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 
2026-03-10T06:15:42.899 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:42.905 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-10T06:15:42.925 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121 2026-03-10T06:15:43.025 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T06:15:43.925 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T06:15:43.961 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121 2026-03-10T06:15:43.969 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121 2026-03-10T06:15:43.975 INFO:teuthology.orchestra.run.vm06.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121 2026-03-10T06:15:44.140 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121 2026-03-10T06:15:44.172 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T06:15:44.202 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T06:15:44.205 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121 2026-03-10T06:15:44.212 INFO:teuthology.orchestra.run.vm06.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121 2026-03-10T06:15:44.434 INFO:teuthology.orchestra.run.vm06.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121 2026-03-10T06:15:44.436 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T06:15:44.458 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T06:15:44.468 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 
77/121 2026-03-10T06:15:44.487 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121 2026-03-10T06:15:44.510 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121 2026-03-10T06:15:44.612 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T06:15:44.628 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T06:15:44.660 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T06:15:44.700 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T06:15:44.771 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T06:15:44.785 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T06:15:44.788 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T06:15:44.795 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T06:15:44.801 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T06:15:44.805 INFO:teuthology.orchestra.run.vm06.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T06:15:44.809 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T06:15:44.830 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:44.830 INFO:teuthology.orchestra.run.vm06.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T06:15:44.830 INFO:teuthology.orchestra.run.vm06.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 
2026-03-10T06:15:44.830 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:44.850 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:44.883 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T06:15:44.883 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-10T06:15:44.883 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:44.906 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T06:15:44.963 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T06:15:44.966 INFO:teuthology.orchestra.run.vm06.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T06:15:44.971 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121 2026-03-10T06:15:45.000 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121 2026-03-10T06:15:45.004 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121 2026-03-10T06:15:46.035 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:46.108 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:46.428 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T06:15:46.435 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121 2026-03-10T06:15:46.476 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121 2026-03-10T06:15:46.476 
INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-10T06:15:46.476 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-10T06:15:46.476 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:46.481 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-10T06:15:50.203 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:50.239 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-10T06:15:50.379 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-10T06:15:50.384 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-10T06:15:50.952 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-10T06:15:50.955 INFO:teuthology.orchestra.run.vm04.stdout: Installing : 
ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-10T06:15:51.025 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-10T06:15:51.146 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121 2026-03-10T06:15:51.150 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T06:15:51.182 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:51.196 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-10T06:15:51.325 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-10T06:15:51.345 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T06:15:51.375 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:51.752 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-10T06:15:51.777 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-10T06:15:51.778 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:51.778 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T06:15:51.778 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T06:15:51.778 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T06:15:51.778 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:52.657 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T06:15:52.686 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T06:15:52.686 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T06:15:52.687 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T06:15:52.687 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T06:15:52.687 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T06:15:52.687 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp 2026-03-10T06:15:52.910 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:52.943 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-10T06:15:53.084 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-10T06:15:53.086 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121 2026-03-10T06:15:53.090 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-10T06:15:53.090 
INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T06:15:53.115 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:53.126 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T06:15:53.149 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T06:15:53.149 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:53.149 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-10T06:15:53.149 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:53.296 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T06:15:53.319 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:53.668 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-10T06:15:53.671 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-10T06:15:53.736 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-10T06:15:53.815 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121 2026-03-10T06:15:53.818 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-10T06:15:53.841 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-10T06:15:53.842 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T06:15:53.842 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T06:15:53.842 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T06:15:53.842 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T06:15:53.842 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:53.854 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-10T06:15:53.966 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-10T06:15:53.968 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-10T06:15:53.995 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:54.246 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T06:15:54.269 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:55.082 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-10T06:15:55.107 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:55.366 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121 2026-03-10T06:15:55.378 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121 2026-03-10T06:15:55.383 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121 2026-03-10T06:15:55.422 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121 2026-03-10T06:15:55.429 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121 2026-03-10T06:15:55.440 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T06:15:55.444 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T06:15:55.444 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T06:15:55.482 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121 2026-03-10T06:15:55.484 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T06:15:55.484 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T06:15:55.485 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 
2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T06:15:55.512 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:55.524 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T06:15:55.548 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T06:15:55.548 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:55.548 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T06:15:55.548 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:55.694 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T06:15:55.715 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:15:56.735 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T06:15:56.735 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121 2026-03-10T06:15:56.735 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121 2026-03-10T06:15:56.735 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121 2026-03-10T06:15:56.735 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: 
Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121 2026-03-10T06:15:56.736 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 
2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121 2026-03-10T06:15:56.740 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121 2026-03-10T06:15:56.741 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121 2026-03-10T06:15:56.741 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-10T06:15:56.741 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-10T06:15:56.741 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121 2026-03-10T06:15:56.863 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout:Upgraded: 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout:Installed: 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 
INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T06:15:56.864 
INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T06:15:56.864 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 
2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T06:15:56.865 
INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T06:15:56.865 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 
2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T06:15:56.866 
INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-10T06:15:56.866 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T06:15:56.867 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T06:15:56.867 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:15:56.867 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 
2026-03-10T06:15:56.988 DEBUG:teuthology.parallel:result is None 2026-03-10T06:15:57.728 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121 2026-03-10T06:15:57.740 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121 2026-03-10T06:15:57.745 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121 2026-03-10T06:15:57.785 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121 2026-03-10T06:15:57.790 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121 2026-03-10T06:15:57.800 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T06:15:57.804 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T06:15:57.804 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T06:15:57.820 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T06:15:57.820 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121 
2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121 2026-03-10T06:15:58.963 
INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121 2026-03-10T06:15:58.963 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T06:15:58.964 
INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T06:15:58.964 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-10T06:15:58.965 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-10T06:15:58.966 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-10T06:15:58.967 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-10T06:15:58.967 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121
2026-03-10T06:15:58.967 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-10T06:15:58.967 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout:Upgraded:
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout:Installed:
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.068 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T06:15:59.069 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:15:59.070 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:15:59.164 DEBUG:teuthology.parallel:result is None
2026-03-10T06:15:59.164 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:15:59.164 INFO:teuthology.packaging:ref: None
2026-03-10T06:15:59.164 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:15:59.164 INFO:teuthology.packaging:branch: None
2026-03-10T06:15:59.164 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:15:59.164 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T06:15:59.807 DEBUG:teuthology.orchestra.run.vm04:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T06:15:59.829 INFO:teuthology.orchestra.run.vm04.stdout:18.2.0-0.el9
2026-03-10T06:15:59.829 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9
2026-03-10T06:15:59.829 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed.
2026-03-10T06:15:59.830 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T06:15:59.830 INFO:teuthology.packaging:ref: None
2026-03-10T06:15:59.830 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T06:15:59.830 INFO:teuthology.packaging:branch: None
2026-03-10T06:15:59.830 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:15:59.830 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T06:16:00.443 DEBUG:teuthology.orchestra.run.vm06:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T06:16:00.462 INFO:teuthology.orchestra.run.vm06.stdout:18.2.0-0.el9
2026-03-10T06:16:00.462 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9
2026-03-10T06:16:00.462 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed.
2026-03-10T06:16:00.463 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-10T06:16:00.463 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:16:00.463 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T06:16:00.493 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:16:00.493 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T06:16:00.535 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-10T06:16:00.535 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:16:00.535 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T06:16:00.567 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T06:16:00.635 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:16:00.635 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T06:16:00.659 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T06:16:00.723 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-10T06:16:00.723 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:16:00.723 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T06:16:00.752 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T06:16:00.823 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:16:00.824 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T06:16:00.847 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T06:16:00.910 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-10T06:16:00.910 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T06:16:00.910 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T06:16:00.939 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T06:16:01.010 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-10T06:16:01.010 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T06:16:01.034 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T06:16:01.098 INFO:teuthology.run_tasks:Running task print...
2026-03-10T06:16:01.100 INFO:teuthology.task.print:**** done install task...
2026-03-10T06:16:01.100 INFO:teuthology.run_tasks:Running task cephadm...
2026-03-10T06:16:01.149 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.0', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}
2026-03-10T06:16:01.150 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.0
2026-03-10T06:16:01.150 INFO:tasks.cephadm:Cluster fsid is 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:01.150 INFO:tasks.cephadm:Choosing monitor IPs and ports...
2026-03-10T06:16:01.150 INFO:tasks.cephadm:No mon roles; fabricating mons
2026-03-10T06:16:01.150 INFO:tasks.cephadm:Monitor IPs: {'mon.vm04': '192.168.123.104', 'mon.vm06': '192.168.123.106'}
2026-03-10T06:16:01.150 INFO:tasks.cephadm:Normalizing hostnames...
2026-03-10T06:16:01.150 DEBUG:teuthology.orchestra.run.vm04:> sudo hostname $(hostname -s)
2026-03-10T06:16:01.181 DEBUG:teuthology.orchestra.run.vm06:> sudo hostname $(hostname -s)
2026-03-10T06:16:01.213 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-10T06:16:01.213 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T06:16:01.871 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-10T06:16:02.695 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-10T06:16:02.696 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T06:16:02.696 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T06:16:02.696 DEBUG:teuthology.orchestra.run.vm04:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:03.882 INFO:teuthology.orchestra.run.vm04.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 06:16 /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:03.882 DEBUG:teuthology.orchestra.run.vm06:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:05.147 INFO:teuthology.orchestra.run.vm06.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 06:16 /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:05.147 DEBUG:teuthology.orchestra.run.vm04:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:05.163 DEBUG:teuthology.orchestra.run.vm06:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T06:16:05.186 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.0 on all hosts...
2026-03-10T06:16:05.186 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-10T06:16:05.204 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-10T06:16:05.330 INFO:teuthology.orchestra.run.vm04.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T06:16:05.358 INFO:teuthology.orchestra.run.vm06.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946", 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: "repo_digests": [ 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007", 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27" 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout: ] 2026-03-10T06:16:40.369 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946", 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: "repo_digests": [ 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007", 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27" 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout: ] 2026-03-10T06:16:40.370 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-10T06:16:40.384 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p /etc/ceph 2026-03-10T06:16:40.413 DEBUG:teuthology.orchestra.run.vm06:> sudo mkdir -p 
/etc/ceph 2026-03-10T06:16:40.442 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 777 /etc/ceph 2026-03-10T06:16:40.480 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 777 /etc/ceph 2026-03-10T06:16:40.509 INFO:tasks.cephadm:Writing seed config... 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd_class_default_list = * 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd_class_load_list = * 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] bdev async discard = True 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] bdev enable discard = True 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] debug ms = 1 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] debug osd = 20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [client] client mount timeout = 600 
2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [client] debug client = 20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [client] debug ms = 1 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [mds] debug mds = 20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20 2026-03-10T06:16:40.510 INFO:tasks.cephadm: override: [mds] debug ms = 1 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] mds debug frag = True 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] mds verify scatter = True 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mgr] debug mgr = 20 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mgr] debug ms = 1 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mon] debug mon = 20 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mon] debug ms = 1 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mon] debug paxos = 20 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300 2026-03-10T06:16:40.511 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120 2026-03-10T06:16:40.511 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:16:40.511 DEBUG:teuthology.orchestra.run.vm04:> dd of=/home/ubuntu/cephtest/seed.ceph.conf 
2026-03-10T06:16:40.537 DEBUG:tasks.cephadm:Final config: [global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = 9c59102a-1c48-11f1-b618-035af535377d
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-10T06:16:40.537 DEBUG:teuthology.orchestra.run.vm04:mon.vm04> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service
2026-03-10T06:16:40.579 INFO:tasks.cephadm:Bootstrapping...
2026-03-10T06:16:40.579 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 -v bootstrap --fsid 9c59102a-1c48-11f1-b618-035af535377d --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.104 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-10T06:16:40.696 INFO:teuthology.orchestra.run.vm04.stdout:--------------------------------------------------------------------------------
2026-03-10T06:16:40.696 INFO:teuthology.orchestra.run.vm04.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.0', '-v', 'bootstrap', '--fsid', '9c59102a-1c48-11f1-b618-035af535377d', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.104', '--skip-admin-label']
2026-03-10T06:16:40.717 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stdout 5.8.0
2026-03-10T06:16:40.717 INFO:teuthology.orchestra.run.vm04.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-10T06:16:40.717 INFO:teuthology.orchestra.run.vm04.stdout:Verifying podman|docker is present...
2026-03-10T06:16:40.737 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stdout 5.8.0
2026-03-10T06:16:40.737 INFO:teuthology.orchestra.run.vm04.stdout:Verifying lvm2 is present...
2026-03-10T06:16:40.737 INFO:teuthology.orchestra.run.vm04.stdout:Verifying time synchronization is in place...
2026-03-10T06:16:40.744 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T06:16:40.744 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T06:16:40.751 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T06:16:40.751 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout inactive
2026-03-10T06:16:40.757 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout enabled
2026-03-10T06:16:40.764 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout active
2026-03-10T06:16:40.764 INFO:teuthology.orchestra.run.vm04.stdout:Unit chronyd.service is enabled and running
2026-03-10T06:16:40.764 INFO:teuthology.orchestra.run.vm04.stdout:Repeating the final host check...
2026-03-10T06:16:40.784 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stdout 5.8.0
2026-03-10T06:16:40.784 INFO:teuthology.orchestra.run.vm04.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-10T06:16:40.784 INFO:teuthology.orchestra.run.vm04.stdout:systemctl is present
2026-03-10T06:16:40.784 INFO:teuthology.orchestra.run.vm04.stdout:lvcreate is present
2026-03-10T06:16:40.790 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T06:16:40.790 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T06:16:40.797 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T06:16:40.797 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout inactive
2026-03-10T06:16:40.803 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout enabled
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stdout active
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Unit chronyd.service is enabled and running
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Host looks OK
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Cluster fsid: 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Acquiring lock 140566546432976 on /run/cephadm/9c59102a-1c48-11f1-b618-035af535377d.lock
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Lock 140566546432976 acquired on /run/cephadm/9c59102a-1c48-11f1-b618-035af535377d.lock
2026-03-10T06:16:40.810 INFO:teuthology.orchestra.run.vm04.stdout:Verifying IP 192.168.123.104 port 3300 ...
2026-03-10T06:16:40.811 INFO:teuthology.orchestra.run.vm04.stdout:Verifying IP 192.168.123.104 port 6789 ...
2026-03-10T06:16:40.811 INFO:teuthology.orchestra.run.vm04.stdout:Base mon IP(s) is [192.168.123.104:3300, 192.168.123.104:6789], mon addrv is [v2:192.168.123.104:3300,v1:192.168.123.104:6789]
2026-03-10T06:16:40.815 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.104 metric 100
2026-03-10T06:16:40.815 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.104 metric 100
2026-03-10T06:16:40.819 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-10T06:16:40.819 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:4/64 scope link noprefixroute
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:Mon IP `192.168.123.104` is in CIDR network `192.168.123.0/24`
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:Mon IP `192.168.123.104` is in CIDR network `192.168.123.0/24`
2026-03-10T06:16:40.821 INFO:teuthology.orchestra.run.vm04.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-10T06:16:40.822 INFO:teuthology.orchestra.run.vm04.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-10T06:16:40.822 INFO:teuthology.orchestra.run.vm04.stdout:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stdout dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.0...
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Getting image source signatures
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Copying blob sha256:3bd20aeff60302f668275dc2005d10679ae56492967a3a5a54fd3dde85333aec
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Copying blob sha256:46af8f5390d4e94fc57efb422ccb97bb53dfe5b948546bfc191b46557eb2dbd9
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Copying config sha256:dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T06:16:42.015 INFO:teuthology.orchestra.run.vm04.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-10T06:16:42.195 INFO:teuthology.orchestra.run.vm04.stdout:ceph: stdout ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T06:16:42.195 INFO:teuthology.orchestra.run.vm04.stdout:Ceph version: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T06:16:42.195 INFO:teuthology.orchestra.run.vm04.stdout:Extracting ceph user uid/gid from container image...
2026-03-10T06:16:42.286 INFO:teuthology.orchestra.run.vm04.stdout:stat: stdout 167 167
2026-03-10T06:16:42.287 INFO:teuthology.orchestra.run.vm04.stdout:Creating initial keys...
2026-03-10T06:16:42.412 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph-authtool: stdout AQBKt69p66oaFhAAi/7wcjGBAjfKHMPoM78FGQ==
2026-03-10T06:16:42.522 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph-authtool: stdout AQBKt69ptzzhHBAArt4fTn3PNzbOYY6yKJYZhQ==
2026-03-10T06:16:42.655 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph-authtool: stdout AQBKt69pxcMIIxAA1jkUz8ZnL1uMTe8lwgxI4A==
2026-03-10T06:16:42.656 INFO:teuthology.orchestra.run.vm04.stdout:Creating initial monmap...
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:monmaptool for vm04 [v2:192.168.123.104:3300,v1:192.168.123.104:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:setting min_mon_release = pacific
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: set fsid to 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:16:42.767 INFO:teuthology.orchestra.run.vm04.stdout:Creating mon...
2026-03-10T06:16:42.920 INFO:teuthology.orchestra.run.vm04.stdout:create mon.vm04 on
2026-03-10T06:16:43.090 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-10T06:16:43.223 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-10T06:16:43.350 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-9c59102a-1c48-11f1-b618-035af535377d.target → /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d.target.
2026-03-10T06:16:43.350 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-9c59102a-1c48-11f1-b618-035af535377d.target → /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d.target.
2026-03-10T06:16:43.559 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04
2026-03-10T06:16:43.559 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Failed to reset failed state of unit ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service: Unit ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service not loaded.
2026-03-10T06:16:43.710 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d.target.wants/ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service → /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d@.service.
2026-03-10T06:16:43.911 INFO:teuthology.orchestra.run.vm04.stdout:firewalld does not appear to be present
2026-03-10T06:16:43.911 INFO:teuthology.orchestra.run.vm04.stdout:Not possible to enable service . firewalld.service is not available
2026-03-10T06:16:43.912 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mon to start...
2026-03-10T06:16:43.912 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mon...
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout cluster:
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout id: 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout services:
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm04 (age 0.190527s)
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.187 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout data:
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout pgs:
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.076+0000 7f0c5ad24700 1 Processor -- start
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.077+0000 7f0c5ad24700 1 -- start start
2026-03-10T06:16:44.188 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.077+0000 7f0c5ad24700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.077+0000 7f0c5ad24700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c54106680 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.078+0000 7f0c58ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.078+0000 7f0c58ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:40988/0 (socket says 192.168.123.104:40988)
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.078+0000 7f0c58ac0700 1 -- 192.168.123.104:0/1814961431 learned_addr learned my addr 192.168.123.104:0/1814961431 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.079+0000 7f0c58ac0700 1 -- 192.168.123.104:0/1814961431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c541067c0 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.079+0000 7f0c58ac0700 1 --2- 192.168.123.104:0/1814961431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f0c40009a90 tx=0x7f0c40009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=24f4bab96a4b4dae server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.079+0000 7f0c537fe700 1 -- 192.168.123.104:0/1814961431 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0c40004030 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.079+0000 7f0c537fe700 1 -- 192.168.123.104:0/1814961431 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f0c40004190 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 -- 192.168.123.104:0/1814961431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 msgr2=0x7f0c54106140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 --2- 192.168.123.104:0/1814961431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f0c40009a90 tx=0x7f0c40009da0 comp rx=0 tx=0).stop
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 -- 192.168.123.104:0/1814961431 shutdown_connections
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 --2- 192.168.123.104:0/1814961431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54106140 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 -- 192.168.123.104:0/1814961431 >> 192.168.123.104:0/1814961431 conn(0x7f0c541013a0 msgr2=0x7f0c541037b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 -- 192.168.123.104:0/1814961431 shutdown_connections
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.080+0000 7f0c5ad24700 1 -- 192.168.123.104:0/1814961431 wait complete.
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.081+0000 7f0c5ad24700 1 Processor -- start
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.081+0000 7f0c5ad24700 1 -- start start
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.081+0000 7f0c5ad24700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.081+0000 7f0c5ad24700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c54106680 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.082+0000 7f0c58ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.082+0000 7f0c58ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41000/0 (socket says 192.168.123.104:41000)
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.082+0000 7f0c58ac0700 1 -- 192.168.123.104:0/310882764 learned_addr learned my addr 192.168.123.104:0/310882764 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.082+0000 7f0c58ac0700 1 -- 192.168.123.104:0/310882764 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c40009740 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.082+0000 7f0c58ac0700 1 --2- 192.168.123.104:0/310882764 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0c4000b3c0 tx=0x7f0c40004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.083+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0c400045d0 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.083+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f0c400038b0 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.083+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0c40003a20 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.083+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c54197410 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.083+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c541978b0 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.084+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f0c40022070 con 0x7f0c54105d30
2026-03-10T06:16:44.189 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.084+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0c4001b790 con 0x7f0c54105d30
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.084+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c5404f9e0 con 0x7f0c54105d30
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.086+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f0c40025d40 con 0x7f0c54105d30
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.125+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7f0c54062380 con 0x7f0c54105d30
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.126+0000 7f0c51ffb700 1 -- 192.168.123.104:0/310882764 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7f0c40025470 con 0x7f0c54105d30
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.127+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 msgr2=0x7f0c54196ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.127+0000 7f0c5ad24700 1 --2- 192.168.123.104:0/310882764 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0c4000b3c0 tx=0x7f0c40004750 comp rx=0 tx=0).stop
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.127+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 shutdown_connections
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.127+0000 7f0c5ad24700 1 --2- 192.168.123.104:0/310882764 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c54105d30 0x7f0c54196ed0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.127+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 >> 192.168.123.104:0/310882764 conn(0x7f0c541013a0 msgr2=0x7f0c5407aaf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.128+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 shutdown_connections
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.128+0000 7f0c5ad24700 1 -- 192.168.123.104:0/310882764 wait complete.
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:mon is available
2026-03-10T06:16:44.190 INFO:teuthology.orchestra.run.vm04.stdout:Assimilating anything we can from ceph.conf...
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [global]
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout fsid = 9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.104:3300,v1:192.168.123.104:6789]
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.450 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [osd]
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.345+0000 7f140806d700 1 Processor -- start
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.345+0000 7f140806d700 1 -- start start
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f140806d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f140806d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14001067a0 con 0x7f1400105e50
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f1405e09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:16:44.451 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f1405e09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41014/0 (socket says 192.168.123.104:41014)
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f1405e09700 1 -- 192.168.123.104:0/529529144 learned_addr learned my addr 192.168.123.104:0/529529144 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.346+0000 7f1405e09700 1 -- 192.168.123.104:0/529529144 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14001068e0 con 0x7f1400105e50
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f1405e09700 1 --2- 192.168.123.104:0/529529144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f13f4009a90 tx=0x7f13f4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f3f554a08f3c6445 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f1404e07700 1 -- 192.168.123.104:0/529529144 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f13f4004030 con 0x7f1400105e50
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f1404e07700 1 -- 192.168.123.104:0/529529144 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f13f4004190 con 0x7f1400105e50
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f1404e07700 1 -- 192.168.123.104:0/529529144 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f13f4004320 con 0x7f1400105e50
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f140806d700 1 -- 192.168.123.104:0/529529144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 msgr2=0x7f1400106260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.347+0000 7f140806d700 1 --2- 192.168.123.104:0/529529144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f13f4009a90 tx=0x7f13f4009da0 comp rx=0 tx=0).stop
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 -- 192.168.123.104:0/529529144 shutdown_connections
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 --2- 192.168.123.104:0/529529144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f1400106260 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 -- 192.168.123.104:0/529529144 >> 192.168.123.104:0/529529144 conn(0x7f1400101420 msgr2=0x7f1400103830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000
7f140806d700 1 -- 192.168.123.104:0/529529144 shutdown_connections 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 -- 192.168.123.104:0/529529144 wait complete. 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 Processor -- start 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.348+0000 7f140806d700 1 -- start start 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f140806d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f140806d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f140007c1b0 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f1405e09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f1405e09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41016/0 (socket says 192.168.123.104:41016) 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:16:44.349+0000 7f1405e09700 1 -- 192.168.123.104:0/2474162311 learned_addr learned my addr 192.168.123.104:0/2474162311 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f1405e09700 1 -- 192.168.123.104:0/2474162311 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13f4009740 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.349+0000 7f1405e09700 1 --2- 192.168.123.104:0/2474162311 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f1400106ae0 tx=0x7f13f4004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.350+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f13f4004550 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.350+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f13f4020070 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.350+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f140007a2c0 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.350+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 
==== 214+0+0 (secure 0 0 0) 0x7f13f40036a0 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.350+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f140007a760 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.351+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f13f4022030 con 0x7f1400105e50 2026-03-10T06:16:44.452 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.351+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f13f401be40 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.351+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f140004fa50 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.353+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f14001068e0 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.391+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f1400062380 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: 
stderr 2026-03-10T06:16:44.395+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f13f401b450 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.395+0000 7f13f2ffd700 1 -- 192.168.123.104:0/2474162311 <== mon.0 v2:192.168.123.104:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7f13f402fc40 con 0x7f1400105e50 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.398+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 msgr2=0x7f140007bc70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.398+0000 7f140806d700 1 --2- 192.168.123.104:0/2474162311 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f1400106ae0 tx=0x7f13f4004750 comp rx=0 tx=0).stop 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.398+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 shutdown_connections 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.398+0000 7f140806d700 1 --2- 192.168.123.104:0/2474162311 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1400105e50 0x7f140007bc70 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.398+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 >> 192.168.123.104:0/2474162311 conn(0x7f1400101420 msgr2=0x7f140018d770 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.399+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 shutdown_connections 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.399+0000 7f140806d700 1 -- 192.168.123.104:0/2474162311 wait complete. 2026-03-10T06:16:44.453 INFO:teuthology.orchestra.run.vm04.stdout:Generating new minimal ceph.conf... 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.579+0000 7fc204bf9700 1 Processor -- start 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.579+0000 7fc204bf9700 1 -- start start 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc204bf9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc204bf9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc200108890 con 0x7fc200107f40 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc1fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc1fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 
l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41024/0 (socket says 192.168.123.104:41024) 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc1fe59c700 1 -- 192.168.123.104:0/241534552 learned_addr learned my addr 192.168.123.104:0/241534552 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.580+0000 7fc1fe59c700 1 -- 192.168.123.104:0/241534552 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc2001089d0 con 0x7fc200107f40 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.581+0000 7fc1fe59c700 1 --2- 192.168.123.104:0/241534552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc1e8009cf0 tx=0x7fc1e800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=13b44305706785f4 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.581+0000 7fc1fd59a700 1 -- 192.168.123.104:0/241534552 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1e8004030 con 0x7fc200107f40 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.581+0000 7fc1fd59a700 1 -- 192.168.123.104:0/241534552 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fc1e800b810 con 0x7fc200107f40 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.581+0000 7fc204bf9700 1 -- 192.168.123.104:0/241534552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 
msgr2=0x7fc200108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.581+0000 7fc204bf9700 1 --2- 192.168.123.104:0/241534552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc1e8009cf0 tx=0x7fc1e800b0e0 comp rx=0 tx=0).stop 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 -- 192.168.123.104:0/241534552 shutdown_connections 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 --2- 192.168.123.104:0/241534552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc200108350 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 -- 192.168.123.104:0/241534552 >> 192.168.123.104:0/241534552 conn(0x7fc200103770 msgr2=0x7fc200105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 -- 192.168.123.104:0/241534552 shutdown_connections 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 -- 192.168.123.104:0/241534552 wait complete. 
2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 Processor -- start 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.582+0000 7fc204bf9700 1 -- start start 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc204bf9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc1fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc1fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41032/0 (socket says 192.168.123.104:41032) 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc1fe59c700 1 -- 192.168.123.104:0/2606445562 learned_addr learned my addr 192.168.123.104:0/2606445562 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc200108890 con 0x7fc200107f40 2026-03-10T06:16:44.677 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc1fe59c700 1 -- 192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1e8009740 con 0x7fc200107f40 2026-03-10T06:16:44.677 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.583+0000 7fc1fe59c700 1 --2- 192.168.123.104:0/2606445562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc1e80116a0 tx=0x7fc1e8011780 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1e80119d0 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fc1e801a430 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1e80243f0 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc20019c400 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc204bf9700 1 -- 
192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc20019c8a0 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fc1e8024850 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.584+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc1e802da20 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.585+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc200195d30 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.586+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fc1e802dc30 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.623+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fc20004f9e0 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.623+0000 7fc1f77fe700 1 -- 192.168.123.104:0/2606445562 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 
==== 76+0+181 (secure 0 0 0) 0x7fc1e802ddd0 con 0x7fc200107f40 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.624+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 msgr2=0x7fc20019bec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.624+0000 7fc204bf9700 1 --2- 192.168.123.104:0/2606445562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc1e80116a0 tx=0x7fc1e8011780 comp rx=0 tx=0).stop 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.624+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 shutdown_connections 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.624+0000 7fc204bf9700 1 --2- 192.168.123.104:0/2606445562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc200107f40 0x7fc20019bec0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.625+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 >> 192.168.123.104:0/2606445562 conn(0x7fc200103770 msgr2=0x7fc200105f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.625+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 shutdown_connections 2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:44.625+0000 7fc204bf9700 1 -- 192.168.123.104:0/2606445562 wait complete. 
2026-03-10T06:16:44.678 INFO:teuthology.orchestra.run.vm04.stdout:Restarting the monitor... 2026-03-10T06:16:44.760 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 systemd[1]: Stopping Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:16:44.997 INFO:teuthology.orchestra.run.vm04.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[50770]: 2026-03-10T06:16:44.758+0000 7efe04161700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm04 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[50770]: 2026-03-10T06:16:44.758+0000 7efe04161700 -1 mon.vm04@0(leader) e1 *** Got Signal Terminated *** 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[50977]: 2026-03-10 06:16:44.769922542 +0000 UTC m=+0.024988462 container died cbf197e942e50b2e84fc44f88d186defe992894fc4ae5a36128345a065056962 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, GIT_CLEAN=True, RELEASE=HEAD, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T06:16:45.013 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[50977]: 2026-03-10 06:16:44.785336089 +0000 UTC m=+0.040402009 container remove cbf197e942e50b2e84fc44f88d186defe992894fc4ae5a36128345a065056962 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 bash[50977]: ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service: Deactivated successfully. 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 systemd[1]: Stopped Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 systemd[1]: Starting Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[51044]: 2026-03-10 06:16:44.950372804 +0000 UTC m=+0.018381311 container create 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, ceph=True, GIT_CLEAN=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD) 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[51044]: 2026-03-10 06:16:44.983735791 +0000 UTC m=+0.051744297 container init 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2) 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[51044]: 2026-03-10 06:16:44.987763419 +0000 UTC m=+0.055771926 container start 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, io.buildah.version=1.29.1, ceph=True, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD) 2026-03-10T06:16:45.013 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 bash[51044]: 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 2026-03-10T06:16:45.014 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 podman[51044]: 2026-03-10 06:16:44.943118889 +0000 UTC m=+0.011127406 image pull dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946 quay.io/ceph/ceph:v18.2.0 2026-03-10T06:16:45.014 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:44 vm04 systemd[1]: Started Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.148+0000 7ff08d203700 1 Processor -- start 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.149+0000 7ff08d203700 1 -- start start 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.149+0000 7ff08d203700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.149+0000 7ff08d203700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff088108890 con 0x7ff088107f40 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff086d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff086d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41034/0 (socket says 192.168.123.104:41034) 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff086d9d700 1 -- 192.168.123.104:0/982218932 learned_addr learned my addr 192.168.123.104:0/982218932 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff086d9d700 1 -- 192.168.123.104:0/982218932 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff0881089d0 con 0x7ff088107f40 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff086d9d700 1 --2- 192.168.123.104:0/982218932 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7ff07001ad90 tx=0x7ff07001c3d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e4f9f80a1a328de6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff085d9b700 1 -- 192.168.123.104:0/982218932 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff07001c9e0 con 0x7ff088107f40 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.150+0000 7ff085d9b700 1 -- 
192.168.123.104:0/982218932 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7ff070004030 con 0x7ff088107f40 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 -- 192.168.123.104:0/982218932 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 msgr2=0x7ff088108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 --2- 192.168.123.104:0/982218932 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7ff07001ad90 tx=0x7ff07001c3d0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 -- 192.168.123.104:0/982218932 shutdown_connections 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 --2- 192.168.123.104:0/982218932 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088107f40 0x7ff088108350 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 -- 192.168.123.104:0/982218932 >> 192.168.123.104:0/982218932 conn(0x7ff088103770 msgr2=0x7ff088105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 -- 192.168.123.104:0/982218932 shutdown_connections 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.151+0000 7ff08d203700 1 -- 192.168.123.104:0/982218932 wait complete. 
2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff08d203700 1 Processor -- start 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff08d203700 1 -- start start 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff08d203700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff08d203700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff088108890 con 0x7ff088193460 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff086d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff086d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41038/0 (socket says 192.168.123.104:41038) 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff086d9d700 1 -- 192.168.123.104:0/624239529 learned_addr learned my addr 192.168.123.104:0/624239529 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:45.257 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.152+0000 7ff086d9d700 1 -- 192.168.123.104:0/624239529 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff07001a7e0 con 0x7ff088193460 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff086d9d700 1 --2- 192.168.123.104:0/624239529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff07001c850 tx=0x7ff070004340 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:45.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff07001ce40 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff088193db0 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff088196a40 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7ff0700044a0 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff07ffff700 1 -- 
192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff070022430 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7ff070022910 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.153+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff070034da0 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.154+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff068005320 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.155+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7ff070052b00 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.194+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7ff068005cc0 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.198+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7ff07002c8e0 con 
0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.198+0000 7ff07ffff700 1 -- 192.168.123.104:0/624239529 <== mon.0 v2:192.168.123.104:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7ff070041030 con 0x7ff088193460 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 msgr2=0x7ff088193870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 --2- 192.168.123.104:0/624239529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff07001c850 tx=0x7ff070004340 comp rx=0 tx=0).stop 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 shutdown_connections 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 --2- 192.168.123.104:0/624239529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff088193460 0x7ff088193870 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 >> 192.168.123.104:0/624239529 conn(0x7ff088103770 msgr2=0x7ff08806b590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 
shutdown_connections 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.200+0000 7ff08d203700 1 -- 192.168.123.104:0/624239529 wait complete. 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:Creating mgr... 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T06:16:45.258 INFO:teuthology.orchestra.run.vm04.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-10T06:16:45.259 INFO:teuthology.orchestra.run.vm04.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-10T06:16:45.277 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable), process ceph-mon, pid 2 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: pidfile_write: ignore empty --pid-file 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: load: jerasure load: lrc 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: RocksDB version: 7.9.2 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Git sha 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Compile date 2023-08-03 19:21:13 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: DB SUMMARY 2026-03-10T06:16:45.278 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: DB Session ID: HI3TEGSH7D4D0YYBRD3A 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: CURRENT file: CURRENT 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm04/store.db dir, Total Num: 1, files: 000008.sst 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm04/store.db: 000009.log size: 89048 ; 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.error_if_exists: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.create_if_missing: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.paranoid_checks: 1 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.env: 0x562a3c7f1720 
2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.info_log: 0x562a3ec2d340 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.statistics: (nil) 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.use_fsync: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_log_file_size: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_fallocate: 1 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T06:16:45.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: 
Options.use_direct_reads: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.db_log_dir: 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.wal_dir: 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.write_buffer_manager: 0x562a3debc5a0 2026-03-10T06:16:45.279 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.unordered_write: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T06:16:45.279 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.row_cache: None 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.wal_filter: None 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.two_write_queues: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.wal_compression: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.atomic_flush: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T06:16:45.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.log_readahead_size: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.best_efforts_recovery: 0 
2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_background_jobs: 2 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_background_compactions: -1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_subcompactions: 1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 
2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_open_files: -1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_background_flushes: -1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Compression algorithms supported: 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kZSTD supported: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kXpressCompression supported: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: 
rocksdb: kLZ4HCCompression supported: 1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kZlibCompression supported: 1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kSnappyCompression supported: 1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kLZ4Compression supported: 1 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: kBZip2Compression supported: 0 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000010 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T06:16:45.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.merge_operator: 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_filter: None 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562a3ec2d460) 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_top_level_index_and_filter: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_type: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_index_type: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_shortening: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: checksum: 4 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: no_block_cache: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache: 0x562a3df3f350 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_name: BinnedLRUCache 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_options: 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: capacity : 536870912 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
num_shard_bits : 4 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: strict_capacity_limit : 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: high_pri_pool_ratio: 0.000 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_compressed: (nil) 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: persistent_cache: (nil) 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size: 4096 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size_deviation: 10 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_restart_interval: 16 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_block_restart_interval: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: metadata_block_size: 4096 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: partition_filters: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: use_delta_encoding: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: filter_policy: bloomfilter 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: whole_key_filtering: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: verify_compression: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: read_amp_bytes_per_bit: 0 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: format_version: 5 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_index_compression: 1 2026-03-10T06:16:45.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_align: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_auto_readahead_size: 262144 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout: prepopulate_block_cache: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout: initial_auto_readahead_size: 8192 2026-03-10T06:16:45.282 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression: NoCompression 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.num_levels: 7 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T06:16:45.282 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:16:45.282 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T06:16:45.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 
2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: 
Options.table_properties_collectors: 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.inplace_update_support: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.bloom_locality: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.max_successive_merges: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T06:16:45.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.ttl: 2592000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T06:16:45.284 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enable_blob_files: false 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.min_blob_size: 0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/version_set.cc:5566] Recovered from 
manifest file:/var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 408df411-08e0-4cc2-8e50-beb38e7e2a77 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123405019129, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123405020298, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": 
"", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773123405, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "408df411-08e0-4cc2-8e50-beb38e7e2a77", "db_session_id": "HI3TEGSH7D4D0YYBRD3A", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123405020339, "job": 1, "event": "recovery_finished"} 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/version_set.cc:5047] Creating manifest 15 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm04/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562a3dfdc000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: DB pointer 0x562a3dfc8000 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** DB Stats ** 2026-03-10T06:16:45.284 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: L0 2/0 84.57 KB 0.5 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 89.3 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Sum 2/0 84.57 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 89.3 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T06:16:45.284 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 89.3 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 89.3 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative compaction: 
0.00 GB write, 12.08 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval compaction: 0.00 GB write, 12.08 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache BinnedLRUCache@0x562a3df3f350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6e-06 secs_since: 0 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache entry stats(count,size,portion): FilterBlock(2,0.89 KB,0.000169873%) IndexBlock(2,0.41 KB,7.7486e-05%) Misc(1,0.00 KB,0%) 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: starting mon.vm04 rank 0 at public addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] at bind addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon_data /var/lib/ceph/mon/ceph-vm04 fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???) 
e1 preinit fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).mds e1 new map 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).mds e1 print_map 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: e1 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: legacy client fscid: -1 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout: No filesystems configured 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mon.vm04@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: 
mon.vm04@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expand map: {default=false} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta from 'false' to 'false' 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expanded map: {default=false} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expand map: {default=info} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta from 'info' to 'info' 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expanded map: {default=info} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expand map: {default=daemon} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta from 'daemon' to 'daemon' 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expanded map: {default=daemon} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expand map: {default=debug} 2026-03-10T06:16:45.285 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta from 'debug' to 'debug' 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: expand_channel_meta expanded map: {default=debug} 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: 
mon.vm04 is new leader, mons vm04 in quorum (ranks 0) 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: monmap e1: 1 mons at {vm04=[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]} removed_ranks: {} 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: fsmap 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: osdmap e1: 0 total, 0 up, 0 in 2026-03-10T06:16:45.286 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:45 vm04 ceph-mon[51058]: mgrmap e1: no daemons active 2026-03-10T06:16:45.413 INFO:teuthology.orchestra.run.vm04.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-9c59102a-1c48-11f1-b618-035af535377d@mgr.vm04.exdvdb 2026-03-10T06:16:45.413 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Failed to reset failed state of unit ceph-9c59102a-1c48-11f1-b618-035af535377d@mgr.vm04.exdvdb.service: Unit ceph-9c59102a-1c48-11f1-b618-035af535377d@mgr.vm04.exdvdb.service not loaded. 2026-03-10T06:16:45.539 INFO:teuthology.orchestra.run.vm04.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d.target.wants/ceph-9c59102a-1c48-11f1-b618-035af535377d@mgr.vm04.exdvdb.service → /etc/systemd/system/ceph-9c59102a-1c48-11f1-b618-035af535377d@.service. 2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:firewalld does not appear to be present 2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:Not possible to enable service . firewalld.service is not available 2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:firewalld does not appear to be present 2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mgr to start... 
2026-03-10T06:16:45.721 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mgr... 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsid": "9c59102a-1c48-11f1-b618-035af535377d", 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 0 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "vm04" 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 
"min_mon_release_name": "reef", 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T06:16:45.997 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 
2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modified": 
"2026-03-10T06:16:43.937299+0000", 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.892+0000 7f104544f700 1 Processor -- start 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.893+0000 7f104544f700 1 -- start start 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.893+0000 7f104544f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.893+0000 7f104544f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1040071d60 con 0x7f1040071410 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.894+0000 7f103ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.894+0000 7f103ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41066/0 (socket says 192.168.123.104:41066) 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.894+0000 7f103ffff700 1 -- 192.168.123.104:0/2320849913 learned_addr learned my addr 192.168.123.104:0/2320849913 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.895+0000 7f103ffff700 1 -- 192.168.123.104:0/2320849913 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1040071ea0 con 0x7f1040071410 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.895+0000 7f103ffff700 1 --2- 192.168.123.104:0/2320849913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f103000d180 tx=0x7f103000d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=35075edb19ac35ff server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.898+0000 7f103effd700 1 -- 192.168.123.104:0/2320849913 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1030010070 con 0x7f1040071410 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.898+0000 7f103effd700 1 -- 192.168.123.104:0/2320849913 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f1030004030 con 0x7f1040071410 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.898+0000 7f104544f700 1 -- 192.168.123.104:0/2320849913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 msgr2=0x7f1040071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.898+0000 7f104544f700 1 --2- 192.168.123.104:0/2320849913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f103000d180 tx=0x7f103000d490 comp rx=0 tx=0).stop 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.899+0000 7f104544f700 1 -- 192.168.123.104:0/2320849913 shutdown_connections 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.899+0000 7f104544f700 1 --2- 192.168.123.104:0/2320849913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1040071410 0x7f1040071820 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.899+0000 7f104544f700 1 -- 192.168.123.104:0/2320849913 >> 192.168.123.104:0/2320849913 conn(0x7f104006c9d0 msgr2=0x7f104006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.899+0000 7f104544f700 1 -- 192.168.123.104:0/2320849913 shutdown_connections 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.899+0000 7f104544f700 1 -- 192.168.123.104:0/2320849913 wait complete. 
2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f104544f700 1 Processor -- start 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f104544f700 1 -- start start 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f104544f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:45.998 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f104544f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1030003bb0 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f103ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f103ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41070/0 (socket says 192.168.123.104:41070) 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.900+0000 7f103ffff700 1 -- 192.168.123.104:0/4224146068 learned_addr learned my addr 192.168.123.104:0/4224146068 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:45.999 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.901+0000 7f103ffff700 1 -- 192.168.123.104:0/4224146068 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10300087c0 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.901+0000 7f103ffff700 1 --2- 192.168.123.104:0/4224146068 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f1030008c10 tx=0x7f1030008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.902+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1030010050 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.903+0000 7f104544f700 1 -- 192.168.123.104:0/4224146068 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10401a0860 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.903+0000 7f104544f700 1 -- 192.168.123.104:0/4224146068 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10401a14e0 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.903+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f1030004620 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.904+0000 7f103d7fa700 1 
-- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1030016440 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.904+0000 7f104544f700 1 -- 192.168.123.104:0/4224146068 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f102c005320 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.906+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f10300165a0 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.906+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f1030006ca0 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.906+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f103001b070 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.946+0000 7f104544f700 1 -- 192.168.123.104:0/4224146068 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f102c005190 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.947+0000 7f103d7fa700 1 -- 192.168.123.104:0/4224146068 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 
v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f1030004480 con 0x7f104019ff10 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.950+0000 7f1026ffd700 1 -- 192.168.123.104:0/4224146068 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 msgr2=0x7f10401a0320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.950+0000 7f1026ffd700 1 --2- 192.168.123.104:0/4224146068 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f1030008c10 tx=0x7f1030008cf0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.950+0000 7f1026ffd700 1 -- 192.168.123.104:0/4224146068 shutdown_connections 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.950+0000 7f1026ffd700 1 --2- 192.168.123.104:0/4224146068 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f104019ff10 0x7f10401a0320 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.951+0000 7f1026ffd700 1 -- 192.168.123.104:0/4224146068 >> 192.168.123.104:0/4224146068 conn(0x7f104006c9d0 msgr2=0x7f104006d450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.951+0000 7f1026ffd700 1 -- 192.168.123.104:0/4224146068 shutdown_connections 2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:45.951+0000 7f1026ffd700 1 -- 192.168.123.104:0/4224146068 wait complete. 
2026-03-10T06:16:45.999 INFO:teuthology.orchestra.run.vm04.stdout:mgr not available, waiting (1/15)... 2026-03-10T06:16:46.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:46 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/624239529' entity='client.admin' 2026-03-10T06:16:46.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:46 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/4224146068' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsid": "9c59102a-1c48-11f1-b618-035af535377d", 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 0 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "vm04" 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: 
stdout ], 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T06:16:48.247 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T06:16:48.249 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 
2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T06:16:43.937299+0000", 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:48.249 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.166+0000 7f1572d6f700 1 Processor -- start 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.166+0000 7f1572d6f700 1 -- start start 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.166+0000 7f1572d6f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.166+0000 7f1572d6f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f156c07d130 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.167+0000 7f1570b0b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.167+0000 7f1570b0b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41084/0 (socket says 192.168.123.104:41084) 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.167+0000 7f1570b0b700 1 -- 192.168.123.104:0/3476745991 learned_addr learned my addr 192.168.123.104:0/3476745991 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.167+0000 7f1570b0b700 1 -- 192.168.123.104:0/3476745991 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f156c07d270 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.167+0000 7f1570b0b700 1 --2- 192.168.123.104:0/3476745991 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f155c009a90 tx=0x7f155c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=80b16dbd7e7400d4 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f156b7fe700 1 -- 192.168.123.104:0/3476745991 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f155c004030 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f156b7fe700 1 -- 192.168.123.104:0/3476745991 <== mon.0 v2:192.168.123.104:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f155c00b7e0 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 -- 192.168.123.104:0/3476745991 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 msgr2=0x7f156c07cbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 --2- 192.168.123.104:0/3476745991 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f155c009a90 tx=0x7f155c009da0 comp rx=0 tx=0).stop 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 -- 192.168.123.104:0/3476745991 shutdown_connections 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 --2- 192.168.123.104:0/3476745991 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c07cbf0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 -- 192.168.123.104:0/3476745991 >> 192.168.123.104:0/3476745991 conn(0x7f156c07b220 msgr2=0x7f156c07b620 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.168+0000 7f1572d6f700 1 -- 192.168.123.104:0/3476745991 shutdown_connections 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.169+0000 7f1572d6f700 1 -- 192.168.123.104:0/3476745991 wait complete. 
2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.169+0000 7f1572d6f700 1 Processor -- start 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.169+0000 7f1572d6f700 1 -- start start 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.169+0000 7f1572d6f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.169+0000 7f1572d6f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f156c07d130 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1570b0b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1570b0b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41092/0 (socket says 192.168.123.104:41092) 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1570b0b700 1 -- 192.168.123.104:0/751969818 learned_addr learned my addr 192.168.123.104:0/751969818 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:48.250 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1570b0b700 1 -- 192.168.123.104:0/751969818 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f155c009740 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1570b0b700 1 --2- 192.168.123.104:0/751969818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f155c00bdf0 tx=0x7f155c00bed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f155c003f40 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.170+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f156c1a07f0 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.171+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f156c1a0c90 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.171+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f155c004540 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.171+0000 7f1572d6f700 1 -- 
192.168.123.104:0/751969818 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f156c1999a0 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.171+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f155c024de0 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.173+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f155c01b440 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.173+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f155c02e430 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.173+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f155c01f070 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.212+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f156c02d050 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.212+0000 7f1569ffb700 1 -- 192.168.123.104:0/751969818 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 
==== 79+0+1241 (secure 0 0 0) 0x7f156c02d050 con 0x7f156c07c7e0 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.214+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 msgr2=0x7f156c1a02b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.214+0000 7f1572d6f700 1 --2- 192.168.123.104:0/751969818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f155c00bdf0 tx=0x7f155c00bed0 comp rx=0 tx=0).stop 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.215+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 shutdown_connections 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.215+0000 7f1572d6f700 1 --2- 192.168.123.104:0/751969818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f156c07c7e0 0x7f156c1a02b0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.215+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 >> 192.168.123.104:0/751969818 conn(0x7f156c07b220 msgr2=0x7f156c106d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.215+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 shutdown_connections 2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:48.215+0000 7f1572d6f700 1 -- 192.168.123.104:0/751969818 wait complete. 
2026-03-10T06:16:48.250 INFO:teuthology.orchestra.run.vm04.stdout:mgr not available, waiting (2/15)... 2026-03-10T06:16:48.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:48 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/751969818' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsid": "9c59102a-1c48-11f1-b618-035af535377d", 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 0 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "vm04" 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-10T06:16:50.493 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T06:16:50.493 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T06:16:50.495 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T06:16:50.495 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 
2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T06:16:43.937299+0000", 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.405+0000 7f3b6c608700 1 Processor -- start 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.406+0000 7f3b6c608700 1 -- start start 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.406+0000 7f3b6c608700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.406+0000 7f3b6c608700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b64108770 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.407+0000 7f3b6a3a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:50.496 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.407+0000 7f3b6a3a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41108/0 (socket says 192.168.123.104:41108) 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.407+0000 7f3b6a3a4700 1 -- 192.168.123.104:0/719195511 learned_addr learned my addr 192.168.123.104:0/719195511 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.407+0000 7f3b6a3a4700 1 -- 192.168.123.104:0/719195511 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b641088b0 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.409+0000 7f3b6a3a4700 1 --2- 192.168.123.104:0/719195511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3b58009a90 tx=0x7f3b58009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=78136da2a6b9dd88 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.409+0000 7f3b693a2700 1 -- 192.168.123.104:0/719195511 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b58004030 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.410+0000 7f3b693a2700 1 -- 192.168.123.104:0/719195511 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3b5800b7e0 con 0x7f3b64107e20 
2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.410+0000 7f3b693a2700 1 -- 192.168.123.104:0/719195511 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b58004030 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.410+0000 7f3b6c608700 1 -- 192.168.123.104:0/719195511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 msgr2=0x7f3b64108230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.410+0000 7f3b6c608700 1 --2- 192.168.123.104:0/719195511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3b58009a90 tx=0x7f3b58009da0 comp rx=0 tx=0).stop 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.412+0000 7f3b6c608700 1 -- 192.168.123.104:0/719195511 shutdown_connections 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.412+0000 7f3b6c608700 1 --2- 192.168.123.104:0/719195511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64108230 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.412+0000 7f3b6c608700 1 -- 192.168.123.104:0/719195511 >> 192.168.123.104:0/719195511 conn(0x7f3b641033d0 msgr2=0x7f3b64105800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.412+0000 7f3b6c608700 1 -- 192.168.123.104:0/719195511 shutdown_connections 2026-03-10T06:16:50.496 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.412+0000 7f3b6c608700 1 -- 192.168.123.104:0/719195511 wait complete. 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6c608700 1 Processor -- start 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6c608700 1 -- start start 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6c608700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6c608700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b64197c60 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6a3a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6a3a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41120/0 (socket says 192.168.123.104:41120) 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.413+0000 7f3b6a3a4700 1 -- 192.168.123.104:0/117620350 learned_addr 
learned my addr 192.168.123.104:0/117620350 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.414+0000 7f3b6a3a4700 1 -- 192.168.123.104:0/117620350 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b58009740 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.414+0000 7f3b6a3a4700 1 --2- 192.168.123.104:0/117620350 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3b580043c0 tx=0x7f3b580044a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.415+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b58004030 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.415+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3b5800bb60 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.415+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b64197e60 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.415+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3b58011420 con 0x7f3b64107e20 2026-03-10T06:16:50.496 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.416+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b64198300 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.416+0000 7f3b557fa700 1 -- 192.168.123.104:0/117620350 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b4c0052f0 con 0x7f3b64107e20 2026-03-10T06:16:50.496 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.418+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 2) v1 ==== 44835+0+0 (secure 0 0 0) 0x7f3b5800bcd0 con 0x7f3b64107e20 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.418+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f3b5804bed0 con 0x7f3b64107e20 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.418+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f3b580186d0 con 0x7f3b64107e20 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.459+0000 7f3b557fa700 1 -- 192.168.123.104:0/117620350 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f3b4c0059c0 con 0x7f3b64107e20 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.460+0000 7f3b577fe700 1 -- 192.168.123.104:0/117620350 <== 
mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f3b58018960 con 0x7f3b64107e20 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.462+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 msgr2=0x7f3b64197720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.462+0000 7f3b6c608700 1 --2- 192.168.123.104:0/117620350 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3b580043c0 tx=0x7f3b580044a0 comp rx=0 tx=0).stop 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.462+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 shutdown_connections 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.462+0000 7f3b6c608700 1 --2- 192.168.123.104:0/117620350 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3b64107e20 0x7f3b64197720 secure :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3b580043c0 tx=0x7f3b580044a0 comp rx=0 tx=0).stop 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.462+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 >> 192.168.123.104:0/117620350 conn(0x7f3b641033d0 msgr2=0x7f3b64104090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.463+0000 7f3b6c608700 1 -- 192.168.123.104:0/117620350 shutdown_connections 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:50.463+0000 7f3b6c608700 1 -- 
192.168.123.104:0/117620350 wait complete. 2026-03-10T06:16:50.497 INFO:teuthology.orchestra.run.vm04.stdout:mgr not available, waiting (3/15)... 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: Activating manager daemon vm04.exdvdb 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: mgrmap e2: vm04.exdvdb(active, starting, since 0.00519247s) 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 
2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='mgr.14100 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:16:51.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:51 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/117620350' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T06:16:52.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:52 vm04 ceph-mon[51058]: mgrmap e3: vm04.exdvdb(active, since 1.01032s) 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsid": "9c59102a-1c48-11f1-b618-035af535377d", 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.823 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 0 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "vm04" 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "quorum_age": 7, 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.823 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T06:16:52.824 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:52.824 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T06:16:52.824 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T06:16:52.824 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T06:16:52.824 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 
"osd_in_since": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T06:16:52.826 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T06:16:52.826 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ], 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T06:16:43.937299+0000", 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout }, 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.628+0000 7f15a8ba3700 1 Processor -- start 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.629+0000 7f15a8ba3700 1 -- start start 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.629+0000 7f15a8ba3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f15a407acf0 0x7f15a40791f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.629+0000 7f15a8ba3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15a4079730 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.630+0000 7f15a259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a40791f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.630+0000 7f15a259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a40791f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41162/0 (socket says 192.168.123.104:41162) 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.630+0000 7f15a259c700 1 -- 192.168.123.104:0/1726540836 learned_addr learned my addr 192.168.123.104:0/1726540836 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.630+0000 7f15a259c700 1 -- 192.168.123.104:0/1726540836 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15a4079870 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.630+0000 7f15a259c700 1 --2- 192.168.123.104:0/1726540836 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a40791f0 
secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f158c009a90 tx=0x7f158c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=79856ed3684a0888 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.631+0000 7f15a159a700 1 -- 192.168.123.104:0/1726540836 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f158c004030 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.631+0000 7f15a159a700 1 -- 192.168.123.104:0/1726540836 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f158c00b7e0 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.631+0000 7f15a159a700 1 -- 192.168.123.104:0/1726540836 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f158c0039f0 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.631+0000 7f15a8ba3700 1 -- 192.168.123.104:0/1726540836 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 msgr2=0x7f15a40791f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.631+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/1726540836 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a40791f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f158c009a90 tx=0x7f158c009da0 comp rx=0 tx=0).stop 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.632+0000 7f15a8ba3700 1 -- 192.168.123.104:0/1726540836 shutdown_connections 2026-03-10T06:16:52.827 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.632+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/1726540836 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a40791f0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.632+0000 7f15a8ba3700 1 -- 192.168.123.104:0/1726540836 >> 192.168.123.104:0/1726540836 conn(0x7f15a41013a0 msgr2=0x7f15a41037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.632+0000 7f15a8ba3700 1 -- 192.168.123.104:0/1726540836 shutdown_connections 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.632+0000 7f15a8ba3700 1 -- 192.168.123.104:0/1726540836 wait complete. 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a8ba3700 1 Processor -- start 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a8ba3700 1 -- start start 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a8ba3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a8ba3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15a419c170 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a259c700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41168/0 (socket says 192.168.123.104:41168) 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a259c700 1 -- 192.168.123.104:0/862957842 learned_addr learned my addr 192.168.123.104:0/862957842 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.633+0000 7f15a259c700 1 -- 192.168.123.104:0/862957842 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f158c009740 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f15a259c700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f158c00be40 tx=0x7f158c00bf20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f158c0040d0 con 0x7f15a407acf0 2026-03-10T06:16:52.827 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15a419c370 con 0x7f15a407acf0 2026-03-10T06:16:52.827 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15a419c810 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f158c01a430 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.634+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f158c011420 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.635+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f158c011580 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.635+0000 7f159b7fe700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 0x7f159003a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.635+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 
725+0+0 (secure 0 0 0) 0x7f158c04cf60 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.636+0000 7f15a1d9b700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 0x7f159003a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.636+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1584005320 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.639+0000 7f159b7fe700 1 -- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f158c02b430 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.639+0000 7f15a1d9b700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 0x7f159003a920 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f1594006fd0 tx=0x7f1594006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.783+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f15840059f0 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.783+0000 7f159b7fe700 1 
-- 192.168.123.104:0/862957842 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7f158c0187c0 con 0x7f15a407acf0 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.787+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 msgr2=0x7f159003a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.787+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 0x7f159003a920 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f1594006fd0 tx=0x7f1594006e40 comp rx=0 tx=0).stop 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.787+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 msgr2=0x7f15a419bc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.787+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f158c00be40 tx=0x7f158c00bf20 comp rx=0 tx=0).stop 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 shutdown_connections 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1590038470 
0x7f159003a920 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 --2- 192.168.123.104:0/862957842 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a407acf0 0x7f15a419bc30 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 >> 192.168.123.104:0/862957842 conn(0x7f15a41013a0 msgr2=0x7f15a4101fa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 shutdown_connections 2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.788+0000 7f15a8ba3700 1 -- 192.168.123.104:0/862957842 wait complete. 
2026-03-10T06:16:52.828 INFO:teuthology.orchestra.run.vm04.stdout:mgr is available 2026-03-10T06:16:53.139 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout fsid = 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T06:16:53.140 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.963+0000 7fe172ab7700 1 Processor -- start 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe172ab7700 1 -- start start 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe172ab7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe172ab7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe16c105570 con 0x7fe16c104c20 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe170853700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe170853700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41180/0 (socket says 192.168.123.104:41180) 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.964+0000 7fe170853700 1 -- 192.168.123.104:0/2519968274 learned_addr learned my addr 192.168.123.104:0/2519968274 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:16:52.965+0000 7fe170853700 1 -- 192.168.123.104:0/2519968274 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe16c1056b0 con 0x7fe16c104c20 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.965+0000 7fe170853700 1 --2- 192.168.123.104:0/2519968274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe16000bf90 tx=0x7fe16000d5d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3f15404f1c0e097d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.966+0000 7fe16b7fe700 1 -- 192.168.123.104:0/2519968274 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe16000dcc0 con 0x7fe16c104c20 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.966+0000 7fe16b7fe700 1 -- 192.168.123.104:0/2519968274 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe16000de20 con 0x7fe16c104c20 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.966+0000 7fe16b7fe700 1 -- 192.168.123.104:0/2519968274 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe160010470 con 0x7fe16c104c20 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 -- 192.168.123.104:0/2519968274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 msgr2=0x7fe16c105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 --2- 
192.168.123.104:0/2519968274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe16000bf90 tx=0x7fe16000d5d0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 -- 192.168.123.104:0/2519968274 shutdown_connections 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 --2- 192.168.123.104:0/2519968274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c105030 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 -- 192.168.123.104:0/2519968274 >> 192.168.123.104:0/2519968274 conn(0x7fe16c100270 msgr2=0x7fe16c1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:53.140 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 -- 192.168.123.104:0/2519968274 shutdown_connections 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.967+0000 7fe172ab7700 1 -- 192.168.123.104:0/2519968274 wait complete. 
2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.968+0000 7fe172ab7700 1 Processor -- start 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.968+0000 7fe172ab7700 1 -- start start 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.968+0000 7fe172ab7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.968+0000 7fe172ab7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe16c19c190 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.969+0000 7fe170853700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.969+0000 7fe170853700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41182/0 (socket says 192.168.123.104:41182) 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.969+0000 7fe170853700 1 -- 192.168.123.104:0/564684997 learned_addr learned my addr 192.168.123.104:0/564684997 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:53.141 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.969+0000 7fe170853700 1 -- 192.168.123.104:0/564684997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe16000b9e0 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.969+0000 7fe170853700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe16000b340 tx=0x7fe1600119e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.970+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe1600104e0 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.970+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe160004580 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.970+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe16c19c390 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.970+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe16c19c830 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.971+0000 7fe169ffb700 1 -- 
192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe160019e40 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.971+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe158005320 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.974+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7fe160018420 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.974+0000 7fe169ffb700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 0x7fe15403a5a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.974+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fe16004d450 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.975+0000 7fe16bfff700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 0x7fe15403a5a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.975+0000 7fe16bfff700 1 --2- 192.168.123.104:0/564684997 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 0x7fe15403a5a0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fe15c00ad30 tx=0x7fe15c0093f0 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:52.975+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe16000f3d0 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.079+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fe158005f70 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.082+0000 7fe169ffb700 1 -- 192.168.123.104:0/564684997 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7fe16001f070 con 0x7fe16c104c20 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.085+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 msgr2=0x7fe15403a5a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.085+0000 7fe172ab7700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 0x7fe15403a5a0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fe15c00ad30 tx=0x7fe15c0093f0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.141 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 msgr2=0x7fe16c19bc50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe16000b340 tx=0x7fe1600119e0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 shutdown_connections 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1540380f0 0x7fe15403a5a0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 --2- 192.168.123.104:0/564684997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe16c104c20 0x7fe16c19bc50 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.086+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 >> 192.168.123.104:0/564684997 conn(0x7fe16c100270 msgr2=0x7fe16c18e690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.087+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 shutdown_connections 
2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.087+0000 7fe172ab7700 1 -- 192.168.123.104:0/564684997 wait complete. 2026-03-10T06:16:53.141 INFO:teuthology.orchestra.run.vm04.stdout:Enabling cephadm module... 2026-03-10T06:16:53.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:53 vm04 ceph-mon[51058]: mgrmap e4: vm04.exdvdb(active, since 2s) 2026-03-10T06:16:53.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:53 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/862957842' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T06:16:53.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:53 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/564684997' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T06:16:54.471 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.271+0000 7fc1d59c9700 1 Processor -- start 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1d59c9700 1 -- start start 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1d59c9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1d59c9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1d0079730 con 0x7fc1d007acf0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1ceffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1ceffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41198/0 (socket says 192.168.123.104:41198) 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.272+0000 7fc1ceffd700 1 -- 192.168.123.104:0/2018462369 learned_addr learned my addr 192.168.123.104:0/2018462369 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.273+0000 7fc1ceffd700 1 -- 192.168.123.104:0/2018462369 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1d0079870 con 0x7fc1d007acf0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.273+0000 7fc1ceffd700 1 --2- 192.168.123.104:0/2018462369 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc1b8009a90 tx=0x7fc1b8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=16d1cf63f5c8439 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.274+0000 7fc1cdffb700 1 -- 192.168.123.104:0/2018462369 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1b8004030 con 0x7fc1d007acf0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.274+0000 7fc1cdffb700 1 -- 192.168.123.104:0/2018462369 
<== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc1b800b7e0 con 0x7fc1d007acf0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.274+0000 7fc1cdffb700 1 -- 192.168.123.104:0/2018462369 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1b80039f0 con 0x7fc1d007acf0 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.274+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2018462369 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 msgr2=0x7fc1d00791f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.274+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2018462369 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc1b8009a90 tx=0x7fc1b8009da0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.275+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2018462369 shutdown_connections 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.275+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2018462369 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d00791f0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.472 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.275+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2018462369 >> 192.168.123.104:0/2018462369 conn(0x7fc1d01013a0 msgr2=0x7fc1d01037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:16:53.275+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2018462369 shutdown_connections 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.275+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2018462369 wait complete. 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1d59c9700 1 Processor -- start 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1d59c9700 1 -- start start 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1d59c9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1d59c9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1d019c170 con 0x7fc1d007acf0 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1ceffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:54.475 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1ceffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41200/0 (socket says 192.168.123.104:41200) 2026-03-10T06:16:54.475 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.276+0000 7fc1ceffd700 1 -- 192.168.123.104:0/2861984136 learned_addr learned my addr 192.168.123.104:0/2861984136 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1ceffd700 1 -- 192.168.123.104:0/2861984136 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1b8009740 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1ceffd700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fc1b800be40 tx=0x7fc1b800bf20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1b80040d0 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc1b801a430 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1b8011420 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1d59c9700 1 -- 
192.168.123.104:0/2861984136 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc1d019c370 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.277+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc1d019c810 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.279+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7fc1b8011580 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.279+0000 7fc1d49c7700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 0x7fc1bc03a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.279+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc1b804cfa0 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.279+0000 7fc1ce7fc700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 0x7fc1bc03a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.280+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc1d0062380 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.280+0000 7fc1ce7fc700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 0x7fc1bc03a920 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc1c0006fd0 tx=0x7fc1c0006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.283+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc1b802b430 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:53.416+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fc1d019f280 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.425+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fc1b802b7b0 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.427+0000 7fc1d49c7700 1 -- 192.168.123.104:0/2861984136 <== mon.0 v2:192.168.123.104:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7fc1b801aa30 con 0x7fc1d007acf0 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 
>> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 msgr2=0x7fc1bc03a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 0x7fc1bc03a920 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc1c0006fd0 tx=0x7fc1c0006e40 comp rx=0 tx=0).stop 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 msgr2=0x7fc1d019bc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fc1b800be40 tx=0x7fc1b800bf20 comp rx=0 tx=0).stop 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 shutdown_connections 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc1bc038470 0x7fc1bc03a920 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 --2- 192.168.123.104:0/2861984136 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc1d007acf0 0x7fc1d019bc30 unknown :-1 
s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 >> 192.168.123.104:0/2861984136 conn(0x7fc1d01013a0 msgr2=0x7fc1d0101fa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.430+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 shutdown_connections 2026-03-10T06:16:54.476 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.431+0000 7fc1d59c9700 1 -- 192.168.123.104:0/2861984136 wait complete. 2026-03-10T06:16:54.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:54 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2861984136' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "active_name": "vm04.exdvdb", 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.603+0000 7fafac3b0700 1 Processor -- start 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.603+0000 7fafac3b0700 1 -- start start 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.604+0000 7fafac3b0700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.834 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.604+0000 7fafac3b0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafa4106550 con 0x7fafa4105c00 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.604+0000 7fafaa14c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.604+0000 7fafaa14c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46530/0 (socket says 192.168.123.104:46530) 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.604+0000 7fafaa14c700 1 -- 192.168.123.104:0/3948089110 learned_addr learned my addr 192.168.123.104:0/3948089110 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.605+0000 7fafaa14c700 1 -- 192.168.123.104:0/3948089110 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafa4106690 con 0x7fafa4105c00 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.605+0000 7fafaa14c700 1 --2- 192.168.123.104:0/3948089110 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fafa0009a90 tx=0x7fafa0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a2dc36c9c3b3a6fd server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.605+0000 7fafa914a700 1 -- 192.168.123.104:0/3948089110 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafa0004030 con 0x7fafa4105c00 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.605+0000 7fafa914a700 1 -- 192.168.123.104:0/3948089110 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fafa000b7e0 con 0x7fafa4105c00 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 -- 192.168.123.104:0/3948089110 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 msgr2=0x7fafa4106010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 --2- 192.168.123.104:0/3948089110 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fafa0009a90 tx=0x7fafa0009da0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 -- 192.168.123.104:0/3948089110 shutdown_connections 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 --2- 192.168.123.104:0/3948089110 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa4105c00 0x7fafa4106010 unknown :-1 s=CLOSED 
pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 -- 192.168.123.104:0/3948089110 >> 192.168.123.104:0/3948089110 conn(0x7fafa4101250 msgr2=0x7fafa4103680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 -- 192.168.123.104:0/3948089110 shutdown_connections 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.606+0000 7fafac3b0700 1 -- 192.168.123.104:0/3948089110 wait complete. 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.607+0000 7fafac3b0700 1 Processor -- start 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.607+0000 7fafac3b0700 1 -- start start 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.607+0000 7fafac3b0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.607+0000 7fafac3b0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafa4106550 con 0x7fafa407bc30 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.608+0000 7fafaa14c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:16:54.835 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.608+0000 7fafaa14c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46538/0 (socket says 192.168.123.104:46538) 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.608+0000 7fafaa14c700 1 -- 192.168.123.104:0/1609401590 learned_addr learned my addr 192.168.123.104:0/1609401590 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.608+0000 7fafaa14c700 1 -- 192.168.123.104:0/1609401590 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafa0009740 con 0x7fafa407bc30 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.608+0000 7fafaa14c700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fafa000bdb0 tx=0x7fafa000be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.612+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafa0003ec0 con 0x7fafa407bc30 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.612+0000 7fafac3b0700 1 -- 192.168.123.104:0/1609401590 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fafa407c040 con 0x7fafa407bc30 
2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.612+0000 7fafac3b0700 1 -- 192.168.123.104:0/1609401590 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fafa407a9d0 con 0x7fafa407bc30 2026-03-10T06:16:54.835 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.613+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fafa00044c0 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.613+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafa001ace0 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.613+0000 7fafac3b0700 1 -- 192.168.123.104:0/1609401590 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf88005320 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.614+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fafa001a5f0 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.614+0000 7faf9b7fe700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 0x7faf9003aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.614+0000 7fafa994b700 1 -- 192.168.123.104:0/1609401590 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 msgr2=0x7faf9003aa10 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.614+0000 7fafa994b700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 0x7faf9003aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.615+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fafa004b9e0 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.622+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fafa0004630 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.758+0000 7fafac3b0700 1 -- 192.168.123.104:0/1609401590 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7faf88006200 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.758+0000 7faf9b7fe700 1 -- 192.168.123.104:0/1609401590 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7fafa0049070 con 0x7fafa407bc30 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.760+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 msgr2=0x7faf9003aa10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.760+0000 7faf997fa700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 0x7faf9003aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.760+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 msgr2=0x7fafa407a2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.760+0000 7faf997fa700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fafa000bdb0 tx=0x7fafa000be90 comp rx=0 tx=0).stop 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.761+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 shutdown_connections 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.761+0000 7faf997fa700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7faf90038560 0x7faf9003aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.761+0000 7faf997fa700 1 --2- 192.168.123.104:0/1609401590 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafa407bc30 0x7fafa407a2c0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.761+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 >> 192.168.123.104:0/1609401590 conn(0x7fafa4101250 msgr2=0x7fafa4101cc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.762+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 shutdown_connections 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.762+0000 7faf997fa700 1 -- 192.168.123.104:0/1609401590 wait complete. 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for the mgr to restart... 2026-03-10T06:16:54.836 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mgr epoch 5... 2026-03-10T06:16:55.870 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:55 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2861984136' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-10T06:16:55.870 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:55 vm04 ceph-mon[51058]: mgrmap e5: vm04.exdvdb(active, since 4s) 2026-03-10T06:16:55.870 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:55 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/1609401590' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T06:16:59.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: Activating manager daemon vm04.exdvdb 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: osdmap e2: 0 total, 0 up, 0 in 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: mgrmap e6: vm04.exdvdb(active, starting, since 0.00614017s) 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:16:59.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:16:59 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f738216a700 1 Processor -- start 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f738216a700 1 -- start start 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f738216a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f738216a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f737c10ee60 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f737b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f737b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46548/0 (socket says 192.168.123.104:46548) 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.991+0000 7f737b7fe700 1 -- 192.168.123.104:0/1902714373 learned_addr learned my addr 192.168.123.104:0/1902714373 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.992+0000 7f737b7fe700 1 -- 192.168.123.104:0/1902714373 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f737c10efa0 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.992+0000 7f737b7fe700 1 --2- 192.168.123.104:0/1902714373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f736c009a90 tx=0x7f736c009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=fc71242c5e3b0f71 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f737a7fc700 1 -- 192.168.123.104:0/1902714373 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f736c004030 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f737a7fc700 1 -- 192.168.123.104:0/1902714373 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f736c00b7e0 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f738216a700 1 -- 192.168.123.104:0/1902714373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 msgr2=0x7f737c10e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f738216a700 1 --2- 192.168.123.104:0/1902714373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f736c009a90 tx=0x7f736c009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f738216a700 1 -- 192.168.123.104:0/1902714373 shutdown_connections 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f738216a700 1 --2- 192.168.123.104:0/1902714373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c10e920 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.994+0000 7f738216a700 1 -- 
192.168.123.104:0/1902714373 >> 192.168.123.104:0/1902714373 conn(0x7f737c06ed20 msgr2=0x7f737c071150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 -- 192.168.123.104:0/1902714373 shutdown_connections 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 -- 192.168.123.104:0/1902714373 wait complete. 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 Processor -- start 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 -- start start 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.995+0000 7f738216a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f737c10ee60 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.996+0000 7f737b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.996+0000 7f737b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46560/0 (socket says 192.168.123.104:46560) 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.996+0000 7f737b7fe700 1 -- 192.168.123.104:0/1264802732 learned_addr learned my addr 192.168.123.104:0/1264802732 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.996+0000 7f737b7fe700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f736c009740 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.996+0000 7f737b7fe700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f736c003e90 tx=0x7f736c003f70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.997+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f736c0043d0 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.997+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f736c004530 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.997+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7f736c011620 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.997+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f737c1a05a0 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.997+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f737c1a09c0 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.998+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f736c028020 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.998+0000 7f7378ff9700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.998+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f736403f480 con 0x7f736403c8c0 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.998+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f736c04bd90 con 0x7f737c10e510 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.999+0000 7f737affd700 1 -- 
192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:54.999+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:55.199+0000 7f737affd700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:55.199+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:55.600+0000 7f737affd700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:55.600+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._fault waiting 0.800000 2026-03-10T06:17:00.291 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:56.401+0000 7f737affd700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:56.401+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:58.003+0000 7f737affd700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:58.003+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:59.247+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mgrmap(e 6) v1 ==== 44846+0+0 (secure 0 0 0) 0x7f736c01a430 con 0x7f737c10e510 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:59.247+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 
msgr2=0x7f736403ed70 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:16:59.247+0000 7f7378ff9700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.250+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f736c04d4d0 con 0x7f737c10e510 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.250+0000 7f7378ff9700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.250+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f736403f480 con 0x7f736403c8c0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.252+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.253+0000 7f737affd700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 
0x7f736403ed70 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f7370003a10 tx=0x7f73700092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.254+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f736403f480 con 0x7f736403c8c0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.258+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f737c1a1380 con 0x7f736403c8c0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f7378ff9700 1 -- 192.168.123.104:0/1264802732 <== mgr.14120 v2:192.168.123.104:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f737c1a1380 con 0x7f736403c8c0 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 msgr2=0x7f736403ed70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f7370003a10 tx=0x7f73700092b0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f737c10e510 msgr2=0x7f737c1a0000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f736c003e90 tx=0x7f736c003f70 comp rx=0 tx=0).stop 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 shutdown_connections 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f736403c8c0 0x7f736403ed70 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 --2- 192.168.123.104:0/1264802732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f737c10e510 0x7f737c1a0000 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.259+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 >> 192.168.123.104:0/1264802732 conn(0x7f737c06ed20 msgr2=0x7f737c0705c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.260+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 shutdown_connections 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.260+0000 7f738216a700 1 -- 192.168.123.104:0/1264802732 wait complete. 
2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:mgr epoch 5 is available 2026-03-10T06:17:00.292 INFO:teuthology.orchestra.run.vm04.stdout:Setting orchestrator backend to cephadm... 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: Found migration_current of "None". Setting to last migration. 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:17:00.552 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:00 vm04 ceph-mon[51058]: mgrmap e7: vm04.exdvdb(active, since 1.00957s) 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.421+0000 7efe547da700 1 Processor -- start 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.421+0000 7efe547da700 1 -- start start 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.421+0000 
7efe547da700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.421+0000 7efe547da700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe4c105580 con 0x7efe4c104c30 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.422+0000 7efe52576700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:00.598 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.422+0000 7efe52576700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46640/0 (socket says 192.168.123.104:46640) 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.422+0000 7efe52576700 1 -- 192.168.123.104:0/4154525371 learned_addr learned my addr 192.168.123.104:0/4154525371 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.423+0000 7efe52576700 1 -- 192.168.123.104:0/4154525371 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe4c1056c0 con 0x7efe4c104c30 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.423+0000 7efe52576700 1 --2- 192.168.123.104:0/4154525371 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7efe48009a90 tx=0x7efe48009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8bf1949f12e4ae0d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe51574700 1 -- 192.168.123.104:0/4154525371 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe48004030 con 0x7efe4c104c30 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe51574700 1 -- 192.168.123.104:0/4154525371 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7efe4800b7e0 con 0x7efe4c104c30 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe547da700 1 -- 192.168.123.104:0/4154525371 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 msgr2=0x7efe4c105040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe547da700 1 --2- 192.168.123.104:0/4154525371 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7efe48009a90 tx=0x7efe48009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe547da700 1 -- 192.168.123.104:0/4154525371 shutdown_connections 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe547da700 1 --2- 192.168.123.104:0/4154525371 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c104c30 0x7efe4c105040 unknown :-1 s=CLOSED 
pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.424+0000 7efe547da700 1 -- 192.168.123.104:0/4154525371 >> 192.168.123.104:0/4154525371 conn(0x7efe4c1002a0 msgr2=0x7efe4c1026b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.425+0000 7efe547da700 1 -- 192.168.123.104:0/4154525371 shutdown_connections 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.425+0000 7efe547da700 1 -- 192.168.123.104:0/4154525371 wait complete. 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.425+0000 7efe547da700 1 Processor -- start 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.425+0000 7efe547da700 1 -- start start 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe547da700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe547da700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe4c105580 con 0x7efe4c197980 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe52576700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:00.599 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe52576700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46654/0 (socket says 192.168.123.104:46654) 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe52576700 1 -- 192.168.123.104:0/2734241813 learned_addr learned my addr 192.168.123.104:0/2734241813 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.426+0000 7efe52576700 1 -- 192.168.123.104:0/2734241813 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe48009740 con 0x7efe4c197980 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe52576700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7efe48009130 tx=0x7efe4800be80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe4801a670 con 0x7efe4c197980 2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7efe4801ac70 con 0x7efe4c197980 
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe480044b0 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efe4c1982d0 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.427+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efe4c19af60 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.428+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7efe48011420 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.429+0000 7efe437fe700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 0x7efe3803a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.429+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7efe4804c650 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.429+0000 7efe51d75700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 0x7efe3803a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.430+0000 7efe51d75700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 0x7efe3803a7f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7efe3c006fd0 tx=0x7efe3c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.430+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efe4c04f9e0 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.433+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efe48004610 con 0x7efe4c197980
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.550+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7efe4c10aee0 con 0x7efe38038340
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.559+0000 7efe437fe700 1 -- 192.168.123.104:0/2734241813 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7efe4c10aee0 con 0x7efe38038340
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 msgr2=0x7efe3803a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 0x7efe3803a7f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7efe3c006fd0 tx=0x7efe3c006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 msgr2=0x7efe4c197d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7efe48009130 tx=0x7efe4800be80 comp rx=0 tx=0).stop
2026-03-10T06:17:00.599 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 shutdown_connections
2026-03-10T06:17:00.600 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efe38038340 0x7efe3803a7f0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.600 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 --2- 192.168.123.104:0/2734241813 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efe4c197980 0x7efe4c197d90 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.600 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 >> 192.168.123.104:0/2734241813 conn(0x7efe4c1002a0 msgr2=0x7efe4c1026b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:00.600 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 shutdown_connections
2026-03-10T06:17:00.600 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.564+0000 7efe547da700 1 -- 192.168.123.104:0/2734241813 wait complete.
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout value unchanged
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.731+0000 7f424bffb700 1 Processor -- start
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.731+0000 7f424bffb700 1 -- start start
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.731+0000 7f424bffb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.731+0000 7f424bffb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4244108890 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4249d97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4249d97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46666/0 (socket says 192.168.123.104:46666)
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4249d97700 1 -- 192.168.123.104:0/1388537937 learned_addr learned my addr 192.168.123.104:0/1388537937 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4249d97700 1 -- 192.168.123.104:0/1388537937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42441089d0 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4249d97700 1 --2- 192.168.123.104:0/1388537937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4234009cf0 tx=0x7f423400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b6f8e763425d9623 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4248d95700 1 -- 192.168.123.104:0/1388537937 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4234004030 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.732+0000 7f4248d95700 1 -- 192.168.123.104:0/1388537937 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f423400b810 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- 192.168.123.104:0/1388537937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 msgr2=0x7f4244108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 --2- 192.168.123.104:0/1388537937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4234009cf0 tx=0x7f423400b0e0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- 192.168.123.104:0/1388537937 shutdown_connections
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 --2- 192.168.123.104:0/1388537937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f4244108350 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- 192.168.123.104:0/1388537937 >> 192.168.123.104:0/1388537937 conn(0x7f4244103770 msgr2=0x7f4244105b50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- 192.168.123.104:0/1388537937 shutdown_connections
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- 192.168.123.104:0/1388537937 wait complete.
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 Processor -- start
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.733+0000 7f424bffb700 1 -- start start
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f424bffb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f424bffb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4244108890 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f4249d97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f4249d97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46674/0 (socket says 192.168.123.104:46674)
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f4249d97700 1 -- 192.168.123.104:0/1574706423 learned_addr learned my addr 192.168.123.104:0/1574706423 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f4249d97700 1 -- 192.168.123.104:0/1574706423 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4234009740 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f4249d97700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4234003f30 tx=0x7f4234004010 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f423400bed0 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f4234003710 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f423401ae20 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f424419c540 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.734+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f424419c9e0 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.735+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4244195ea0 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.735+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f4234004260 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.736+0000 7f423affd700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 0x7f423003a780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.736+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f423404b990 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.738+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4234018b40 con 0x7f4244107f40
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.738+0000 7f4249596700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 0x7f423003a780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:00.907 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.739+0000 7f4249596700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 0x7f423003a780 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f4240006fd0 tx=0x7f4240006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.844+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f42440008d0 con 0x7f42300382d0
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.845+0000 7f423affd700 1 -- 192.168.123.104:0/1574706423 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f42440008d0 con 0x7f42300382d0
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.848+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 msgr2=0x7f423003a780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.848+0000 7f424bffb700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 0x7f423003a780 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f4240006fd0 tx=0x7f4240006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 msgr2=0x7f424419c000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4234003f30 tx=0x7f4234004010 comp rx=0 tx=0).stop
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 shutdown_connections
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42300382d0 0x7f423003a780 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 --2- 192.168.123.104:0/1574706423 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4244107f40 0x7f424419c000 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.849+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 >> 192.168.123.104:0/1574706423 conn(0x7f4244103770 msgr2=0x7f4244105f60 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.850+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 shutdown_connections
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:00.850+0000 7f424bffb700 1 -- 192.168.123.104:0/1574706423 wait complete.
2026-03-10T06:17:00.908 INFO:teuthology.orchestra.run.vm04.stdout:Generating ssh key...
2026-03-10T06:17:01.417 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: [10/Mar/2026:06:16:59] ENGINE Bus STARTING
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: [10/Mar/2026:06:16:59] ENGINE Serving on https://192.168.123.104:7150
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: [10/Mar/2026:06:16:59] ENGINE Serving on http://192.168.123.104:8765
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: [10/Mar/2026:06:16:59] ENGINE Bus STARTED
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:17:01.418 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:01 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.029+0000 7f7825d00700 1 Processor -- start
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.029+0000 7f7825d00700 1 -- start start
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.029+0000 7f7825d00700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.029+0000 7f7825d00700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7820108890 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.030+0000 7f781f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.030+0000 7f781f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46684/0 (socket says 192.168.123.104:46684)
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.030+0000 7f781f7fe700 1 -- 192.168.123.104:0/2717010787 learned_addr learned my addr 192.168.123.104:0/2717010787 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.030+0000 7f781f7fe700 1 -- 192.168.123.104:0/2717010787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78201089d0 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.030+0000 7f781f7fe700 1 --2- 192.168.123.104:0/2717010787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f7810009cf0 tx=0x7f781000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=18a1f449fed5a88 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.031+0000 7f781effd700 1 -- 192.168.123.104:0/2717010787 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7810004030 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.031+0000 7f781effd700 1 -- 192.168.123.104:0/2717010787 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f781000b810 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.031+0000 7f7825d00700 1 -- 192.168.123.104:0/2717010787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 msgr2=0x7f7820108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.031+0000 7f7825d00700 1 --2- 192.168.123.104:0/2717010787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f7810009cf0 tx=0x7f781000b0e0 comp rx=0 tx=0).stop
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 -- 192.168.123.104:0/2717010787 shutdown_connections
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 --2- 192.168.123.104:0/2717010787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f7820108350 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 -- 192.168.123.104:0/2717010787 >> 192.168.123.104:0/2717010787 conn(0x7f7820103770 msgr2=0x7f7820105b50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 -- 192.168.123.104:0/2717010787 shutdown_connections
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 -- 192.168.123.104:0/2717010787 wait complete.
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 Processor -- start
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.032+0000 7f7825d00700 1 -- start start
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f7825d00700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f781f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f781f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46698/0 (socket says 192.168.123.104:46698)
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f781f7fe700 1 -- 192.168.123.104:0/3610738988 learned_addr learned my addr 192.168.123.104:0/3610738988 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7820108890 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f781f7fe700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7810009740 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.033+0000 7f781f7fe700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7810003ee0 tx=0x7f7810003fc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.034+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7810004320 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.034+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f782007f0b0 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.034+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f782007b5d0 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.035+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f782004f9e0 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.036+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f7810004480 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.036+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f781001a560 con 0x7f7820107f40
2026-03-10T06:17:01.457 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.036+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f781001a780 con 0x7f7820107f40
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.036+0000 7f781d7fa700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 0x7f780c03a520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.036+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f781001e070 con 0x7f7820107f40
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.038+0000 7f7817fff700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 0x7f780c03a520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.038+0000 7f7817fff700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 0x7f780c03a520 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7808006fd0 tx=0x7f7808006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.038+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f781001aa30 con 0x7f7820107f40
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.148+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f782007be90 con 0x7f780c038070
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.424+0000 7f781d7fa700 1 -- 192.168.123.104:0/3610738988 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f782007be90 con 0x7f780c038070
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 msgr2=0x7f780c03a520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 0x7f780c03a520 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7808006fd0 tx=0x7f7808006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 msgr2=0x7f782007eb70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7810003ee0 tx=0x7f7810003fc0 comp rx=0 tx=0).stop
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 shutdown_connections
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f780c038070 0x7f780c03a520 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 --2- 192.168.123.104:0/3610738988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7820107f40 0x7f782007eb70 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 >> 192.168.123.104:0/3610738988 conn(0x7f7820103770 msgr2=0x7f7820105500 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 shutdown_connections
2026-03-10T06:17:01.458 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.427+0000 7f7825d00700 1 -- 192.168.123.104:0/3610738988 wait complete.
2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDB23dv+HsXxEfOHeT6j+84qNSP4Y7ONSep6uvtL71jND6dXQ8eR/GSMTGbbmy8ec0MPvVSTBrriUxdPW3Kqh4kl30X+Y2GuZdTR5dZGB5/gO5eVOg6al1oJpRs/cPIPfSxNlcFUXrRW9VHfuGY5fONK9hqFzPSK1Q/9jan/WHO7mag42yyp5ZH4kE8FHxHTjKTlgeCajuBfOUD+gXWxmoQBbICcKi+IO3+EzQTRh7oBqTD6FHZYWA/tq1uN1M2ZurBcUQOhnRVcKA17m49fIyYdh1JtGdydZ7sO59iSzw736+LDth0UxfwItidMKFhCtSGTJuCp+L0dk1Lzq+B6Bq2h1Id83lqkzCxMgn0H80e0+d8Wd61tCqaCkXG+U3xdfJKJFtn0EfGBeoFmdxrgYceqFKoS/sK107w1vLODuFqUswWc2GRYO878c9i1AgU1vc3PjDvbfgh8ufq4cjleARoqqJLoqVGKJ4vKZzen32DOvJXQ8aQaYESMdv7hUbsiyE= ceph-9c59102a-1c48-11f1-b618-035af535377d
2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.726+0000 7f48de349700 1 Processor -- start
2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48de349700 1 -- start start
2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48de349700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48d7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48de349700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48d0095b00 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48d7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46702/0 (socket says 192.168.123.104:46702) 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.727+0000 7f48d7fff700 1 -- 192.168.123.104:0/894355704 learned_addr learned my addr 192.168.123.104:0/894355704 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.728+0000 7f48d7fff700 1 -- 192.168.123.104:0/894355704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48d0096320 con 0x7f48d0095120 2026-03-10T06:17:01.924 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.728+0000 7f48d7fff700 1 --2- 192.168.123.104:0/894355704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f48cc009a90 tx=0x7f48cc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=78f99e51342d3c65 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.728+0000 7f48d6ffd700 1 -- 192.168.123.104:0/894355704 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48cc004030 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.728+0000 7f48d6ffd700 1 -- 192.168.123.104:0/894355704 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f48cc00b7e0 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.728+0000 7f48d6ffd700 1 -- 192.168.123.104:0/894355704 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48cc003a40 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 -- 192.168.123.104:0/894355704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 msgr2=0x7f48d0095530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 --2- 192.168.123.104:0/894355704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f48cc009a90 tx=0x7f48cc009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:01.924 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 -- 192.168.123.104:0/894355704 shutdown_connections 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 --2- 192.168.123.104:0/894355704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0095530 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 -- 192.168.123.104:0/894355704 >> 192.168.123.104:0/894355704 conn(0x7f48d00906d0 msgr2=0x7f48d0092b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 -- 192.168.123.104:0/894355704 shutdown_connections 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.729+0000 7f48de349700 1 -- 192.168.123.104:0/894355704 wait complete. 
2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48de349700 1 Processor -- start 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48de349700 1 -- start start 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48de349700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48de349700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48d0129060 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48d7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48d7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46716/0 (socket says 192.168.123.104:46716) 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48d7fff700 1 -- 192.168.123.104:0/2313780212 learned_addr learned my addr 192.168.123.104:0/2313780212 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:01.924 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.730+0000 7f48d7fff700 1 -- 192.168.123.104:0/2313780212 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48cc009740 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.731+0000 7f48d7fff700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f48cc003c10 tx=0x7f48cc003d60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.731+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48cc004350 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.731+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48d0129260 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.731+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48d0129700 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.732+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f48cc0044b0 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.732+0000 7f48d57fa700 1 
-- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f48cc01a770 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.732+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f48cc01a9e0 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.732+0000 7f48d57fa700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 0x7f48c803a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.732+0000 7f48d77fe700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 0x7f48c803a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.733+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f48cc04d380 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.733+0000 7f48d77fe700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 0x7f48c803a970 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f48c4006fd0 tx=0x7f48c4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:01.733+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f48d00052d0 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.737+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f48cc011420 con 0x7f48d0095120 2026-03-10T06:17:01.924 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.844+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f48d012c180 con 0x7f48c80384c0 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.845+0000 7f48d57fa700 1 -- 192.168.123.104:0/2313780212 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f48d012c180 con 0x7f48c80384c0 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 msgr2=0x7f48c803a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 0x7f48c803a970 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f48c4006fd0 tx=0x7f48c4006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:01.848+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 msgr2=0x7f48d0128b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f48cc003c10 tx=0x7f48cc003d60 comp rx=0 tx=0).stop 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 shutdown_connections 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f48c80384c0 0x7f48c803a970 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.848+0000 7f48de349700 1 --2- 192.168.123.104:0/2313780212 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48d0095120 0x7f48d0128b20 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.849+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 >> 192.168.123.104:0/2313780212 conn(0x7f48d00906d0 msgr2=0x7f48d0091090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.849+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 shutdown_connections 2026-03-10T06:17:01.925 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:01.849+0000 7f48de349700 1 -- 192.168.123.104:0/2313780212 wait complete. 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T06:17:01.925 INFO:teuthology.orchestra.run.vm04.stdout:Adding host vm04... 2026-03-10T06:17:02.781 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: Generating ssh key... 
2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:02.782 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:02 vm04 ceph-mon[51058]: mgrmap e8: vm04.exdvdb(active, since 2s) 2026-03-10T06:17:03.861 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:03 vm04 ceph-mon[51058]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:03.861 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:03 vm04 ceph-mon[51058]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm04", "addr": "192.168.123.104", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:03.861 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:03 vm04 ceph-mon[51058]: Deploying cephadm binary to vm04 2026-03-10T06:17:03.928 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Added host 'vm04' with addr '192.168.123.104' 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.056+0000 7f1199499700 1 Processor -- start 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1199499700 1 -- start start 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1199499700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1199499700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f11940745b0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1192ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1192ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46722/0 (socket says 192.168.123.104:46722) 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.057+0000 7f1192ffd700 1 -- 192.168.123.104:0/3930588262 learned_addr learned my addr 192.168.123.104:0/3930588262 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.058+0000 7f1192ffd700 1 -- 192.168.123.104:0/3930588262 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f11940746f0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.058+0000 7f1192ffd700 1 --2- 192.168.123.104:0/3930588262 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f117c009a90 tx=0x7f117c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=608b7a4f47e9731c server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:03.929 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.058+0000 7f1191ffb700 1 -- 192.168.123.104:0/3930588262 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f117c004030 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1191ffb700 1 -- 192.168.123.104:0/3930588262 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f117c00b7e0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1191ffb700 1 -- 192.168.123.104:0/3930588262 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f117c003a40 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1199499700 1 -- 192.168.123.104:0/3930588262 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 msgr2=0x7f1194108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1199499700 1 --2- 192.168.123.104:0/3930588262 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f117c009a90 tx=0x7f117c009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1199499700 1 -- 192.168.123.104:0/3930588262 shutdown_connections 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1199499700 1 --2- 192.168.123.104:0/3930588262 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194108c10 unknown :-1 s=CLOSED pgs=45 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.059+0000 7f1199499700 1 -- 192.168.123.104:0/3930588262 >> 192.168.123.104:0/3930588262 conn(0x7f1194100270 msgr2=0x7f11941026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.060+0000 7f1199499700 1 -- 192.168.123.104:0/3930588262 shutdown_connections 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.060+0000 7f1199499700 1 -- 192.168.123.104:0/3930588262 wait complete. 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.060+0000 7f1199499700 1 Processor -- start 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.060+0000 7f1199499700 1 -- start start 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1199499700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1199499700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f11941066d0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1192ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:03.929 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1192ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:46738/0 (socket says 192.168.123.104:46738) 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1192ffd700 1 -- 192.168.123.104:0/2366751162 learned_addr learned my addr 192.168.123.104:0/2366751162 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1192ffd700 1 -- 192.168.123.104:0/2366751162 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f117c009740 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.061+0000 7f1192ffd700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f117c00bef0 tx=0x7f117c003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f117c004140 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f117c0042a0 con 0x7f1194106830 
2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f117c011440 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f11941047e0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1194104c80 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.062+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f117c0115a0 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.063+0000 7f118bfff700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 0x7f118003a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.063+0000 7f11927fc700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 0x7f118003a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:02.064+0000 7f11927fc700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 0x7f118003a970 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f1184006fd0 tx=0x7f1184006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.064+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f117c04d120 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.064+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f119404fa50 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.067+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f117c02b430 con 0x7f1194106830 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:02.173+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm04", "addr": "192.168.123.104", "target": ["mon-mgr", ""]}) v1 -- 0x7f11941054f0 con 0x7f11800384c0 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.856+0000 7f118bfff700 1 -- 192.168.123.104:0/2366751162 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f11941054f0 con 0x7f11800384c0 
2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.860+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 msgr2=0x7f118003a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.860+0000 7f1199499700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 0x7f118003a970 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f1184006fd0 tx=0x7f1184006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.860+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 msgr2=0x7f1194106190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.860+0000 7f1199499700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f117c00bef0 tx=0x7f117c003b40 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.861+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 shutdown_connections 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.861+0000 7f1199499700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11800384c0 0x7f118003a970 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:03.861+0000 7f1199499700 1 --2- 192.168.123.104:0/2366751162 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1194106830 0x7f1194106190 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.861+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 >> 192.168.123.104:0/2366751162 conn(0x7f1194100270 msgr2=0x7f1194100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.861+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 shutdown_connections 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:03.861+0000 7f1199499700 1 -- 192.168.123.104:0/2366751162 wait complete. 2026-03-10T06:17:03.929 INFO:teuthology.orchestra.run.vm04.stdout:Deploying mon service with default placement... 2026-03-10T06:17:04.255 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled mon update... 
2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe11d21a700 1 Processor -- start 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe11d21a700 1 -- start start 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe11d21a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe11d21a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe118072c90 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe116d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe116d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60606/0 (socket says 192.168.123.104:60606) 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.075+0000 7fe116d9d700 1 -- 192.168.123.104:0/2763202336 learned_addr learned my addr 192.168.123.104:0/2763202336 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.256 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.076+0000 7fe116d9d700 1 -- 192.168.123.104:0/2763202336 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe118072dd0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe116d9d700 1 --2- 192.168.123.104:0/2763202336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fe10800ac30 tx=0x7fe108010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=90397465972df552 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe115d9b700 1 -- 192.168.123.104:0/2763202336 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe108010d40 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe115d9b700 1 -- 192.168.123.104:0/2763202336 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe1080044c0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe115d9b700 1 -- 192.168.123.104:0/2763202336 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe10801a5c0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe11d21a700 1 -- 192.168.123.104:0/2763202336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 msgr2=0x7fe1180715a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:04.077+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2763202336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fe10800ac30 tx=0x7fe108010730 comp rx=0 tx=0).stop 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe11d21a700 1 -- 192.168.123.104:0/2763202336 shutdown_connections 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2763202336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180715a0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.077+0000 7fe11d21a700 1 -- 192.168.123.104:0/2763202336 >> 192.168.123.104:0/2763202336 conn(0x7fe11806cc30 msgr2=0x7fe11806f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 -- 192.168.123.104:0/2763202336 shutdown_connections 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 -- 192.168.123.104:0/2763202336 wait complete. 
2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 Processor -- start 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 -- start start 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe11d21a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe118089b40 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe116d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe116d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60618/0 (socket says 192.168.123.104:60618) 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.078+0000 7fe116d9d700 1 -- 192.168.123.104:0/2778120801 learned_addr learned my addr 192.168.123.104:0/2778120801 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.256 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe116d9d700 1 -- 192.168.123.104:0/2778120801 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe10800a8e0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe116d9d700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe108003da0 tx=0x7fe108003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe108003fc0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe108004120 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe118089d40 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe1080096f0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.079+0000 7fe11d21a700 1 
-- 192.168.123.104:0/2778120801 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe118086d30 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.080+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe11804f9e0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.081+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fe108018070 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.081+0000 7fe0fffff700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 0x7fe10003a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.083+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fe10804feb0 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.088+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe10801e020 con 0x7fe118071190 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.088+0000 7fe11659c700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 
0x7fe10003a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.256 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.088+0000 7fe11659c700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 0x7fe10003a970 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fe10c006fd0 tx=0x7fe10c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.206+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7fe11806f0a0 con 0x7fe1000384c0 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.212+0000 7fe0fffff700 1 -- 192.168.123.104:0/2778120801 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fe11806f0a0 con 0x7fe1000384c0 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 msgr2=0x7fe10003a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 0x7fe10003a970 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fe10c006fd0 tx=0x7fe10c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:04.257 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 msgr2=0x7fe1180865a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe108003da0 tx=0x7fe108003980 comp rx=0 tx=0).stop 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 shutdown_connections 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe1000384c0 0x7fe10003a970 secure :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fe10c006fd0 tx=0x7fe10c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 --2- 192.168.123.104:0/2778120801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe118071190 0x7fe1180865a0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.216+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 >> 192.168.123.104:0/2778120801 conn(0x7fe11806cc30 msgr2=0x7fe11806e990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.217+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 
shutdown_connections 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.217+0000 7fe11d21a700 1 -- 192.168.123.104:0/2778120801 wait complete. 2026-03-10T06:17:04.257 INFO:teuthology.orchestra.run.vm04.stdout:Deploying mgr service with default placement... 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.427+0000 7f58f73b0700 1 Processor -- start 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.427+0000 7f58f73b0700 1 -- start start 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.427+0000 7f58f73b0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.427+0000 7f58f73b0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58f00728b0 con 0x7f58f0071200 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.428+0000 7f58f514c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.428+0000 7f58f514c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60622/0 (socket says 192.168.123.104:60622) 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.428+0000 7f58f514c700 1 -- 192.168.123.104:0/3135517865 learned_addr learned my addr 192.168.123.104:0/3135517865 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.428+0000 7f58f514c700 1 -- 192.168.123.104:0/3135517865 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58f00729f0 con 0x7f58f0071200 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f514c700 1 --2- 192.168.123.104:0/3135517865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f58ec009a90 tx=0x7f58ec009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=cd8f6553cc50266 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58e7fff700 1 -- 192.168.123.104:0/3135517865 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f58ec004030 con 0x7f58f0071200 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58e7fff700 1 -- 192.168.123.104:0/3135517865 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f58ec00b7e0 con 0x7f58f0071200 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f73b0700 1 -- 192.168.123.104:0/3135517865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 msgr2=0x7f58f0071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58e7fff700 1 -- 192.168.123.104:0/3135517865 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f58ec004030 con 0x7f58f0071200 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f73b0700 1 --2- 192.168.123.104:0/3135517865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f58ec009a90 tx=0x7f58ec009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f73b0700 1 -- 192.168.123.104:0/3135517865 shutdown_connections 2026-03-10T06:17:04.609 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f73b0700 1 --2- 192.168.123.104:0/3135517865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f0071610 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.610 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.429+0000 7f58f73b0700 1 -- 192.168.123.104:0/3135517865 >> 192.168.123.104:0/3135517865 conn(0x7f58f006cc30 msgr2=0x7f58f006f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.430+0000 7f58f73b0700 1 -- 192.168.123.104:0/3135517865 shutdown_connections 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.430+0000 7f58f73b0700 1 -- 192.168.123.104:0/3135517865 wait complete. 
2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f73b0700 1 Processor -- start 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f73b0700 1 -- start start 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f73b0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f73b0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58f01a8eb0 con 0x7f58f0071200 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f514c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f514c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60638/0 (socket says 192.168.123.104:60638) 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.431+0000 7f58f514c700 1 -- 192.168.123.104:0/2786419883 learned_addr learned my addr 192.168.123.104:0/2786419883 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.612 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58f514c700 1 -- 192.168.123.104:0/2786419883 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58ec009740 con 0x7f58f0071200 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58f514c700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f58ec009710 tx=0x7f58ec00bf80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f58ec0040b0 con 0x7f58f0071200 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f58ec004210 con 0x7f58f0071200 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f58ec0114c0 con 0x7f58f0071200 2026-03-10T06:17:04.612 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58f73b0700 1 -- 192.168.123.104:0/2786419883 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58f01a90b0 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.432+0000 7f58f73b0700 1 
-- 192.168.123.104:0/2786419883 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58f01a9550 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.434+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f58ec011620 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.434+0000 7f58e67fc700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 0x7f58dc03a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.434+0000 7f58f494b700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 0x7f58dc03a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.434+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f58ec04cc10 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.435+0000 7f58f494b700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 0x7f58dc03a960 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f58e800ad80 tx=0x7f58e80093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: 
stderr 2026-03-10T06:17:04.435+0000 7f58f73b0700 1 -- 192.168.123.104:0/2786419883 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f58f01a30a0 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.438+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f58ec01e070 con 0x7f58f0071200 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.552+0000 7f58f73b0700 1 -- 192.168.123.104:0/2786419883 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f58f0061190 con 0x7f58dc0384b0 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.560+0000 7f58e67fc700 1 -- 192.168.123.104:0/2786419883 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f58f0061190 con 0x7f58dc0384b0 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 msgr2=0x7f58dc03a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 0x7f58dc03a960 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f58e800ad80 tx=0x7f58e80093f0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.613 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 msgr2=0x7f58f01a8970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f58ec009710 tx=0x7f58ec00bf80 comp rx=0 tx=0).stop 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 shutdown_connections 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f58dc0384b0 0x7f58dc03a960 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 --2- 192.168.123.104:0/2786419883 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58f0071200 0x7f58f01a8970 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 >> 192.168.123.104:0/2786419883 conn(0x7f58f006cc30 msgr2=0x7f58f01130c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 shutdown_connections 
2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.565+0000 7f58dbfff700 1 -- 192.168.123.104:0/2786419883 wait complete. 2026-03-10T06:17:04.613 INFO:teuthology.orchestra.run.vm04.stdout:Deploying crash service with default placement... 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806b377700 1 Processor -- start 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806b377700 1 -- start start 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806b377700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806b377700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8064071d60 con 0x7f8064071410 2026-03-10T06:17:04.957 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806a375700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806a375700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:60644/0 (socket says 192.168.123.104:60644) 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.767+0000 7f806a375700 1 -- 192.168.123.104:0/2062780171 learned_addr learned my addr 192.168.123.104:0/2062780171 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f806a375700 1 -- 192.168.123.104:0/2062780171 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8064071ea0 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f806a375700 1 --2- 192.168.123.104:0/2062780171 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f805c009b80 tx=0x7f805c009e90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=10c1d7d6897b7a58 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f8069373700 1 -- 192.168.123.104:0/2062780171 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f805c00dc30 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f8069373700 1 -- 192.168.123.104:0/2062780171 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f805c00dd90 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f806b377700 1 -- 192.168.123.104:0/2062780171 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 msgr2=0x7f8064071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.768+0000 7f806b377700 1 --2- 192.168.123.104:0/2062780171 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f805c009b80 tx=0x7f805c009e90 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- 192.168.123.104:0/2062780171 shutdown_connections 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 --2- 192.168.123.104:0/2062780171 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064071820 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- 192.168.123.104:0/2062780171 >> 192.168.123.104:0/2062780171 conn(0x7f806406c9d0 msgr2=0x7f806406ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- 192.168.123.104:0/2062780171 shutdown_connections 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- 192.168.123.104:0/2062780171 wait complete. 
2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 Processor -- start 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- start start 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.769+0000 7f806b377700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f805c004820 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806a375700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806a375700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60652/0 (socket says 192.168.123.104:60652) 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806a375700 1 -- 192.168.123.104:0/2560549558 learned_addr learned my addr 192.168.123.104:0/2560549558 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:04.958 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806a375700 1 -- 192.168.123.104:0/2560549558 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f805c009830 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806a375700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f805c00d980 tx=0x7f805c015950 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f805c00dc30 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f805c0043a0 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8064110ce0 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80641111e0 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.770+0000 7f805b7fe700 1 
-- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f805c00cb00 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.773+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f805c00cc60 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.773+0000 7f805b7fe700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 0x7f805003a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.773+0000 7f8069b74700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 0x7f805003a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.773+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f805c04c120 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.774+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8048005320 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.777+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mon.0 v2:192.168.123.104:3300/0 
6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f805c024020 con 0x7f8064071410 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.777+0000 7f8069b74700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 0x7f805003a910 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f8054009940 tx=0x7f8054006e30 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.898+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f8048000bf0 con 0x7f8050038460 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.907+0000 7f805b7fe700 1 -- 192.168.123.104:0/2560549558 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f8048000bf0 con 0x7f8050038460 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 msgr2=0x7f805003a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 0x7f805003a910 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f8054009940 tx=0x7f8054006e30 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 msgr2=0x7f8064112690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f805c00d980 tx=0x7f805c015950 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 shutdown_connections 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8050038460 0x7f805003a910 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 --2- 192.168.123.104:0/2560549558 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8064071410 0x7f8064112690 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 >> 192.168.123.104:0/2560549558 conn(0x7f806406c9d0 msgr2=0x7f806406d6e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 shutdown_connections 
2026-03-10T06:17:04.958 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:04.913+0000 7f806b377700 1 -- 192.168.123.104:0/2560549558 wait complete. 2026-03-10T06:17:04.959 INFO:teuthology.orchestra.run.vm04.stdout:Deploying ceph-exporter service with default placement... 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: Added host vm04 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: Saving service mon spec with placement count:5 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:05.123 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:04 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f8c5a700 1 Processor -- start 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f8c5a700 1 -- start start 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f8c5a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f8c5a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7f4071500 con 0x7fc7f4072ac0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f37fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f37fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60658/0 (socket says 192.168.123.104:60658) 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.192+0000 7fc7f37fe700 1 -- 192.168.123.104:0/3747904896 learned_addr learned my addr 192.168.123.104:0/3747904896 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:05.371 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f37fe700 1 -- 192.168.123.104:0/3747904896 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc7f4071640 con 0x7fc7f4072ac0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f37fe700 1 --2- 192.168.123.104:0/3747904896 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fc7e40098d0 tx=0x7fc7e4009be0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=634d2fc9b3c59ad9 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f27fc700 1 -- 192.168.123.104:0/3747904896 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc7e4004030 con 0x7fc7f4072ac0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f27fc700 1 -- 192.168.123.104:0/3747904896 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc7e400c8f0 con 0x7fc7f4072ac0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3747904896 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 msgr2=0x7fc7f4070fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.193+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3747904896 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fc7e40098d0 tx=0x7fc7e4009be0 comp rx=0 tx=0).stop 
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3747904896 shutdown_connections 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3747904896 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f4072ac0 0x7fc7f4070fc0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3747904896 >> 192.168.123.104:0/3747904896 conn(0x7fc7f406c9d0 msgr2=0x7fc7f406ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3747904896 shutdown_connections 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3747904896 wait complete. 
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 Processor -- start 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- start start 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f8c5a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7e4013070 con 0x7fc7f407e220 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f37fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f37fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60674/0 (socket says 192.168.123.104:60674) 2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.194+0000 7fc7f37fe700 1 -- 192.168.123.104:0/3073389771 learned_addr learned my addr 192.168.123.104:0/3073389771 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:05.371 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f37fe700 1 -- 192.168.123.104:0/3073389771 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc7e4009580 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f37fe700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc7e400c1c0 tx=0x7fc7e40041c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc7e40046a0 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc7f407eb70 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc7f4081210 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc7e40039f0 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.195+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc7e40203f0 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.196+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fc7e4018960 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.196+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc7e0005320 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.196+0000 7fc7f0ff9700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 0x7fc7dc03a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.196+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fc7e404b9c0 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.196+0000 7fc7f2ffd700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 0x7fc7dc03a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.197+0000 7fc7f2ffd700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 0x7fc7dc03a910 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fc7ec00ad30 tx=0x7fc7ec0093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.199+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc7e40183f0 con 0x7fc7f407e220
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.318+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7fc7e0000bf0 con 0x7fc7dc038460
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.323+0000 7fc7f0ff9700 1 -- 192.168.123.104:0/3073389771 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7fc7e0000bf0 con 0x7fc7dc038460
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 msgr2=0x7fc7dc03a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:05.371 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 0x7fc7dc03a910 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fc7ec00ad30 tx=0x7fc7ec0093f0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 msgr2=0x7fc7f407e630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc7e400c1c0 tx=0x7fc7e40041c0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 shutdown_connections
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc7dc038460 0x7fc7dc03a910 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 --2- 192.168.123.104:0/3073389771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc7f407e220 0x7fc7f407e630 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 >> 192.168.123.104:0/3073389771 conn(0x7fc7f406c9d0 msgr2=0x7fc7f406d3d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 shutdown_connections
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.326+0000 7fc7f8c5a700 1 -- 192.168.123.104:0/3073389771 wait complete.
2026-03-10T06:17:05.372 INFO:teuthology.orchestra.run.vm04.stdout:Deploying prometheus service with default placement...
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled prometheus update...
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.548+0000 7fafdd1d5700 1 Processor -- start
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.550+0000 7fafdd1d5700 1 -- start start
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.550+0000 7fafdd1d5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.550+0000 7fafdd1d5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafd80716d0 con 0x7fafd8072ac0
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:05.724 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60684/0 (socket says 192.168.123.104:60684)
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd7fff700 1 -- 192.168.123.104:0/1244122887 learned_addr learned my addr 192.168.123.104:0/1244122887 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd7fff700 1 -- 192.168.123.104:0/1244122887 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafd8071810 con 0x7fafd8072ac0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd7fff700 1 --2- 192.168.123.104:0/1244122887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fafc8009a90 tx=0x7fafc8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8ae60e6cdfe44c87 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd6ffd700 1 -- 192.168.123.104:0/1244122887 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafc8004030 con 0x7fafd8072ac0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.551+0000 7fafd6ffd700 1 -- 192.168.123.104:0/1244122887 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fafc800b7e0 con 0x7fafd8072ac0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.552+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1244122887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 msgr2=0x7fafd8071190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.552+0000 7fafdd1d5700 1 --2- 192.168.123.104:0/1244122887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fafc8009a90 tx=0x7fafc8009da0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.552+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1244122887 shutdown_connections
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.552+0000 7fafdd1d5700 1 --2- 192.168.123.104:0/1244122887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd8072ac0 0x7fafd8071190 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.552+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1244122887 >> 192.168.123.104:0/1244122887 conn(0x7fafd806c9d0 msgr2=0x7fafd806ee00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.554+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1244122887 shutdown_connections
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.554+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1244122887 wait complete.
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafdd1d5700 1 Processor -- start
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafdd1d5700 1 -- start start
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafdd1d5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafdd1d5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafd80716d0 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafd7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafd7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60698/0 (socket says 192.168.123.104:60698)
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafd7fff700 1 -- 192.168.123.104:0/1272626003 learned_addr learned my addr 192.168.123.104:0/1272626003 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafd7fff700 1 -- 192.168.123.104:0/1272626003 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafc8009740 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.555+0000 7fafd7fff700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fafc800bdb0 tx=0x7fafc800be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafc8003750 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1272626003 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fafd81a9140 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1272626003 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fafd807b000 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fafc8004440 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafc801acb0 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.556+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fafc8011420 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.557+0000 7fafd57fa700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 0x7fafc003a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.557+0000 7fafd77fe700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 0x7fafc003a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.557+0000 7fafd77fe700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 0x7fafc003a990 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fafcc009990 tx=0x7fafcc006e30 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.557+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fafc804c910 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.558+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1272626003 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fafc4005320 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.561+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fafc8010970 con 0x7fafd81a87f0
2026-03-10T06:17:05.725 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.675+0000 7fafdd1d5700 1 -- 192.168.123.104:0/1272626003 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fafc4000bf0 con 0x7fafc00384e0
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.679+0000 7fafd57fa700 1 -- 192.168.123.104:0/1272626003 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fafc4000bf0 con 0x7fafc00384e0
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.682+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 msgr2=0x7fafc003a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.682+0000 7fafbeffd700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 0x7fafc003a990 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fafcc009990 tx=0x7fafcc006e30 comp rx=0 tx=0).stop
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 msgr2=0x7fafd81a8c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fafc800bdb0 tx=0x7fafc800be90 comp rx=0 tx=0).stop
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 shutdown_connections
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fafc00384e0 0x7fafc003a990 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 --2- 192.168.123.104:0/1272626003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafd81a87f0 0x7fafd81a8c00 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 >> 192.168.123.104:0/1272626003 conn(0x7fafd806c9d0 msgr2=0x7fafd806d590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 shutdown_connections
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.683+0000 7fafbeffd700 1 -- 192.168.123.104:0/1272626003 wait complete.
2026-03-10T06:17:05.726 INFO:teuthology.orchestra.run.vm04.stdout:Deploying grafana service with default placement...
2026-03-10T06:17:05.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:17:05.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: Saving service mgr spec with placement count:2
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: Saving service crash spec with placement *
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: Saving service ceph-exporter spec with placement *
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:17:05.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:05 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled grafana update...
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.860+0000 7f629c1ce700 1 Processor -- start
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f629c1ce700 1 -- start start
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f629c1ce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f629c1ce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6294104960 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f6299f6a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f6299f6a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60714/0 (socket says 192.168.123.104:60714)
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.861+0000 7f6299f6a700 1 -- 192.168.123.104:0/252961670 learned_addr learned my addr 192.168.123.104:0/252961670 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.862+0000 7f6299f6a700 1 -- 192.168.123.104:0/252961670 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6294104aa0 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.862+0000 7f6299f6a700 1 --2- 192.168.123.104:0/252961670 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f6288009a90 tx=0x7f6288009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8384735c4b53bf0a server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.862+0000 7f6298f68700 1 -- 192.168.123.104:0/252961670 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f628800fbf0 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f6298f68700 1 -- 192.168.123.104:0/252961670 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6288004510 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f6298f68700 1 -- 192.168.123.104:0/252961670 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6288017450 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f629c1ce700 1 -- 192.168.123.104:0/252961670 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 msgr2=0x7f6294104420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f629c1ce700 1 --2- 192.168.123.104:0/252961670 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f6288009a90 tx=0x7f6288009da0 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f629c1ce700 1 -- 192.168.123.104:0/252961670 shutdown_connections
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f629c1ce700 1 --2- 192.168.123.104:0/252961670 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f6294104420 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.863+0000 7f629c1ce700 1 -- 192.168.123.104:0/252961670 >> 192.168.123.104:0/252961670 conn(0x7f62940ffa10 msgr2=0x7f6294101e20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 -- 192.168.123.104:0/252961670 shutdown_connections
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 -- 192.168.123.104:0/252961670 wait complete.
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 Processor -- start
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 -- start start
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.864+0000 7f629c1ce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6294197df0 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6299f6a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6299f6a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60722/0 (socket says 192.168.123.104:60722)
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6299f6a700 1 -- 192.168.123.104:0/875611446 learned_addr learned my addr 192.168.123.104:0/875611446 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6299f6a700 1 -- 192.168.123.104:0/875611446 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6288009740 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6299f6a700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f6288009710 tx=0x7f62880040e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6288017450 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6288017bb0 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6294197ff0 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.865+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6294198490 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.866+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6288020c10 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.866+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f628801e070 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.866+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6294191730 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.868+0000 7f6286ffd700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 0x7f628003a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.868+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6288051050 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.869+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6288016e10 con 0x7f6294104010
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.869+0000 7f6299769700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 0x7f628003a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:05.871+0000 7f6299769700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 0x7f628003a9b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f6290006fd0 tx=0x7f6290006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.019+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f6294061190 con 0x7f6280038500
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.028+0000 7f6286ffd700 1 -- 192.168.123.104:0/875611446 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f6294061190 con 0x7f6280038500
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 msgr2=0x7f628003a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 0x7f628003a9b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f6290006fd0 tx=0x7f6290006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 msgr2=0x7f62941978b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f6288009710 tx=0x7f62880040e0 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 shutdown_connections
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6280038500 0x7f628003a9b0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 --2- 192.168.123.104:0/875611446 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6294104010 0x7f62941978b0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 >> 192.168.123.104:0/875611446 conn(0x7f62940ffa10 msgr2=0x7f6294101230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 shutdown_connections
2026-03-10T06:17:06.084 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph:
stderr 2026-03-10T06:17:06.032+0000 7f629c1ce700 1 -- 192.168.123.104:0/875611446 wait complete. 2026-03-10T06:17:06.085 INFO:teuthology.orchestra.run.vm04.stdout:Deploying node-exporter service with default placement... 2026-03-10T06:17:06.389 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-10T06:17:06.389 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.219+0000 7f053e5de700 1 Processor -- start 2026-03-10T06:17:06.389 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.219+0000 7f053e5de700 1 -- start start 2026-03-10T06:17:06.389 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f053e5de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f053e5de700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0538106550 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f0537fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f0537fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60736/0 (socket says 
192.168.123.104:60736) 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f0537fff700 1 -- 192.168.123.104:0/616189575 learned_addr learned my addr 192.168.123.104:0/616189575 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.220+0000 7f0537fff700 1 -- 192.168.123.104:0/616189575 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0538106690 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.221+0000 7f0537fff700 1 --2- 192.168.123.104:0/616189575 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f0528009a90 tx=0x7f0528009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=db242e4e5c2aff77 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f0536ffd700 1 -- 192.168.123.104:0/616189575 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0528004030 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f0536ffd700 1 -- 192.168.123.104:0/616189575 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f052800b7e0 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f0536ffd700 1 -- 192.168.123.104:0/616189575 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0528003ae0 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:06.222+0000 7f053e5de700 1 -- 192.168.123.104:0/616189575 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 msgr2=0x7f0538106010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f053e5de700 1 --2- 192.168.123.104:0/616189575 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f0528009a90 tx=0x7f0528009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f053e5de700 1 -- 192.168.123.104:0/616189575 shutdown_connections 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f053e5de700 1 --2- 192.168.123.104:0/616189575 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f0538106010 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.222+0000 7f053e5de700 1 -- 192.168.123.104:0/616189575 >> 192.168.123.104:0/616189575 conn(0x7f0538101250 msgr2=0x7f0538103680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 -- 192.168.123.104:0/616189575 shutdown_connections 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 -- 192.168.123.104:0/616189575 wait complete. 
2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 Processor -- start 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 -- start start 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.223+0000 7f053e5de700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f053807c1b0 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.224+0000 7f0537fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.224+0000 7f0537fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60740/0 (socket says 192.168.123.104:60740) 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.224+0000 7f0537fff700 1 -- 192.168.123.104:0/4146989505 learned_addr learned my addr 192.168.123.104:0/4146989505 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:06.390 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.224+0000 7f0537fff700 1 -- 192.168.123.104:0/4146989505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0528009740 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.224+0000 7f0537fff700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f0528000c00 tx=0x7f052800bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.225+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05280041a0 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.225+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f053807a2c0 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.225+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f053807a760 con 0x7f0538105c00 2026-03-10T06:17:06.390 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.226+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0528004300 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.226+0000 7f05357fa700 1 
-- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0528011550 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.226+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f0528011770 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.226+0000 7f05357fa700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 0x7f052003aa80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.227+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f053804efc0 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.227+0000 7f05377fe700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 0x7f052003aa80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.230+0000 7f05377fe700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 0x7f052003aa80 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f052c006fd0 tx=0x7f052c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.391 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.230+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f052804d210 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.230+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f05280531d0 con 0x7f0538105c00 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.347+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f053807b060 con 0x7f05200385d0 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.352+0000 7f05357fa700 1 -- 192.168.123.104:0/4146989505 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f053807b060 con 0x7f05200385d0 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 msgr2=0x7f052003aa80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 0x7f052003aa80 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f052c006fd0 tx=0x7f052c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:06.391 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 msgr2=0x7f053807bc70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f0528000c00 tx=0x7f052800bfa0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 shutdown_connections 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f05200385d0 0x7f052003aa80 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 --2- 192.168.123.104:0/4146989505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0538105c00 0x7f053807bc70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 >> 192.168.123.104:0/4146989505 conn(0x7f0538101250 msgr2=0x7f0538101d30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 shutdown_connections 
2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.355+0000 7f053e5de700 1 -- 192.168.123.104:0/4146989505 wait complete. 2026-03-10T06:17:06.391 INFO:teuthology.orchestra.run.vm04.stdout:Deploying alertmanager service with default placement... 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ecbd700 1 Processor -- start 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ecbd700 1 -- start start 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ecbd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ecbd700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a38071590 con 0x7f0a38072b50 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ca59700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ca59700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60746/0 (socket says 192.168.123.104:60746) 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ca59700 1 -- 192.168.123.104:0/1140861406 learned_addr learned my addr 192.168.123.104:0/1140861406 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.517+0000 7f0a3ca59700 1 -- 192.168.123.104:0/1140861406 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a380716d0 con 0x7f0a38072b50 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.518+0000 7f0a3ca59700 1 --2- 192.168.123.104:0/1140861406 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0a2c009a90 tx=0x7f0a2c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=43f135c3123d9fa8 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.518+0000 7f0a377fe700 1 -- 192.168.123.104:0/1140861406 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a2c004030 con 0x7f0a38072b50 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.518+0000 7f0a377fe700 1 -- 192.168.123.104:0/1140861406 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0a2c00b7e0 con 0x7f0a38072b50 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.518+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1140861406 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 msgr2=0x7f0a38071050 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.518+0000 7f0a3ecbd700 1 --2- 192.168.123.104:0/1140861406 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0a2c009a90 tx=0x7f0a2c009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1140861406 shutdown_connections 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 --2- 192.168.123.104:0/1140861406 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a38072b50 0x7f0a38071050 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1140861406 >> 192.168.123.104:0/1140861406 conn(0x7f0a3806c970 msgr2=0x7f0a3806eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1140861406 shutdown_connections 2026-03-10T06:17:06.689 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1140861406 wait complete. 
2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 Processor -- start 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- start start 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ecbd700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a38071590 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ca59700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ca59700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60756/0 (socket says 192.168.123.104:60756) 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.519+0000 7f0a3ca59700 1 -- 192.168.123.104:0/1464657085 learned_addr learned my addr 192.168.123.104:0/1464657085 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:06.690 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.520+0000 7f0a3ca59700 1 -- 192.168.123.104:0/1464657085 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a2c009740 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.520+0000 7f0a3ca59700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f0a2c00bdb0 tx=0x7f0a2c00be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.520+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a2c003750 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.520+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1464657085 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0a3811baa0 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.520+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1464657085 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0a3811bf40 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.521+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1464657085 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0a38062380 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:06.524+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0a2c004440 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.524+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a2c01acb0 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.524+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f0a2c011420 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.524+0000 7f0a35ffb700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 0x7f0a2003a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.525+0000 7f0a37fff700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 0x7f0a2003a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.525+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f0a2c04c910 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.525+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mon.0 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0a2c0502a0 con 0x7f0a3811b150 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.526+0000 7f0a37fff700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 0x7f0a2003a990 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f0a3000ad80 tx=0x7f0a300093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.638+0000 7f0a3ecbd700 1 -- 192.168.123.104:0/1464657085 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7f0a3806dc60 con 0x7f0a200384e0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.644+0000 7f0a35ffb700 1 -- 192.168.123.104:0/1464657085 <== mgr.14120 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7f0a3806dc60 con 0x7f0a200384e0 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 msgr2=0x7f0a2003a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 0x7f0a2003a990 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f0a3000ad80 tx=0x7f0a300093f0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.690 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 msgr2=0x7f0a3811b560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f0a2c00bdb0 tx=0x7f0a2c00be90 comp rx=0 tx=0).stop 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 shutdown_connections 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0a200384e0 0x7f0a2003a990 secure :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f0a3000ad80 tx=0x7f0a300093f0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 --2- 192.168.123.104:0/1464657085 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0a3811b150 0x7f0a3811b560 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.648+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 >> 192.168.123.104:0/1464657085 conn(0x7f0a3806c970 msgr2=0x7f0a3806d550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.649+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 
shutdown_connections 2026-03-10T06:17:06.690 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.649+0000 7f0a1f7fe700 1 -- 192.168.123.104:0/1464657085 wait complete. 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.855+0000 7f208824d700 1 Processor -- start 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f208824d700 1 -- start start 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f208824d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f208010ace0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f208824d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20801066d0 con 0x7f2080074510 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f2085fe9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f208010ace0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f2085fe9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f208010ace0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60772/0 (socket says 192.168.123.104:60772) 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f2085fe9700 1 -- 
192.168.123.104:0/4118215173 learned_addr learned my addr 192.168.123.104:0/4118215173 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.856+0000 7f2085fe9700 1 -- 192.168.123.104:0/4118215173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2080106810 con 0x7f2080074510 2026-03-10T06:17:07.025 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f2085fe9700 1 --2- 192.168.123.104:0/4118215173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f208010ace0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f2070009a90 tx=0x7f2070009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=14d2ff977dc8bf4e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f2084fe7700 1 -- 192.168.123.104:0/4118215173 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2070004030 con 0x7f2080074510 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f2084fe7700 1 -- 192.168.123.104:0/4118215173 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f207000b7e0 con 0x7f2080074510 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f208824d700 1 -- 192.168.123.104:0/4118215173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 msgr2=0x7f208010ace0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f208824d700 1 --2- 192.168.123.104:0/4118215173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f2080074510 0x7f208010ace0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f2070009a90 tx=0x7f2070009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f208824d700 1 -- 192.168.123.104:0/4118215173 shutdown_connections 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f208824d700 1 --2- 192.168.123.104:0/4118215173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f208010ace0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.857+0000 7f208824d700 1 -- 192.168.123.104:0/4118215173 >> 192.168.123.104:0/4118215173 conn(0x7f2080100270 msgr2=0x7f20801026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.858+0000 7f208824d700 1 -- 192.168.123.104:0/4118215173 shutdown_connections 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.858+0000 7f208824d700 1 -- 192.168.123.104:0/4118215173 wait complete. 
2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.858+0000 7f208824d700 1 Processor -- start 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.858+0000 7f208824d700 1 -- start start 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.858+0000 7f208824d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f208824d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20801066d0 con 0x7f2080074510 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2085fe9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2085fe9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60780/0 (socket says 192.168.123.104:60780) 2026-03-10T06:17:07.026 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2085fe9700 1 -- 192.168.123.104:0/3475266183 learned_addr learned my addr 192.168.123.104:0/3475266183 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:07.026 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2085fe9700 1 -- 192.168.123.104:0/3475266183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2070009740 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2085fe9700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f207000bd00 tx=0x7f207000bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2070003f60 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f20801a09e0 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.859+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f20801a0e80 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f20700045a0 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f2076ffd700 1 
-- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2070024dc0 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f207001b440 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f2076ffd700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 0x7f206c03a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f207004d080 con 0x7f2080074510 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.860+0000 7f20857e8700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 0x7f206c03a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.027 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.861+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2064005320 con 0x7f2080074510 2026-03-10T06:17:07.028 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.863+0000 7f20857e8700 1 --2- 192.168.123.104:0/3475266183 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 0x7f206c03a960 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f207c006fd0 tx=0x7f207c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.028 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.863+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f207001f080 con 0x7f2080074510 2026-03-10T06:17:07.028 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.965+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f2064005190 con 0x7f2080074510 2026-03-10T06:17:07.028 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.972+0000 7f2076ffd700 1 -- 192.168.123.104:0/3475266183 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f207004b0d0 con 0x7f2080074510 2026-03-10T06:17:07.028 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 msgr2=0x7f206c03a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.029 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 0x7f206c03a960 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f207c006fd0 tx=0x7f207c006e40 comp rx=0 tx=0).stop 
2026-03-10T06:17:07.029 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 msgr2=0x7f20801a04a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.029 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f207000bd00 tx=0x7f207000bde0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.029 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 shutdown_connections 2026-03-10T06:17:07.031 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f206c0384b0 0x7f206c03a960 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.031 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 --2- 192.168.123.104:0/3475266183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2080074510 0x7f20801a04a0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.031 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 >> 192.168.123.104:0/3475266183 conn(0x7f2080100270 msgr2=0x7f2080101d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:07.031 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 
shutdown_connections 2026-03-10T06:17:07.031 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:06.975+0000 7f208824d700 1 -- 192.168.123.104:0/3475266183 wait complete. 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: Saving service prometheus spec with placement count:1 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: Saving service grafana spec with placement count:1 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: Saving service node-exporter spec with placement * 2026-03-10T06:17:07.283 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:07.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:07.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 
ceph-mon[51058]: from='mgr.14120 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:07.284 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:07 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3475266183' entity='client.admin' 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.169+0000 7f2234fe4700 1 Processor -- start 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.169+0000 7f2234fe4700 1 -- start start 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.169+0000 7f2234fe4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.169+0000 7f2234fe4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2230108890 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60784/0 (socket says 192.168.123.104:60784) 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: 
stderr 2026-03-10T06:17:07.170+0000 7f222e59c700 1 -- 192.168.123.104:0/1590389903 learned_addr learned my addr 192.168.123.104:0/1590389903 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222e59c700 1 -- 192.168.123.104:0/1590389903 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22301089d0 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222e59c700 1 --2- 192.168.123.104:0/1590389903 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2220009cf0 tx=0x7f222000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=692d6a10a46b5ef server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222d59a700 1 -- 192.168.123.104:0/1590389903 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2220004030 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.170+0000 7f222d59a700 1 -- 192.168.123.104:0/1590389903 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f222000b810 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f222d59a700 1 -- 192.168.123.104:0/1590389903 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2220003b10 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 -- 192.168.123.104:0/1590389903 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 msgr2=0x7f2230108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 --2- 192.168.123.104:0/1590389903 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2220009cf0 tx=0x7f222000b0e0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 -- 192.168.123.104:0/1590389903 shutdown_connections 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 --2- 192.168.123.104:0/1590389903 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f2230108350 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 -- 192.168.123.104:0/1590389903 >> 192.168.123.104:0/1590389903 conn(0x7f223007b4b0 msgr2=0x7f223007b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.171+0000 7f2234fe4700 1 -- 192.168.123.104:0/1590389903 shutdown_connections 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f2234fe4700 1 -- 192.168.123.104:0/1590389903 wait complete. 
2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f2234fe4700 1 Processor -- start 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f2234fe4700 1 -- start start 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f2234fe4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f2234fe4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f223019c080 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.172+0000 7f222e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f222e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60798/0 (socket says 192.168.123.104:60798) 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f222e59c700 1 -- 192.168.123.104:0/3303711853 learned_addr learned my addr 192.168.123.104:0/3303711853 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:07.344 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f222e59c700 1 -- 192.168.123.104:0/3303711853 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2220009740 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f222e59c700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f22200037e0 tx=0x7f2220011770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2220011a10 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2220011b70 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f222001a520 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f223019c280 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.173+0000 7f2234fe4700 1 
-- 192.168.123.104:0/3303711853 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f223019c660 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.174+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f223004f9e0 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.177+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f2220011ce0 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.178+0000 7f221f7fe700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 0x7f221803aa30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.178+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f2220028030 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.178+0000 7f222dd9b700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 0x7f221803aa30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.178+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f222004c850 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.178+0000 7f222dd9b700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 0x7f221803aa30 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f2224006fd0 tx=0x7f2224006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.282+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7f223010c410 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.288+0000 7f221f7fe700 1 -- 192.168.123.104:0/3303711853 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7f222001a680 con 0x7f2230107f40 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 msgr2=0x7f221803aa30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 0x7f221803aa30 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f2224006fd0 tx=0x7f2224006e40 comp rx=0 tx=0).stop 
2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 msgr2=0x7f223019bb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f22200037e0 tx=0x7f2220011770 comp rx=0 tx=0).stop 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 shutdown_connections 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2218038580 0x7f221803aa30 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 --2- 192.168.123.104:0/3303711853 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2230107f40 0x7f223019bb40 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 >> 192.168.123.104:0/3303711853 conn(0x7f223007b4b0 msgr2=0x7f22301054d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 
shutdown_connections 2026-03-10T06:17:07.344 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.293+0000 7f2234fe4700 1 -- 192.168.123.104:0/3303711853 wait complete. 2026-03-10T06:17:07.345 INFO:teuthology.orchestra.run.vm04.stdout:Enabling the dashboard module... 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.469+0000 7fb34810f700 1 Processor -- start 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb34810f700 1 -- start start 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb34810f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb34810f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb340106660 con 0x7fb340105d10 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb345eab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb345eab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60802/0 (socket says 192.168.123.104:60802) 2026-03-10T06:17:08.338 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb345eab700 1 -- 192.168.123.104:0/1435566873 learned_addr learned my addr 192.168.123.104:0/1435566873 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.470+0000 7fb345eab700 1 -- 192.168.123.104:0/1435566873 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3401067a0 con 0x7fb340105d10 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb345eab700 1 --2- 192.168.123.104:0/1435566873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb330009cf0 tx=0x7fb33000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ecb1dacb6e62a30 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb344ea9700 1 -- 192.168.123.104:0/1435566873 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb330004030 con 0x7fb340105d10 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb344ea9700 1 -- 192.168.123.104:0/1435566873 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb33000b810 con 0x7fb340105d10 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb344ea9700 1 -- 192.168.123.104:0/1435566873 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb330003b10 con 0x7fb340105d10 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb34810f700 1 -- 
192.168.123.104:0/1435566873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 msgr2=0x7fb340106120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.471+0000 7fb34810f700 1 --2- 192.168.123.104:0/1435566873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb330009cf0 tx=0x7fb33000b0e0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- 192.168.123.104:0/1435566873 shutdown_connections 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 --2- 192.168.123.104:0/1435566873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb340106120 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- 192.168.123.104:0/1435566873 >> 192.168.123.104:0/1435566873 conn(0x7fb340101360 msgr2=0x7fb340103790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- 192.168.123.104:0/1435566873 shutdown_connections 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- 192.168.123.104:0/1435566873 wait complete. 
2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 Processor -- start 2026-03-10T06:17:08.338 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- start start 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.472+0000 7fb34810f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb34019c1b0 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb345eab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb345eab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60804/0 (socket says 192.168.123.104:60804) 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb345eab700 1 -- 192.168.123.104:0/2875191519 learned_addr learned my addr 192.168.123.104:0/2875191519 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:08.339 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb345eab700 1 -- 192.168.123.104:0/2875191519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb330009740 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb345eab700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb330009cc0 tx=0x7fb330011840 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb330011a60 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb330011bc0 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb33001a500 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb34019c3b0 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.473+0000 7fb34810f700 1 
-- 192.168.123.104:0/2875191519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb34019c850 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.474+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fb33001a660 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.474+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb34004f9e0 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.475+0000 7fb336ffd700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 0x7fb32c03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.475+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb33004c090 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.475+0000 7fb3456aa700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 0x7fb32c03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.476+0000 7fb3456aa700 1 --2- 192.168.123.104:0/2875191519 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 0x7fb32c03a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb33c006fd0 tx=0x7fb33c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.478+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb33002c700 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:07.609+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7fb340062380 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.292+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fb330011d30 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.293+0000 7fb336ffd700 1 -- 192.168.123.104:0/2875191519 <== mon.0 v2:192.168.123.104:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7fb330022720 con 0x7fb340105d10 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.297+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 msgr2=0x7fb32c03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:08.297+0000 7fb34810f700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 0x7fb32c03a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb33c006fd0 tx=0x7fb33c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.297+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 msgr2=0x7fb34019bc70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.297+0000 7fb34810f700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb330009cc0 tx=0x7fb330011840 comp rx=0 tx=0).stop 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 shutdown_connections 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb32c038450 0x7fb32c03a900 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 --2- 192.168.123.104:0/2875191519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb340105d10 0x7fb34019bc70 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 >> 
192.168.123.104:0/2875191519 conn(0x7fb340101360 msgr2=0x7fb340102020 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 shutdown_connections 2026-03-10T06:17:08.339 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.301+0000 7fb34810f700 1 -- 192.168.123.104:0/2875191519 wait complete. 2026-03-10T06:17:08.562 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:08 vm04 ceph-mon[51058]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:08.562 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:08 vm04 ceph-mon[51058]: Saving service alertmanager spec with placement count:1 2026-03-10T06:17:08.562 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:08 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3303711853' entity='client.admin' 2026-03-10T06:17:08.562 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:08 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2875191519' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "epoch": 9, 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "active_name": "vm04.exdvdb", 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.501+0000 7f235f59e700 1 Processor -- start 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.501+0000 7f235f59e700 1 -- start start 2026-03-10T06:17:08.720 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.501+0000 7f235f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.501+0000 7f235f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2360071d60 con 0x7f2360071410 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.501+0000 7f235e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:08.721 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.502+0000 7f235e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60828/0 (socket says 192.168.123.104:60828) 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.502+0000 7f235e59c700 1 -- 192.168.123.104:0/1313682274 learned_addr learned my addr 192.168.123.104:0/1313682274 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.502+0000 7f235e59c700 1 -- 192.168.123.104:0/1313682274 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2360071ea0 con 0x7f2360071410 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235e59c700 1 --2- 192.168.123.104:0/1313682274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f2350009480 tx=0x7f2350009790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d72ead0c960ed853 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235d59a700 1 -- 192.168.123.104:0/1313682274 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2350004030 con 0x7f2360071410 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235d59a700 1 -- 192.168.123.104:0/1313682274 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f235000c8f0 con 0x7f2360071410 
2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235f59e700 1 -- 192.168.123.104:0/1313682274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 msgr2=0x7f2360071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235f59e700 1 --2- 192.168.123.104:0/1313682274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f2350009480 tx=0x7f2350009790 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235f59e700 1 -- 192.168.123.104:0/1313682274 shutdown_connections 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235f59e700 1 --2- 192.168.123.104:0/1313682274 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2360071410 0x7f2360071820 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.503+0000 7f235f59e700 1 -- 192.168.123.104:0/1313682274 >> 192.168.123.104:0/1313682274 conn(0x7f236006c9d0 msgr2=0x7f236006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 -- 192.168.123.104:0/1313682274 shutdown_connections 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 -- 192.168.123.104:0/1313682274 wait complete. 
2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 Processor -- start 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 -- start start 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2350013070 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60842/0 (socket says 192.168.123.104:60842) 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.504+0000 7f235e59c700 1 -- 192.168.123.104:0/326564844 learned_addr learned my addr 192.168.123.104:0/326564844 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:08.721 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.505+0000 7f235e59c700 1 -- 192.168.123.104:0/326564844 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2350009160 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.505+0000 7f235e59c700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f23500038d0 tx=0x7f2350003e10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.505+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2350004060 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.505+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2360112ac0 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.505+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2360111450 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.506+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f235001c070 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.506+0000 7f234f7fe700 1 -- 
192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23500183f0 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.507+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f236004f030 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.510+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f235001e030 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.510+0000 7f234f7fe700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 0x7f234803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.510+0000 7f235dd9b700 1 -- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 msgr2=0x7f234803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.510+0000 7f235dd9b700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 0x7f234803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.510+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 
v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f235004ba90 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.512+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2350010780 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.649+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f23601117f0 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.650+0000 7f234f7fe700 1 -- 192.168.123.104:0/326564844 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7f2350029390 con 0x7f23601126b0 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.653+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 msgr2=0x7f234803a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.653+0000 7f235f59e700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 0x7f234803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.653+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 msgr2=0x7f2360110d40 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.653+0000 7f235f59e700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f23500038d0 tx=0x7f2350003e10 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.654+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 shutdown_connections 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.654+0000 7f235f59e700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2348038500 0x7f234803a9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.654+0000 7f235f59e700 1 --2- 192.168.123.104:0/326564844 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f23601126b0 0x7f2360110d40 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.654+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 >> 192.168.123.104:0/326564844 conn(0x7f236006c9d0 msgr2=0x7f236006d490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.655+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 shutdown_connections 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.655+0000 7f235f59e700 1 -- 192.168.123.104:0/326564844 wait complete. 
2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for the mgr to restart... 2026-03-10T06:17:08.721 INFO:teuthology.orchestra.run.vm04.stdout:Waiting for mgr epoch 9... 2026-03-10T06:17:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:09 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2875191519' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-10T06:17:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:09 vm04 ceph-mon[51058]: mgrmap e9: vm04.exdvdb(active, since 9s) 2026-03-10T06:17:09.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:09 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/326564844' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: Activating manager daemon vm04.exdvdb 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: osdmap e3: 0 total, 0 up, 0 in 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: mgrmap e10: vm04.exdvdb(active, starting, since 0.00434398s) 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:17:13.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:13 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:17:14.193 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout { 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout } 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd44b5700 1 Processor -- start 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:08.879+0000 7f4bd44b5700 1 -- start start 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd44b5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd44b5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4bcc0728b0 con 0x7f4bcc071200 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd2251700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd2251700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60850/0 (socket says 192.168.123.104:60850) 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd2251700 1 -- 192.168.123.104:0/3529111522 learned_addr learned my addr 192.168.123.104:0/3529111522 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd2251700 1 -- 192.168.123.104:0/3529111522 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4bcc0729f0 con 0x7f4bcc071200 
2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd2251700 1 --2- 192.168.123.104:0/3529111522 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4bc8009a90 tx=0x7f4bc8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d995e7a31ae10f71 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.879+0000 7f4bd124f700 1 -- 192.168.123.104:0/3529111522 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4bc8004030 con 0x7f4bcc071200 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd124f700 1 -- 192.168.123.104:0/3529111522 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4bc800b7e0 con 0x7f4bcc071200 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 -- 192.168.123.104:0/3529111522 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 msgr2=0x7f4bcc071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 --2- 192.168.123.104:0/3529111522 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4bc8009a90 tx=0x7f4bc8009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 -- 192.168.123.104:0/3529111522 shutdown_connections 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 --2- 192.168.123.104:0/3529111522 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc071200 0x7f4bcc071610 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 -- 192.168.123.104:0/3529111522 >> 192.168.123.104:0/3529111522 conn(0x7f4bcc06cc30 msgr2=0x7f4bcc06f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 -- 192.168.123.104:0/3529111522 shutdown_connections 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.881+0000 7f4bd44b5700 1 -- 192.168.123.104:0/3529111522 wait complete. 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd44b5700 1 Processor -- start 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd44b5700 1 -- start start 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd44b5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd44b5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4bcc0728b0 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd2251700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd2251700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60854/0 (socket says 192.168.123.104:60854) 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.882+0000 7f4bd2251700 1 -- 192.168.123.104:0/186747058 learned_addr learned my addr 192.168.123.104:0/186747058 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.883+0000 7f4bd2251700 1 -- 192.168.123.104:0/186747058 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4bc8009740 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.883+0000 7f4bd2251700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f4bc800bdb0 tx=0x7f4bc800be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.884+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4bc8003710 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.884+0000 7f4bd44b5700 1 -- 192.168.123.104:0/186747058 
--> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4bcc1a9310 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.884+0000 7f4bd44b5700 1 -- 192.168.123.104:0/186747058 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4bcc07b000 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.884+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4bc8004490 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.884+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4bc801ace0 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.885+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f4bc801a680 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.886+0000 7f4bc37fe700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.195 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.886+0000 7f4bd1a50700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:14.195 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.886+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.886+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4bb803b0c0 con 0x7f4bb8038500 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:08.886+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4bc804b7c0 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:09.086+0000 7f4bd1a50700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:09.086+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:09.486+0000 7f4bd1a50700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed 
to v2:192.168.123.104:6800/2 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:09.486+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:10.287+0000 7f4bd1a50700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:10.287+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:11.889+0000 7f4bd1a50700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:11.889+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:13.126+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== 
mon.0 v2:192.168.123.104:3300/0 6 ==== mgrmap(e 10) v1 ==== 44859+0+0 (secure 0 0 0) 0x7f4bc801fd70 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:13.126+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:13.126+0000 7f4bc37fe700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.128+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f4bc804d220 con 0x7f4bcc1a89c0 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.128+0000 7f4bc37fe700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.129+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4bb803b0c0 con 0x7f4bb8038500 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.129+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.131+0000 7f4bd1a50700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f4bbc003a10 tx=0x7f4bbc0092b0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.131+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f4bb803b0c0 con 0x7f4bb8038500 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.135+0000 7f4bd44b5700 1 -- 192.168.123.104:0/186747058 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f4bcc1a9d40 con 0x7f4bb8038500 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.136+0000 7f4bc37fe700 1 -- 192.168.123.104:0/186747058 <== mgr.14164 v2:192.168.123.104:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f4bcc1a9d40 con 0x7f4bb8038500 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 msgr2=0x7f4bb803a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 --2- 192.168.123.104:0/186747058 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f4bbc003a10 tx=0x7f4bbc0092b0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 msgr2=0x7f4bcc1a8dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f4bc800bdb0 tx=0x7f4bc800be90 comp rx=0 tx=0).stop 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 shutdown_connections 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4bb8038500 0x7f4bb803a9b0 secure :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f4bbc003a10 tx=0x7f4bbc0092b0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 --2- 192.168.123.104:0/186747058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4bcc1a89c0 0x7f4bcc1a8dd0 secure :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f4bc800bdb0 tx=0x7f4bc800be90 comp rx=0 tx=0).stop 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 >> 192.168.123.104:0/186747058 conn(0x7f4bcc06cc30 
msgr2=0x7f4bcc1125b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 shutdown_connections 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.137+0000 7f4bc17fa700 1 -- 192.168.123.104:0/186747058 wait complete. 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:mgr epoch 9 is available 2026-03-10T06:17:14.196 INFO:teuthology.orchestra.run.vm04.stdout:Generating a dashboard self-signed certificate... 2026-03-10T06:17:14.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:14 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:17:14.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:14 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:14.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:14 vm04 ceph-mon[51058]: mgrmap e11: vm04.exdvdb(active, since 1.00984s) 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.368+0000 7f3d20b5b700 1 Processor -- start 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d20b5b700 1 -- start start 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d20b5b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.559 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d20b5b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d1c108870 con 0x7f3d1c107f20 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d1a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d1a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35570/0 (socket says 192.168.123.104:35570) 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.369+0000 7f3d1a59c700 1 -- 192.168.123.104:0/2035763906 learned_addr learned my addr 192.168.123.104:0/2035763906 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.370+0000 7f3d1a59c700 1 -- 192.168.123.104:0/2035763906 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d1c1089b0 con 0x7f3d1c107f20 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.370+0000 7f3d1a59c700 1 --2- 192.168.123.104:0/2035763906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3d04009cf0 tx=0x7f3d0400b0e0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=d3870e7b4176cbaa server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.370+0000 7f3d1959a700 1 -- 192.168.123.104:0/2035763906 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d04004030 con 0x7f3d1c107f20 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.370+0000 7f3d1959a700 1 -- 192.168.123.104:0/2035763906 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d0400b810 con 0x7f3d1c107f20 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d1959a700 1 -- 192.168.123.104:0/2035763906 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d04003b10 con 0x7f3d1c107f20 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 -- 192.168.123.104:0/2035763906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 msgr2=0x7f3d1c108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/2035763906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3d04009cf0 tx=0x7f3d0400b0e0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 -- 192.168.123.104:0/2035763906 shutdown_connections 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/2035763906 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c107f20 0x7f3d1c108330 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 -- 192.168.123.104:0/2035763906 >> 192.168.123.104:0/2035763906 conn(0x7f3d1c07b4b0 msgr2=0x7f3d1c07b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 -- 192.168.123.104:0/2035763906 shutdown_connections 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.371+0000 7f3d20b5b700 1 -- 192.168.123.104:0/2035763906 wait complete. 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d20b5b700 1 Processor -- start 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d20b5b700 1 -- start start 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d20b5b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d20b5b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d1c07cff0 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d1a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d1a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35584/0 (socket says 192.168.123.104:35584) 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.372+0000 7f3d1a59c700 1 -- 192.168.123.104:0/1277621123 learned_addr learned my addr 192.168.123.104:0/1277621123 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.373+0000 7f3d1a59c700 1 -- 192.168.123.104:0/1277621123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d04009740 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.373+0000 7f3d1a59c700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3d04003920 tx=0x7f3d04011770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.373+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d04011980 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.373+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d1c07d1f0 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.373+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d1c07fed0 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.374+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d0401a440 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.374+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d0401b440 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.374+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f3d0401b620 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.375+0000 7f3d137fe700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 0x7f3d0803a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.375+0000 7f3d19d9b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 0x7f3d0803a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:14.559 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.375+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d1c04f9e0 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.375+0000 7f3d19d9b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 0x7f3d0803a7f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f3d0c006fd0 tx=0x7f3d0c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.378+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3d04021030 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.378+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3d0407f720 con 0x7f3d1c07c6a0 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.498+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f3d1c105d90 con 0x7f3d08038340 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.516+0000 7f3d137fe700 1 -- 192.168.123.104:0/1277621123 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 
0x7f3d1c105d90 con 0x7f3d08038340 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 msgr2=0x7f3d0803a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 0x7f3d0803a7f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f3d0c006fd0 tx=0x7f3d0c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 msgr2=0x7f3d1c07cab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3d04003920 tx=0x7f3d04011770 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 shutdown_connections 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3d08038340 0x7f3d0803a7f0 secure :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f3d0c006fd0 tx=0x7f3d0c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 --2- 192.168.123.104:0/1277621123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d1c07c6a0 0x7f3d1c07cab0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:14.559 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.518+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 >> 192.168.123.104:0/1277621123 conn(0x7f3d1c07b4b0 msgr2=0x7f3d1c105680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:14.560 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.519+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 shutdown_connections 2026-03-10T06:17:14.560 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.519+0000 7f3d20b5b700 1 -- 192.168.123.104:0/1277621123 wait complete. 2026-03-10T06:17:14.560 INFO:teuthology.orchestra.run.vm04.stdout:Creating initial admin user... 
2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$YCGXnrAgdpXKNL9552oSpOxL8lcZrYphH9aFSJhGlMuXnrDM4BfC6", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773123435, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd90e80b700 1 Processor -- start 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd90e80b700 1 -- start start 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd90e80b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd90e80b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd900095b00 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd907fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd907fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35586/0 (socket says 
192.168.123.104:35586) 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.734+0000 7fd907fff700 1 -- 192.168.123.104:0/244595668 learned_addr learned my addr 192.168.123.104:0/244595668 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd907fff700 1 -- 192.168.123.104:0/244595668 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd900096320 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd907fff700 1 --2- 192.168.123.104:0/244595668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd8fc009cf0 tx=0x7fd8fc00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e93b99936ad4a083 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd906ffd700 1 -- 192.168.123.104:0/244595668 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8fc004030 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd906ffd700 1 -- 192.168.123.104:0/244595668 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd8fc00b810 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd906ffd700 1 -- 192.168.123.104:0/244595668 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8fc003b10 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 
2026-03-10T06:17:14.735+0000 7fd90e80b700 1 -- 192.168.123.104:0/244595668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 msgr2=0x7fd900095530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd90e80b700 1 --2- 192.168.123.104:0/244595668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd8fc009cf0 tx=0x7fd8fc00b0e0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd90e80b700 1 -- 192.168.123.104:0/244595668 shutdown_connections 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd90e80b700 1 --2- 192.168.123.104:0/244595668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900095530 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.735+0000 7fd90e80b700 1 -- 192.168.123.104:0/244595668 >> 192.168.123.104:0/244595668 conn(0x7fd9000906d0 msgr2=0x7fd900092b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 -- 192.168.123.104:0/244595668 shutdown_connections 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 -- 192.168.123.104:0/244595668 wait complete. 
2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 Processor -- start 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 -- start start 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd90e80b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd900129300 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd907fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd907fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35590/0 (socket says 192.168.123.104:35590) 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.736+0000 7fd907fff700 1 -- 192.168.123.104:0/1607655157 learned_addr learned my addr 192.168.123.104:0/1607655157 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.082 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.737+0000 7fd907fff700 1 -- 192.168.123.104:0/1607655157 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8fc009740 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.737+0000 7fd907fff700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fd8fc000c00 tx=0x7fd8fc011700 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.737+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8fc0118e0 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.737+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd8fc023450 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.737+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8fc01a430 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.738+0000 7fd90e80b700 1 -- 192.168.123.104:0/1607655157 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd900129500 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.738+0000 7fd9057fa700 1 
-- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fd8fc011a40 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.739+0000 7fd9057fa700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 0x7fd8f803a4e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.739+0000 7fd9077fe700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 0x7fd8f803a4e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.739+0000 7fd9077fe700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 0x7fd8f803a4e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4006fd0 tx=0x7fd8f4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.740+0000 7fd90e80b700 1 -- 192.168.123.104:0/1607655157 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd9001298e0 con 0x7fd900095120 2026-03-10T06:17:15.082 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.740+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fd8fc018710 con 0x7fd900095120 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: 
stderr 2026-03-10T06:17:14.743+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd900004850 con 0x7fd900095120 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.745+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd8fc018920 con 0x7fd900095120 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:14.872+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7fd900003640 con 0x7fd8f8038030 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.026+0000 7fd9057fa700 1 -- 192.168.123.104:0/1607655157 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7fd900003640 con 0x7fd8f8038030 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 msgr2=0x7fd8f803a4e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 0x7fd8f803a4e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4006fd0 tx=0x7fd8f4006e40 
comp rx=0 tx=0).stop 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 msgr2=0x7fd900128dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fd8fc000c00 tx=0x7fd8fc011700 comp rx=0 tx=0).stop 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 shutdown_connections 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd8f8038030 0x7fd8f803a4e0 secure :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4006fd0 tx=0x7fd8f4006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 --2- 192.168.123.104:0/1607655157 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd900095120 0x7fd900128dc0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.029+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 >> 192.168.123.104:0/1607655157 conn(0x7fd9000906d0 msgr2=0x7fd9000912e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.030+0000 7fd8f2ffd700 
1 -- 192.168.123.104:0/1607655157 shutdown_connections 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.030+0000 7fd8f2ffd700 1 -- 192.168.123.104:0/1607655157 wait complete. 2026-03-10T06:17:15.083 INFO:teuthology.orchestra.run.vm04.stdout:Fetching dashboard port number... 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stdout 8443 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.214+0000 7fd0fef09700 1 Processor -- start 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.214+0000 7fd0fef09700 1 -- start start 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fef09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fef09700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f8073c10 con 0x7fd0f8073800 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fcca5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fcca5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35604/0 (socket says 192.168.123.104:35604) 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fcca5700 1 -- 192.168.123.104:0/187296051 learned_addr learned my addr 192.168.123.104:0/187296051 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.215+0000 7fd0fcca5700 1 -- 192.168.123.104:0/187296051 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0f8073d50 con 0x7fd0f8073800 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.216+0000 7fd0fcca5700 1 --2- 192.168.123.104:0/187296051 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4009cf0 tx=0x7fd0f400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d7d470dd992d6667 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.216+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/187296051 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd0f4004030 con 0x7fd0f8073800 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.216+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/187296051 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0f400b810 con 0x7fd0f8073800 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.216+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/187296051 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd0f4003b10 con 0x7fd0f8073800 2026-03-10T06:17:15.400 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 -- 192.168.123.104:0/187296051 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 msgr2=0x7fd0f81043a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 --2- 192.168.123.104:0/187296051 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4009cf0 tx=0x7fd0f400b0e0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 -- 192.168.123.104:0/187296051 shutdown_connections 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 --2- 192.168.123.104:0/187296051 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8073800 0x7fd0f81043a0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 -- 192.168.123.104:0/187296051 >> 192.168.123.104:0/187296051 conn(0x7fd0f80ffa10 msgr2=0x7fd0f8101e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 -- 192.168.123.104:0/187296051 shutdown_connections 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.217+0000 7fd0fef09700 1 -- 192.168.123.104:0/187296051 wait complete. 
2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.218+0000 7fd0fef09700 1 Processor -- start 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.218+0000 7fd0fef09700 1 -- start start 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.218+0000 7fd0fef09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.218+0000 7fd0fef09700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f81a0c00 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.218+0000 7fd0fcca5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.219+0000 7fd0fcca5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35610/0 (socket says 192.168.123.104:35610) 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.219+0000 7fd0fcca5700 1 -- 192.168.123.104:0/967445111 learned_addr learned my addr 192.168.123.104:0/967445111 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.400 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.219+0000 7fd0fcca5700 1 -- 192.168.123.104:0/967445111 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0f4009740 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.219+0000 7fd0fcca5700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4009cc0 tx=0x7fd0f401b800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.219+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd0f401ba20 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.220+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0f401bb80 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.220+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd0f401c450 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.220+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd0f81a0e00 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.220+0000 7fd0fef09700 1 -- 
192.168.123.104:0/967445111 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd0f81a3a60 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.221+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fd0f4022070 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.221+0000 7fd0edffb700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 0x7fd0e003a840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.221+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fd0f404d4f0 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.221+0000 7fd0effff700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 0x7fd0e003a840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.221+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd0f8191280 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.223+0000 7fd0effff700 1 --2- 192.168.123.104:0/967445111 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 0x7fd0e003a840 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd0e8006fd0 tx=0x7fd0e8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.225+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd0f401c5b0 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.337+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7fd0f8062380 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.338+0000 7fd0edffb700 1 -- 192.168.123.104:0/967445111 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7fd0f8062380 con 0x7fd0f81a02b0 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 msgr2=0x7fd0e003a840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 0x7fd0e003a840 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd0e8006fd0 tx=0x7fd0e8006e40 comp 
rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 msgr2=0x7fd0f81a06c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4009cc0 tx=0x7fd0f401b800 comp rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 shutdown_connections 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e0038390 0x7fd0e003a840 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 --2- 192.168.123.104:0/967445111 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f81a02b0 0x7fd0f81a06c0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 >> 192.168.123.104:0/967445111 conn(0x7fd0f80ffa10 msgr2=0x7fd0f818be00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 
shutdown_connections 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.342+0000 7fd0fef09700 1 -- 192.168.123.104:0/967445111 wait complete. 2026-03-10T06:17:15.400 INFO:teuthology.orchestra.run.vm04.stdout:firewalld does not appear to be present 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout:Ceph Dashboard is now available at: 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout: URL: https://vm04.local:8443/ 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout: User: admin 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout: Password: wfz3dnui0d 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:15.401 INFO:teuthology.orchestra.run.vm04.stdout:Saving cluster configuration to /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config directory 2026-03-10T06:17:15.402 INFO:teuthology.orchestra.run.vm04.stdout:Enabling autotune for osd_memory_target 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: [10/Mar/2026:06:17:13] ENGINE Bus STARTING 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: [10/Mar/2026:06:17:13] ENGINE Serving on http://192.168.123.104:8765 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: [10/Mar/2026:06:17:13] ENGINE Serving on https://192.168.123.104:7150 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: [10/Mar/2026:06:17:13] ENGINE Bus STARTED 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='client.14168 -' 
entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:15.654 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:15.655 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:15 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/967445111' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.539+0000 7f7564e89700 1 Processor -- start 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.539+0000 7f7564e89700 1 -- start start 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.540+0000 7f7564e89700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.540+0000 7f7564e89700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f75600745b0 con 0x7f7560106830 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.540+0000 7f755e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.540+0000 7f755e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35616/0 (socket says 192.168.123.104:35616) 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.540+0000 7f755e59c700 1 -- 192.168.123.104:0/2196913602 learned_addr learned my 
addr 192.168.123.104:0/2196913602 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.541+0000 7f755e59c700 1 -- 192.168.123.104:0/2196913602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f75600746f0 con 0x7f7560106830 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.541+0000 7f755e59c700 1 --2- 192.168.123.104:0/2196913602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7548009a90 tx=0x7f7548009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7525172cd7967a6a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.541+0000 7f755d59a700 1 -- 192.168.123.104:0/2196913602 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7548004030 con 0x7f7560106830 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.541+0000 7f755d59a700 1 -- 192.168.123.104:0/2196913602 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f754800b7e0 con 0x7f7560106830 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.541+0000 7f755d59a700 1 -- 192.168.123.104:0/2196913602 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7548003b30 con 0x7f7560106830 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 -- 192.168.123.104:0/2196913602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 msgr2=0x7f7560108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 --2- 192.168.123.104:0/2196913602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7548009a90 tx=0x7f7548009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 -- 192.168.123.104:0/2196913602 shutdown_connections 2026-03-10T06:17:15.711 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 --2- 192.168.123.104:0/2196913602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f7560108c10 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 -- 192.168.123.104:0/2196913602 >> 192.168.123.104:0/2196913602 conn(0x7f7560100270 msgr2=0x7f75601026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 -- 192.168.123.104:0/2196913602 shutdown_connections 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.542+0000 7f7564e89700 1 -- 192.168.123.104:0/2196913602 wait complete. 
2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f7564e89700 1 Processor -- start 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f7564e89700 1 -- start start 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f7564e89700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f7564e89700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f756019c170 con 0x7f7560106830 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f755e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f755e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35630/0 (socket says 192.168.123.104:35630) 2026-03-10T06:17:15.712 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.543+0000 7f755e59c700 1 -- 192.168.123.104:0/3880599649 learned_addr learned my addr 192.168.123.104:0/3880599649 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:15.713 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f755e59c700 1 -- 192.168.123.104:0/3880599649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7548009740 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f755e59c700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7548000c00 tx=0x7f754800bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f75480040f0 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7548004250 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f756019c370 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.544+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7548011560 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.545+0000 7f7564e89700 1 
-- 192.168.123.104:0/3880599649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f756019c810 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.545+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f7548011790 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.546+0000 7f75577fe700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 0x7f754c03a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.546+0000 7f755dd9b700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 0x7f754c03a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.546+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f754804ca50 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.546+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7560062380 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.549+0000 7f755dd9b700 1 --2- 192.168.123.104:0/3880599649 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 0x7f754c03a850 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7550006fd0 tx=0x7f7550006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.549+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f754801e070 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.652+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f756019f190 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.655+0000 7f75577fe700 1 -- 192.168.123.104:0/3880599649 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f754804b0c0 con 0x7f7560106830 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.657+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 msgr2=0x7f754c03a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.657+0000 7f7564e89700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 0x7f754c03a850 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7550006fd0 tx=0x7f7550006e40 comp rx=0 tx=0).stop 
2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.657+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 msgr2=0x7f756019bc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.657+0000 7f7564e89700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7548000c00 tx=0x7f754800bfa0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 shutdown_connections 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f754c0383a0 0x7f754c03a850 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 --2- 192.168.123.104:0/3880599649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7560106830 0x7f756019bc30 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 >> 192.168.123.104:0/3880599649 conn(0x7f7560100270 msgr2=0x7f7560100f30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 
shutdown_connections 2026-03-10T06:17:15.713 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.658+0000 7f7564e89700 1 -- 192.168.123.104:0/3880599649 wait complete. 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.837+0000 7f468d0ce700 1 Processor -- start 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f468d0ce700 1 -- start start 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f468d0ce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f468d0ce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4688108890 con 0x7f4688107f40 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f4686d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f4686d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35632/0 (socket says 192.168.123.104:35632) 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.838+0000 7f4686d9d700 1 -- 
192.168.123.104:0/306993798 learned_addr learned my addr 192.168.123.104:0/306993798 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f4686d9d700 1 -- 192.168.123.104:0/306993798 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f46881089d0 con 0x7f4688107f40 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f4686d9d700 1 --2- 192.168.123.104:0/306993798 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f4670009a90 tx=0x7f4670009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=20f88d9eb851b06f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f4685d9b700 1 -- 192.168.123.104:0/306993798 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4670004030 con 0x7f4688107f40 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f4685d9b700 1 -- 192.168.123.104:0/306993798 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f467000b7e0 con 0x7f4688107f40 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f4685d9b700 1 -- 192.168.123.104:0/306993798 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4670003ae0 con 0x7f4688107f40 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f468d0ce700 1 -- 192.168.123.104:0/306993798 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 msgr2=0x7f4688108350 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.839+0000 7f468d0ce700 1 --2- 192.168.123.104:0/306993798 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f4670009a90 tx=0x7f4670009da0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 -- 192.168.123.104:0/306993798 shutdown_connections 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 --2- 192.168.123.104:0/306993798 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f4688108350 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 -- 192.168.123.104:0/306993798 >> 192.168.123.104:0/306993798 conn(0x7f468807b4b0 msgr2=0x7f468807b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 -- 192.168.123.104:0/306993798 shutdown_connections 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 -- 192.168.123.104:0/306993798 wait complete. 
2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 Processor -- start 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.840+0000 7f468d0ce700 1 -- start start 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f468d0ce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:16.039 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f468d0ce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f46881a0440 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f4686d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f4686d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35646/0 (socket says 192.168.123.104:35646) 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f4686d9d700 1 -- 192.168.123.104:0/201033957 learned_addr learned my addr 192.168.123.104:0/201033957 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:16.040 
INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f4686d9d700 1 -- 192.168.123.104:0/201033957 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4670009740 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f4686d9d700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f4670000c00 tx=0x7f467000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f46700041a0 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4670004300 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4670011550 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f46881a0640 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.841+0000 7f468d0ce700 1 -- 
192.168.123.104:0/201033957 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f46881a0ae0 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.843+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f468804f9e0 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.844+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f46700116b0 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.844+0000 7f467ffff700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 0x7f467403a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.844+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f467004d0e0 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.846+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f467007f720 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.846+0000 7f468659c700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 0x7f467403a960 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:15.846+0000 7f468659c700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 0x7f467403a960 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f4678006fd0 tx=0x7f4678006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.004+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f4688062380 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.007+0000 7f467ffff700 1 -- 192.168.123.104:0/201033957 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f4670029330 con 0x7f4688107f40 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.009+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 msgr2=0x7f467403a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.009+0000 7f468d0ce700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 0x7f467403a960 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f4678006fd0 tx=0x7f4678006e40 
comp rx=0 tx=0).stop 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.009+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 msgr2=0x7f468819ff00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.009+0000 7f468d0ce700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f4670000c00 tx=0x7f467000bfa0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 shutdown_connections 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f46740384b0 0x7f467403a960 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 --2- 192.168.123.104:0/201033957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4688107f40 0x7f468819ff00 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 >> 192.168.123.104:0/201033957 conn(0x7f468807b4b0 msgr2=0x7f46881064f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 -- 
192.168.123.104:0/201033957 shutdown_connections 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr 2026-03-10T06:17:16.010+0000 7f468d0ce700 1 -- 192.168.123.104:0/201033957 wait complete. 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: ceph telemetry on 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:For more information see: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:16.040 INFO:teuthology.orchestra.run.vm04.stdout:Bootstrap complete. 
2026-03-10T06:17:16.073 INFO:tasks.cephadm:Fetching config... 2026-03-10T06:17:16.073 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:17:16.073 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-10T06:17:16.099 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-10T06:17:16.111 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:17:16.111 DEBUG:teuthology.orchestra.run.vm04:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-10T06:17:16.173 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-10T06:17:16.173 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:17:16.173 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/keyring of=/dev/stdout 2026-03-10T06:17:16.242 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-10T06:17:16.242 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:17:16.242 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-10T06:17:16.300 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-10T06:17:16.300 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDB23dv+HsXxEfOHeT6j+84qNSP4Y7ONSep6uvtL71jND6dXQ8eR/GSMTGbbmy8ec0MPvVSTBrriUxdPW3Kqh4kl30X+Y2GuZdTR5dZGB5/gO5eVOg6al1oJpRs/cPIPfSxNlcFUXrRW9VHfuGY5fONK9hqFzPSK1Q/9jan/WHO7mag42yyp5ZH4kE8FHxHTjKTlgeCajuBfOUD+gXWxmoQBbICcKi+IO3+EzQTRh7oBqTD6FHZYWA/tq1uN1M2ZurBcUQOhnRVcKA17m49fIyYdh1JtGdydZ7sO59iSzw736+LDth0UxfwItidMKFhCtSGTJuCp+L0dk1Lzq+B6Bq2h1Id83lqkzCxMgn0H80e0+d8Wd61tCqaCkXG+U3xdfJKJFtn0EfGBeoFmdxrgYceqFKoS/sK107w1vLODuFqUswWc2GRYO878c9i1AgU1vc3PjDvbfgh8ufq4cjleARoqqJLoqVGKJ4vKZzen32DOvJXQ8aQaYESMdv7hUbsiyE= ceph-9c59102a-1c48-11f1-b618-035af535377d' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T06:17:16.392 INFO:teuthology.orchestra.run.vm04.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDB23dv+HsXxEfOHeT6j+84qNSP4Y7ONSep6uvtL71jND6dXQ8eR/GSMTGbbmy8ec0MPvVSTBrriUxdPW3Kqh4kl30X+Y2GuZdTR5dZGB5/gO5eVOg6al1oJpRs/cPIPfSxNlcFUXrRW9VHfuGY5fONK9hqFzPSK1Q/9jan/WHO7mag42yyp5ZH4kE8FHxHTjKTlgeCajuBfOUD+gXWxmoQBbICcKi+IO3+EzQTRh7oBqTD6FHZYWA/tq1uN1M2ZurBcUQOhnRVcKA17m49fIyYdh1JtGdydZ7sO59iSzw736+LDth0UxfwItidMKFhCtSGTJuCp+L0dk1Lzq+B6Bq2h1Id83lqkzCxMgn0H80e0+d8Wd61tCqaCkXG+U3xdfJKJFtn0EfGBeoFmdxrgYceqFKoS/sK107w1vLODuFqUswWc2GRYO878c9i1AgU1vc3PjDvbfgh8ufq4cjleARoqqJLoqVGKJ4vKZzen32DOvJXQ8aQaYESMdv7hUbsiyE= ceph-9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:17:16.409 DEBUG:teuthology.orchestra.run.vm06:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDB23dv+HsXxEfOHeT6j+84qNSP4Y7ONSep6uvtL71jND6dXQ8eR/GSMTGbbmy8ec0MPvVSTBrriUxdPW3Kqh4kl30X+Y2GuZdTR5dZGB5/gO5eVOg6al1oJpRs/cPIPfSxNlcFUXrRW9VHfuGY5fONK9hqFzPSK1Q/9jan/WHO7mag42yyp5ZH4kE8FHxHTjKTlgeCajuBfOUD+gXWxmoQBbICcKi+IO3+EzQTRh7oBqTD6FHZYWA/tq1uN1M2ZurBcUQOhnRVcKA17m49fIyYdh1JtGdydZ7sO59iSzw736+LDth0UxfwItidMKFhCtSGTJuCp+L0dk1Lzq+B6Bq2h1Id83lqkzCxMgn0H80e0+d8Wd61tCqaCkXG+U3xdfJKJFtn0EfGBeoFmdxrgYceqFKoS/sK107w1vLODuFqUswWc2GRYO878c9i1AgU1vc3PjDvbfgh8ufq4cjleARoqqJLoqVGKJ4vKZzen32DOvJXQ8aQaYESMdv7hUbsiyE= ceph-9c59102a-1c48-11f1-b618-035af535377d' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T06:17:16.448 INFO:teuthology.orchestra.run.vm06.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDB23dv+HsXxEfOHeT6j+84qNSP4Y7ONSep6uvtL71jND6dXQ8eR/GSMTGbbmy8ec0MPvVSTBrriUxdPW3Kqh4kl30X+Y2GuZdTR5dZGB5/gO5eVOg6al1oJpRs/cPIPfSxNlcFUXrRW9VHfuGY5fONK9hqFzPSK1Q/9jan/WHO7mag42yyp5ZH4kE8FHxHTjKTlgeCajuBfOUD+gXWxmoQBbICcKi+IO3+EzQTRh7oBqTD6FHZYWA/tq1uN1M2ZurBcUQOhnRVcKA17m49fIyYdh1JtGdydZ7sO59iSzw736+LDth0UxfwItidMKFhCtSGTJuCp+L0dk1Lzq+B6Bq2h1Id83lqkzCxMgn0H80e0+d8Wd61tCqaCkXG+U3xdfJKJFtn0EfGBeoFmdxrgYceqFKoS/sK107w1vLODuFqUswWc2GRYO878c9i1AgU1vc3PjDvbfgh8ufq4cjleARoqqJLoqVGKJ4vKZzen32DOvJXQ8aQaYESMdv7hUbsiyE= ceph-9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:17:16.458 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-10T06:17:16.611 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:17:16.640 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:16 vm04 ceph-mon[51058]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": 
["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:16.640 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:16 vm04 ceph-mon[51058]: from='client.14178 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:16.640 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:16 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/201033957' entity='client.admin' 2026-03-10T06:17:16.640 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:16 vm04 ceph-mon[51058]: mgrmap e12: vm04.exdvdb(active, since 2s) 2026-03-10T06:17:16.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.947+0000 7f1207e9e700 1 -- 192.168.123.104:0/3950978377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 msgr2=0x7f12000719e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:16.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.947+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3950978377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12000719e0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f11fc009b00 tx=0x7f11fc009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:16.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.947+0000 7f1207e9e700 1 -- 192.168.123.104:0/3950978377 shutdown_connections 2026-03-10T06:17:16.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.947+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3950978377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12000719e0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:16.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.947+0000 7f1207e9e700 1 -- 192.168.123.104:0/3950978377 >> 192.168.123.104:0/3950978377 conn(0x7f120006cf00 
msgr2=0x7f120006f350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 -- 192.168.123.104:0/3950978377 shutdown_connections 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 -- 192.168.123.104:0/3950978377 wait complete. 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 Processor -- start 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 -- start start 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1207e9e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f11fc012070 con 0x7f12000715f0 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1205c3a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:16.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1205c3a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35676/0 (socket says 192.168.123.104:35676) 2026-03-10T06:17:16.951 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.948+0000 7f1205c3a700 1 -- 192.168.123.104:0/3428026385 learned_addr learned my addr 192.168.123.104:0/3428026385 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f1205c3a700 1 -- 192.168.123.104:0/3428026385 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f11fc0097e0 con 0x7f12000715f0 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f1205c3a700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f11fc00c010 tx=0x7f11fc00bc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f11fc01d070 con 0x7f12000715f0 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f11fc00f460 con 0x7f12000715f0 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f11fc0175c0 con 0x7f12000715f0 2026-03-10T06:17:16.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f12001a47c0 con 0x7f12000715f0 2026-03-10T06:17:16.951 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.949+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f12001a4c60 con 0x7f12000715f0 2026-03-10T06:17:16.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.950+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f120019e040 con 0x7f12000715f0 2026-03-10T06:17:16.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.953+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f11fc00f5d0 con 0x7f12000715f0 2026-03-10T06:17:16.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.954+0000 7f11f6ffd700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 0x7f11ec03a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:16.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.954+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f11fc04c000 con 0x7f12000715f0 2026-03-10T06:17:16.956 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.954+0000 7f1205439700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 0x7f11ec03a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:16.956 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.954+0000 7f1205439700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 
0x7f11ec03a960 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f11f0006fd0 tx=0x7f11f0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:16.956 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:16.955+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f11fc029390 con 0x7f12000715f0 2026-03-10T06:17:17.124 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.122+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f120004f9e0 con 0x7f12000715f0 2026-03-10T06:17:17.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.127+0000 7f11f6ffd700 1 -- 192.168.123.104:0/3428026385 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f11fc026020 con 0x7f12000715f0 2026-03-10T06:17:17.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.138+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 msgr2=0x7f11ec03a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:17.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.138+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 0x7f11ec03a960 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f11f0006fd0 tx=0x7f11f0006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:17.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.138+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f12000715f0 msgr2=0x7f12001a4280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:17.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.138+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f11fc00c010 tx=0x7f11fc00bc10 comp rx=0 tx=0).stop 2026-03-10T06:17:17.141 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.140+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 shutdown_connections 2026-03-10T06:17:17.141 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.140+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f11ec0384b0 0x7f11ec03a960 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:17.142 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.140+0000 7f1207e9e700 1 --2- 192.168.123.104:0/3428026385 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f12000715f0 0x7f12001a4280 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:17.142 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.140+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 >> 192.168.123.104:0/3428026385 conn(0x7f120006cf00 msgr2=0x7f120006dbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:17.143 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.142+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 shutdown_connections 2026-03-10T06:17:17.144 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.143+0000 7f1207e9e700 1 -- 192.168.123.104:0/3428026385 wait complete. 
2026-03-10T06:17:17.285 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-10T06:17:17.285 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-10T06:17:17.449 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:17:17.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.727+0000 7fc1720ed700 1 -- 192.168.123.104:0/2341217602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 msgr2=0x7fc16c101a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:17.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.727+0000 7fc1720ed700 1 --2- 192.168.123.104:0/2341217602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c101a10 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fc168009b00 tx=0x7fc168009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:17.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.728+0000 7fc1720ed700 1 -- 192.168.123.104:0/2341217602 shutdown_connections 2026-03-10T06:17:17.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.728+0000 7fc1720ed700 1 --2- 192.168.123.104:0/2341217602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c101a10 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:17.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.728+0000 7fc1720ed700 1 -- 192.168.123.104:0/2341217602 >> 192.168.123.104:0/2341217602 conn(0x7fc16c0f9650 msgr2=0x7fc16c0fba80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:17.731 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.729+0000 7fc1720ed700 1 -- 192.168.123.104:0/2341217602 shutdown_connections 2026-03-10T06:17:17.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.729+0000 7fc1720ed700 1 -- 192.168.123.104:0/2341217602 wait complete. 2026-03-10T06:17:17.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.729+0000 7fc1720ed700 1 Processor -- start 2026-03-10T06:17:17.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1720ed700 1 -- start start 2026-03-10T06:17:17.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1720ed700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1710eb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1710eb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35684/0 (socket says 192.168.123.104:35684) 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1710eb700 1 -- 192.168.123.104:0/378460761 learned_addr learned my addr 192.168.123.104:0/378460761 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.730+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc16c104c20 con 0x7fc16c0ff630 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.731+0000 7fc1710eb700 1 -- 192.168.123.104:0/378460761 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1680097e0 con 0x7fc16c0ff630 2026-03-10T06:17:17.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.731+0000 7fc1710eb700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fc168000c00 tx=0x7fc168005310 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:17.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.731+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc16801c070 con 0x7fc16c0ff630 2026-03-10T06:17:17.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.731+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc16c102d30 con 0x7fc16c0ff630 2026-03-10T06:17:17.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.732+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc16c103170 con 0x7fc16c0ff630 2026-03-10T06:17:17.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.732+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc168004530 con 0x7fc16c0ff630 2026-03-10T06:17:17.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.734+0000 7fc1627fc700 1 -- 
192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc168003bc0 con 0x7fc16c0ff630 2026-03-10T06:17:17.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.734+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc150005320 con 0x7fc16c0ff630 2026-03-10T06:17:17.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.737+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fc16800f460 con 0x7fc16c0ff630 2026-03-10T06:17:17.740 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.738+0000 7fc1627fc700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 0x7fc15803a6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:17.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.739+0000 7fc1708ea700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 0x7fc15803a6b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:17.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.740+0000 7fc1708ea700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 0x7fc15803a6b0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fc15c006fd0 tx=0x7fc15c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:17.742 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.740+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 5 ==== 
osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc16804d1f0 con 0x7fc16c0ff630 2026-03-10T06:17:17.742 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.740+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc168005470 con 0x7fc16c0ff630 2026-03-10T06:17:17.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.880+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7fc150000bf0 con 0x7fc158038200 2026-03-10T06:17:17.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.885+0000 7fc1627fc700 1 -- 192.168.123.104:0/378460761 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fc150000bf0 con 0x7fc158038200 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 msgr2=0x7fc15803a6b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 0x7fc15803a6b0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fc15c006fd0 tx=0x7fc15c006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 msgr2=0x7fc16c1046e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fc168000c00 tx=0x7fc168005310 comp rx=0 tx=0).stop 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 shutdown_connections 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc158038200 0x7fc15803a6b0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 --2- 192.168.123.104:0/378460761 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc16c0ff630 0x7fc16c1046e0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 >> 192.168.123.104:0/378460761 conn(0x7fc16c0f9650 msgr2=0x7fc16c0fa300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 shutdown_connections 2026-03-10T06:17:17.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:17.888+0000 7fc1720ed700 1 -- 192.168.123.104:0/378460761 wait complete. 
2026-03-10T06:17:17.948 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm06 2026-03-10T06:17:17.948 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:17:17.949 DEBUG:teuthology.orchestra.run.vm06:> dd of=/etc/ceph/ceph.conf 2026-03-10T06:17:17.968 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:17:17.968 DEBUG:teuthology.orchestra.run.vm06:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:17:18.026 INFO:tasks.cephadm:Adding host vm06 to orchestrator... 2026-03-10T06:17:18.026 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch host add vm06 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 
192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: Deploying daemon ceph-exporter.vm04 on vm04 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3428026385' entity='client.admin' 2026-03-10T06:17:18.049 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:17 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:18.509 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:17:19.045 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:18 vm04 ceph-mon[51058]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.061+0000 7f27e4a71700 1 -- 192.168.123.104:0/1721390285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 msgr2=0x7f27e0102cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.061+0000 7f27e4a71700 1 --2- 192.168.123.104:0/1721390285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0102cd0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f27d0009b00 tx=0x7f27d0009e10 
comp rx=0 tx=0).stop 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.062+0000 7f27e4a71700 1 -- 192.168.123.104:0/1721390285 shutdown_connections 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.062+0000 7f27e4a71700 1 --2- 192.168.123.104:0/1721390285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0102cd0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.062+0000 7f27e4a71700 1 -- 192.168.123.104:0/1721390285 >> 192.168.123.104:0/1721390285 conn(0x7f27e00fa310 msgr2=0x7f27e00fc740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 -- 192.168.123.104:0/1721390285 shutdown_connections 2026-03-10T06:17:19.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 -- 192.168.123.104:0/1721390285 wait complete. 
2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 Processor -- start 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 -- start start 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27e4a71700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27e0100790 con 0x7f27e01008f0 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.063+0000 7f27df7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27df7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35710/0 (socket says 192.168.123.104:35710) 2026-03-10T06:17:19.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27df7fe700 1 -- 192.168.123.104:0/1356324515 learned_addr learned my addr 192.168.123.104:0/1356324515 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:19.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27df7fe700 1 -- 192.168.123.104:0/1356324515 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27d00097e0 con 0x7f27e01008f0 2026-03-10T06:17:19.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27df7fe700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f27d0004f40 tx=0x7f27d0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:19.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f27d001c070 con 0x7f27e01008f0 2026-03-10T06:17:19.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27e4a71700 1 -- 192.168.123.104:0/1356324515 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27e00fe8a0 con 0x7f27e01008f0 2026-03-10T06:17:19.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.064+0000 7f27e4a71700 1 -- 192.168.123.104:0/1356324515 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27e00fed40 con 0x7f27e01008f0 2026-03-10T06:17:19.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.065+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f27d00053b0 con 0x7f27e01008f0 2026-03-10T06:17:19.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.065+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f27d000f460 con 0x7f27e01008f0 2026-03-10T06:17:19.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.066+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f27c00052f0 con 0x7f27e01008f0 2026-03-10T06:17:19.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.066+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f27d0021470 con 0x7f27e01008f0 2026-03-10T06:17:19.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.067+0000 7f27dcff9700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 0x7f27c803a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:19.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.067+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f27d004cd90 con 0x7f27e01008f0 2026-03-10T06:17:19.070 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.068+0000 7f27deffd700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 0x7f27c803a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:19.070 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.068+0000 7f27deffd700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 0x7f27c803a950 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f27d4006fd0 tx=0x7f27d4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:19.072 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.070+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f27d002aa20 con 0x7f27e01008f0 2026-03-10T06:17:19.211 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:19.208+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm06", "target": ["mon-mgr", ""]}) v1 -- 0x7f27c0000bc0 con 0x7f27c80384a0 2026-03-10T06:17:20.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:20.090+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f27d002bd00 con 0x7f27e01008f0 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': 
finished 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: Deploying daemon crash.vm04 on vm04 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm06", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:20.373 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:20 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stdout:Added host 'vm06' with addr '192.168.123.106' 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.107+0000 7f27dcff9700 1 -- 192.168.123.104:0/1356324515 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f27c0000bc0 con 0x7f27c80384a0 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 msgr2=0x7f27c803a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 0x7f27c803a950 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f27d4006fd0 tx=0x7f27d4006e40 comp rx=0 tx=0).stop 
2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 msgr2=0x7f27e0100250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f27d0004f40 tx=0x7f27d0005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 shutdown_connections 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f27c80384a0 0x7f27c803a950 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 --2- 192.168.123.104:0/1356324515 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27e01008f0 0x7f27e0100250 secure :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f27d0004f40 tx=0x7f27d0005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.110+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 >> 192.168.123.104:0/1356324515 conn(0x7f27e00fa310 msgr2=0x7f27e00fafc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.111+0000 7f27c67fc700 1 -- 192.168.123.104:0/1356324515 shutdown_connections 2026-03-10T06:17:21.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.111+0000 7f27c67fc700 1 -- 
192.168.123.104:0/1356324515 wait complete. 2026-03-10T06:17:21.179 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch host ls --format=json 2026-03-10T06:17:21.366 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:17:21.390 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:21 vm04 ceph-mon[51058]: Deploying cephadm binary to vm06 2026-03-10T06:17:21.390 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:21 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:21.390 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:21 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:21.390 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:21 vm04 ceph-mon[51058]: Deploying daemon node-exporter.vm04 on vm04 2026-03-10T06:17:21.390 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:21 vm04 ceph-mon[51058]: mgrmap e13: vm04.exdvdb(active, since 6s) 2026-03-10T06:17:21.635 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.633+0000 7f6e3f687700 1 -- 192.168.123.104:0/1564718873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 msgr2=0x7f6e38100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:21.635 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.633+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1564718873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38100420 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f6e28009b00 tx=0x7f6e28009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:21.636 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.634+0000 7f6e3f687700 1 -- 
192.168.123.104:0/1564718873 shutdown_connections 2026-03-10T06:17:21.636 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.634+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1564718873 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38100420 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:21.636 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.634+0000 7f6e3f687700 1 -- 192.168.123.104:0/1564718873 >> 192.168.123.104:0/1564718873 conn(0x7f6e380fb5a0 msgr2=0x7f6e380fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:21.636 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.634+0000 7f6e3f687700 1 -- 192.168.123.104:0/1564718873 shutdown_connections 2026-03-10T06:17:21.636 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.634+0000 7f6e3f687700 1 -- 192.168.123.104:0/1564718873 wait complete. 2026-03-10T06:17:21.637 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3f687700 1 Processor -- start 2026-03-10T06:17:21.637 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3f687700 1 -- start start 2026-03-10T06:17:21.637 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3f687700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:21.638 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3f687700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e38075030 con 0x7f6e38100010 2026-03-10T06:17:21.638 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3d423700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3d423700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35744/0 (socket says 192.168.123.104:35744) 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.635+0000 7f6e3d423700 1 -- 192.168.123.104:0/1287072349 learned_addr learned my addr 192.168.123.104:0/1287072349 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.636+0000 7f6e3d423700 1 -- 192.168.123.104:0/1287072349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e280097e0 con 0x7f6e38100010 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.636+0000 7f6e3d423700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6e28005280 tx=0x7f6e28005360 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.636+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6e2801c070 con 0x7f6e38100010 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.636+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6e28005dc0 con 0x7f6e38100010 2026-03-10T06:17:21.639 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.636+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6e2800f460 con 0x7f6e38100010 2026-03-10T06:17:21.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.637+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e38073140 con 0x7f6e38100010 2026-03-10T06:17:21.640 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.637+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e38073560 con 0x7f6e38100010 2026-03-10T06:17:21.640 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.639+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f6e280219e0 con 0x7f6e38100010 2026-03-10T06:17:21.640 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.639+0000 7f6e2e7fc700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 0x7f6e2403a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:21.641 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.640+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f6e2802b830 con 0x7f6e38100010 2026-03-10T06:17:21.642 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.640+0000 7f6e3cc22700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 0x7f6e2403a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T06:17:21.642 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.640+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e3804f9e0 con 0x7f6e38100010 2026-03-10T06:17:21.645 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.643+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6e28026030 con 0x7f6e38100010 2026-03-10T06:17:21.645 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.643+0000 7f6e3cc22700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 0x7f6e2403a970 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f6e34006fd0 tx=0x7f6e34006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:21.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.753+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f6e3806dd70 con 0x7f6e240384c0 2026-03-10T06:17:21.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.754+0000 7f6e2e7fc700 1 -- 192.168.123.104:0/1287072349 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f6e3806dd70 con 0x7f6e240384c0 2026-03-10T06:17:21.756 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:17:21.756 INFO:teuthology.orchestra.run.vm04.stdout:[{"addr": "192.168.123.104", "hostname": "vm04", "labels": [], "status": ""}, {"addr": "192.168.123.106", "hostname": "vm06", "labels": [], "status": ""}] 2026-03-10T06:17:21.758 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 msgr2=0x7f6e2403a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:21.758 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 0x7f6e2403a970 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f6e34006fd0 tx=0x7f6e34006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 msgr2=0x7f6e38074af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6e28005280 tx=0x7f6e28005360 comp rx=0 tx=0).stop 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 shutdown_connections 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e240384c0 0x7f6e2403a970 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 --2- 192.168.123.104:0/1287072349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e38100010 0x7f6e38074af0 unknown :-1 s=CLOSED 
pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 >> 192.168.123.104:0/1287072349 conn(0x7f6e380fb5a0 msgr2=0x7f6e380fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 shutdown_connections 2026-03-10T06:17:21.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:21.757+0000 7f6e3f687700 1 -- 192.168.123.104:0/1287072349 wait complete. 2026-03-10T06:17:21.806 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-10T06:17:21.806 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd crush tunables default 2026-03-10T06:17:21.954 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:17:22.222 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:22 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:22.222 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:22 vm04 ceph-mon[51058]: Added host vm06 2026-03-10T06:17:22.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.322+0000 7f82a6224700 1 -- 192.168.123.104:0/4040195977 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a0071530 msgr2=0x7f82a0071940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:22.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.322+0000 7f82a6224700 1 --2- 192.168.123.104:0/4040195977 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a0071530 0x7f82a0071940 secure :-1 s=READY pgs=101 cs=0 l=1 
rev1=1 crypto rx=0x7f8290007780 tx=0x7f829000c050 comp rx=0 tx=0).stop 2026-03-10T06:17:22.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.322+0000 7f82a6224700 1 -- 192.168.123.104:0/4040195977 shutdown_connections 2026-03-10T06:17:22.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.322+0000 7f82a6224700 1 --2- 192.168.123.104:0/4040195977 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a0071530 0x7f82a0071940 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:22.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.322+0000 7f82a6224700 1 -- 192.168.123.104:0/4040195977 >> 192.168.123.104:0/4040195977 conn(0x7f82a006cd50 msgr2=0x7f82a006f1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:22.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.323+0000 7f82a6224700 1 -- 192.168.123.104:0/4040195977 shutdown_connections 2026-03-10T06:17:22.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.323+0000 7f82a6224700 1 -- 192.168.123.104:0/4040195977 wait complete. 
2026-03-10T06:17:22.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f82a6224700 1 Processor -- start 2026-03-10T06:17:22.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f82a6224700 1 -- start start 2026-03-10T06:17:22.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f82a6224700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:22.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f82a6224700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8290003680 con 0x7f82a00793d0 2026-03-10T06:17:22.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f829f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:22.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f829f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35754/0 (socket says 192.168.123.104:35754) 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f829f7fe700 1 -- 192.168.123.104:0/3480004790 learned_addr learned my addr 192.168.123.104:0/3480004790 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.324+0000 7f829f7fe700 1 -- 192.168.123.104:0/3480004790 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8290007430 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f829f7fe700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f829000a010 tx=0x7f829000c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f829000f050 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f829000cb60 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f82900083c0 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82a0079d20 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.325+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82a007c9b0 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.326+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82a012d510 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.326+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f829001a040 con 0x7f82a00793d0 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.327+0000 7f829cff9700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 0x7f828803a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:22.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.327+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f829002a030 con 0x7f82a00793d0 2026-03-10T06:17:22.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.330+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f829007d720 con 0x7f82a00793d0 2026-03-10T06:17:22.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.330+0000 7f829effd700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 0x7f828803a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:22.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.334+0000 7f829effd700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 0x7f828803a9a0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto 
rx=0x7f829800ad30 tx=0x7f82980093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:22.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:22.453+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f82a0062380 con 0x7f82a00793d0 2026-03-10T06:17:23.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.128+0000 7f829cff9700 1 -- 192.168.123.104:0/3480004790 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f8290020070 con 0x7f82a00793d0 2026-03-10T06:17:23.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.131+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 msgr2=0x7f828803a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:23.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.131+0000 7f82a6224700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 0x7f828803a9a0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f829800ad30 tx=0x7f82980093f0 comp rx=0 tx=0).stop 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.131+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 msgr2=0x7f82a00797e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.131+0000 7f82a6224700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 secure :-1 s=READY 
pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f829000a010 tx=0x7f829000c7b0 comp rx=0 tx=0).stop 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 shutdown_connections 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f82880384f0 0x7f828803a9a0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 --2- 192.168.123.104:0/3480004790 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82a00793d0 0x7f82a00797e0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 >> 192.168.123.104:0/3480004790 conn(0x7f82a006cd50 msgr2=0x7f82a006e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 shutdown_connections 2026-03-10T06:17:23.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:17:23.132+0000 7f82a6224700 1 -- 192.168.123.104:0/3480004790 wait complete. 
2026-03-10T06:17:23.137 INFO:teuthology.orchestra.run.vm04.stderr:adjusted tunables profile to default 2026-03-10T06:17:23.210 INFO:tasks.cephadm:Adding mon.vm04 on vm04 2026-03-10T06:17:23.210 INFO:tasks.cephadm:Adding mon.vm06 on vm06 2026-03-10T06:17:23.210 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch apply mon '2;vm04:192.168.123.104=vm04;vm06:192.168.123.106=vm06' 2026-03-10T06:17:23.358 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:23.395 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/3480004790' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:23.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:23 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:24.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:24 vm04 ceph-mon[51058]: Deploying daemon alertmanager.vm04 on vm04 2026-03-10T06:17:24.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:24 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/3480004790' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-10T06:17:24.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:24 vm04 ceph-mon[51058]: osdmap e4: 0 total, 0 up, 0 in 2026-03-10T06:17:24.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:24 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.491+0000 7f7d46c79700 1 -- 192.168.123.106:0/4237639411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 msgr2=0x7f7d40102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.491+0000 7f7d46c79700 1 --2- 192.168.123.106:0/4237639411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40102650 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f7d30009b00 tx=0x7f7d30009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.493+0000 7f7d46c79700 1 -- 192.168.123.106:0/4237639411 shutdown_connections 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.493+0000 7f7d46c79700 1 --2- 192.168.123.106:0/4237639411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40102650 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.493+0000 7f7d46c79700 1 -- 192.168.123.106:0/4237639411 >> 192.168.123.106:0/4237639411 conn(0x7f7d400fd8d0 msgr2=0x7f7d400ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:24.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.493+0000 7f7d46c79700 1 -- 192.168.123.106:0/4237639411 shutdown_connections 2026-03-10T06:17:24.495 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.493+0000 7f7d46c79700 1 -- 192.168.123.106:0/4237639411 wait complete. 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d46c79700 1 Processor -- start 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d46c79700 1 -- start start 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d46c79700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d46c79700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d40079040 con 0x7f7d40102240 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d44a15700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d44a15700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:49526/0 (socket says 192.168.123.106:49526) 2026-03-10T06:17:24.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d44a15700 1 -- 192.168.123.106:0/393778266 learned_addr learned my addr 192.168.123.106:0/393778266 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:24.496 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 
7f7d44a15700 1 -- 192.168.123.106:0/393778266 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d300097e0 con 0x7f7d40102240 2026-03-10T06:17:24.496 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.494+0000 7f7d44a15700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f7d30004d40 tx=0x7f7d30004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:24.496 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.495+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7d3001c070 con 0x7f7d40102240 2026-03-10T06:17:24.496 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.495+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d40079240 con 0x7f7d40102240 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.495+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d400757b0 con 0x7f7d40102240 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.495+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7d300054e0 con 0x7f7d40102240 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.495+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7d30003ab0 con 0x7f7d40102240 2026-03-10T06:17:24.497 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.496+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f7d30003c10 con 0x7f7d40102240 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.496+0000 7f7d3dffb700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 0x7f7d2803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.496+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f7d3004c270 con 0x7f7d40102240 2026-03-10T06:17:24.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.496+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7d2c005320 con 0x7f7d40102240 2026-03-10T06:17:24.498 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.496+0000 7f7d3ffff700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 0x7f7d2803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:24.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.499+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7d3000f4d0 con 0x7f7d40102240 2026-03-10T06:17:24.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.500+0000 7f7d3ffff700 1 --2- 192.168.123.106:0/393778266 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 0x7f7d2803a9c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f7d34006fd0 tx=0x7f7d34006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:24.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.606+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm04:192.168.123.104=vm04;vm06:192.168.123.106=vm06", "target": ["mon-mgr", ""]}) v1 -- 0x7f7d2c000c90 con 0x7f7d28038510 2026-03-10T06:17:24.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.612+0000 7f7d3dffb700 1 -- 192.168.123.106:0/393778266 <== mgr.14164 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f7d2c000c90 con 0x7f7d28038510 2026-03-10T06:17:24.614 INFO:teuthology.orchestra.run.vm06.stdout:Scheduled mon update... 
2026-03-10T06:17:24.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 msgr2=0x7f7d2803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:24.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 0x7f7d2803a9c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f7d34006fd0 tx=0x7f7d34006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:24.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 msgr2=0x7f7d40078b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:24.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f7d30004d40 tx=0x7f7d30004e20 comp rx=0 tx=0).stop 2026-03-10T06:17:24.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 shutdown_connections 2026-03-10T06:17:24.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7d28038510 0x7f7d2803a9c0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:24.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 --2- 192.168.123.106:0/393778266 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d40102240 0x7f7d40078b00 unknown 
:-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:24.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 >> 192.168.123.106:0/393778266 conn(0x7f7d400fd8d0 msgr2=0x7f7d400fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:24.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 shutdown_connections 2026-03-10T06:17:24.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:24.615+0000 7f7d46c79700 1 -- 192.168.123.106:0/393778266 wait complete. 2026-03-10T06:17:24.665 DEBUG:teuthology.orchestra.run.vm06:mon.vm06> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm06.service 2026-03-10T06:17:24.667 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:24.667 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:24.841 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:24.881 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.146+0000 7f6779970700 1 -- 192.168.123.106:0/3579507454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 msgr2=0x7f6774100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.146+0000 7f6779970700 1 --2- 192.168.123.106:0/3579507454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f6774100420 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f675c009b00 tx=0x7f675c009e10 comp rx=0 tx=0).stop 
2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 -- 192.168.123.106:0/3579507454 shutdown_connections 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 --2- 192.168.123.106:0/3579507454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f6774100420 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 -- 192.168.123.106:0/3579507454 >> 192.168.123.106:0/3579507454 conn(0x7f67740fb5a0 msgr2=0x7f67740fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 -- 192.168.123.106:0/3579507454 shutdown_connections 2026-03-10T06:17:25.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 -- 192.168.123.106:0/3579507454 wait complete. 
2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 Processor -- start 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.147+0000 7f6779970700 1 -- start start 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6779970700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6779970700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6774197900 con 0x7f6774100010 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6772ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6772ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:49550/0 (socket says 192.168.123.106:49550) 2026-03-10T06:17:25.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6772ffd700 1 -- 192.168.123.106:0/3367216640 learned_addr learned my addr 192.168.123.106:0/3367216640 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6772ffd700 1 -- 192.168.123.106:0/3367216640 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f675c0097e0 con 0x7f6774100010 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.148+0000 7f6772ffd700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f675c004750 tx=0x7f675c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.149+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f675c01c070 con 0x7f6774100010 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.149+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6774197b00 con 0x7f6774100010 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.149+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6774197fa0 con 0x7f6774100010 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.149+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f675c021470 con 0x7f6774100010 2026-03-10T06:17:25.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.149+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f675c00f460 con 0x7f6774100010 2026-03-10T06:17:25.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.150+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f675c00f6d0 con 0x7f6774100010 2026-03-10T06:17:25.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.150+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6754005320 con 0x7f6774100010 2026-03-10T06:17:25.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.150+0000 7f677896e700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 0x7f676003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:25.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.150+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f675c04d4a0 con 0x7f6774100010 2026-03-10T06:17:25.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.151+0000 7f67727fc700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 0x7f676003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:25.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.151+0000 7f67727fc700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 0x7f676003a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6764006fd0 tx=0x7f6764006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:25.154 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.153+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f675c029540 con 0x7f6774100010 2026-03-10T06:17:25.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.295+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6754005190 con 0x7f6774100010 2026-03-10T06:17:25.297 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.295+0000 7f677896e700 1 -- 192.168.123.106:0/3367216640 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f675c026030 con 0x7f6774100010 2026-03-10T06:17:25.299 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:25.299 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:25.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.299+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 msgr2=0x7f676003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:25.301 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 0x7f676003a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6764006fd0 tx=0x7f6764006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:25.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 msgr2=0x7f67741973c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:25.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f675c004750 tx=0x7f675c005dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:25.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 shutdown_connections 2026-03-10T06:17:25.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6760038510 0x7f676003a9c0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:25.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 --2- 192.168.123.106:0/3367216640 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6774100010 0x7f67741973c0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:25.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.300+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 >> 192.168.123.106:0/3367216640 conn(0x7f67740fb5a0 msgr2=0x7f67740fc270 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:25.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.301+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 shutdown_connections 2026-03-10T06:17:25.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:25.301+0000 7f6779970700 1 -- 192.168.123.106:0/3367216640 wait complete. 2026-03-10T06:17:25.303 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:25 vm04 ceph-mon[51058]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm04:192.168.123.104=vm04;vm06:192.168.123.106=vm06", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:17:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:25 vm04 ceph-mon[51058]: Saving service mon spec with placement vm04:192.168.123.104=vm04;vm06:192.168.123.106=vm06;count:2 2026-03-10T06:17:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:25 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:25 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3367216640' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:26.387 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:17:26.387 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:26.534 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:26.574 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:26.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.813+0000 7f49fed0d700 1 -- 192.168.123.106:0/1371307691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 msgr2=0x7f49f8102f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:26.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.813+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1371307691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8102f20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f49e8009b00 tx=0x7f49e8009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:26.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.814+0000 7f49fed0d700 1 -- 192.168.123.106:0/1371307691 shutdown_connections 2026-03-10T06:17:26.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.814+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1371307691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8102f20 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:26.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.814+0000 7f49fed0d700 1 -- 192.168.123.106:0/1371307691 >> 192.168.123.106:0/1371307691 conn(0x7f49f80fa4a0 msgr2=0x7f49f80fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:26.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.814+0000 7f49fed0d700 1 -- 192.168.123.106:0/1371307691 
shutdown_connections 2026-03-10T06:17:26.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.814+0000 7f49fed0d700 1 -- 192.168.123.106:0/1371307691 wait complete. 2026-03-10T06:17:26.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.815+0000 7f49fed0d700 1 Processor -- start 2026-03-10T06:17:26.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.815+0000 7f49fed0d700 1 -- start start 2026-03-10T06:17:26.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.815+0000 7f49fed0d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:26.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.815+0000 7f49fed0d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49f81978a0 con 0x7f49f8100b40 2026-03-10T06:17:26.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.816+0000 7f49fcaa9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:26.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.816+0000 7f49fcaa9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:49562/0 (socket says 192.168.123.106:49562) 2026-03-10T06:17:26.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.816+0000 7f49fcaa9700 1 -- 192.168.123.106:0/1822050153 learned_addr learned my addr 192.168.123.106:0/1822050153 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:26.817 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.816+0000 7f49fcaa9700 1 -- 192.168.123.106:0/1822050153 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49e80097e0 con 0x7f49f8100b40 2026-03-10T06:17:26.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.816+0000 7f49fcaa9700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f49e8004f40 tx=0x7f49e8005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:26.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.817+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49e801c070 con 0x7f49f8100b40 2026-03-10T06:17:26.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.817+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49f8197aa0 con 0x7f49f8100b40 2026-03-10T06:17:26.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.817+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49f8197f40 con 0x7f49f8100b40 2026-03-10T06:17:26.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f49e80053b0 con 0x7f49f8100b40 2026-03-10T06:17:26.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f49e800f460 con 
0x7f49f8100b40 2026-03-10T06:17:26.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f49e800f5c0 con 0x7f49f8100b40 2026-03-10T06:17:26.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f5ffb700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 0x7f49e003a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:26.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f49e800de00 con 0x7f49f8100b40 2026-03-10T06:17:26.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.818+0000 7f49f7fff700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 0x7f49e003a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:26.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.819+0000 7f49f7fff700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 0x7f49e003a970 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f49ec006fd0 tx=0x7f49ec006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:26.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.819+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49e4005320 con 0x7f49f8100b40 2026-03-10T06:17:26.824 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.823+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f49e8017780 con 0x7f49f8100b40 2026-03-10T06:17:26.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.968+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f49e4005190 con 0x7f49f8100b40 2026-03-10T06:17:26.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.969+0000 7f49f5ffb700 1 -- 192.168.123.106:0/1822050153 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f49e8026030 con 0x7f49f8100b40 2026-03-10T06:17:26.971 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:26.971 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:26.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.972+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 msgr2=0x7f49e003a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.972+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 0x7f49e003a970 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f49ec006fd0 tx=0x7f49ec006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.972+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 msgr2=0x7f49f8197360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.972+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f49e8004f40 tx=0x7f49e8005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 shutdown_connections 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f49e00384c0 0x7f49e003a970 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 --2- 192.168.123.106:0/1822050153 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49f8100b40 0x7f49f8197360 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:26.974 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 >> 192.168.123.106:0/1822050153 conn(0x7f49f80fa4a0 msgr2=0x7f49f80fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:26.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 shutdown_connections 2026-03-10T06:17:26.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:26.973+0000 7f49fed0d700 1 -- 192.168.123.106:0/1822050153 wait complete. 2026-03-10T06:17:26.976 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard 
set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: Deploying daemon grafana.vm04 on vm04 2026-03-10T06:17:27.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:27 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1822050153' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:28.043 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:28.043 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:28.189 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:28.226 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:28.475 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.473+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2610678648 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 msgr2=0x7f5fdc100430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:28.475 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.473+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2610678648 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc100430 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc009b00 tx=0x7f5fcc009e10 comp rx=0 
tx=0).stop 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.474+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2610678648 shutdown_connections 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.474+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2610678648 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc100430 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.474+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2610678648 >> 192.168.123.106:0/2610678648 conn(0x7f5fdc0fb5b0 msgr2=0x7f5fdc0fda00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.474+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2610678648 shutdown_connections 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.474+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2610678648 wait complete. 
2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.475+0000 7f5fe2f1b700 1 Processor -- start 2026-03-10T06:17:28.476 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.475+0000 7f5fe2f1b700 1 -- start start 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.475+0000 7f5fe2f1b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.475+0000 7f5fe2f1b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fdc075030 con 0x7f5fdc100020 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.476+0000 7f5fe0cb7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.476+0000 7f5fe0cb7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44418/0 (socket says 192.168.123.106:44418) 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.476+0000 7f5fe0cb7700 1 -- 192.168.123.106:0/2558382949 learned_addr learned my addr 192.168.123.106:0/2558382949 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:28.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.476+0000 7f5fe0cb7700 1 -- 192.168.123.106:0/2558382949 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fcc0097e0 con 0x7f5fdc100020 2026-03-10T06:17:28.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.476+0000 7f5fe0cb7700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc009fd0 tx=0x7f5fcc005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:28.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.477+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc01d070 con 0x7f5fdc100020 2026-03-10T06:17:28.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.477+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5fdc073140 con 0x7f5fdc100020 2026-03-10T06:17:28.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.477+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5fdc0735e0 con 0x7f5fdc100020 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.477+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5fcc022470 con 0x7f5fdc100020 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.477+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc00f460 con 0x7f5fdc100020 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.478+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f5fcc00f610 con 0x7f5fdc100020 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.478+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5fdc04fa50 con 0x7f5fdc100020 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.478+0000 7f5fd9ffb700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 0x7f5fc403aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:28.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.478+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f5fcc04d440 con 0x7f5fdc100020 2026-03-10T06:17:28.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.479+0000 7f5fdbfff700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 0x7f5fc403aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:28.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.479+0000 7f5fdbfff700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 0x7f5fc403aa00 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5fd0006fd0 tx=0x7f5fd0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:28.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.481+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5fcc02a950 con 0x7f5fdc100020 2026-03-10T06:17:28.628 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.627+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5fdc0623e0 con 0x7f5fdc100020 2026-03-10T06:17:28.629 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.628+0000 7f5fd9ffb700 1 -- 192.168.123.106:0/2558382949 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f5fcc027030 con 0x7f5fdc100020 2026-03-10T06:17:28.629 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:28.629 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:28.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 msgr2=0x7f5fc403aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:28.631 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 0x7f5fc403aa00 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5fd0006fd0 tx=0x7f5fd0006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:28.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 msgr2=0x7f5fdc074af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc009fd0 tx=0x7f5fcc005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 shutdown_connections 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.630+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fc4038550 0x7f5fc403aa00 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.631+0000 7f5fe2f1b700 1 --2- 192.168.123.106:0/2558382949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5fdc100020 0x7f5fdc074af0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.631+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 >> 192.168.123.106:0/2558382949 conn(0x7f5fdc0fb5b0 msgr2=0x7f5fdc0fc280 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.631+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 shutdown_connections 2026-03-10T06:17:28.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:28.631+0000 7f5fe2f1b700 1 -- 192.168.123.106:0/2558382949 wait complete. 2026-03-10T06:17:28.633 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:29 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:29.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:29 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/2558382949' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:29.698 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:29.698 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:29.846 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:29.883 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.140+0000 7f4885842700 1 -- 192.168.123.106:0/2076411767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 msgr2=0x7f4880102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.140+0000 7f4885842700 1 --2- 192.168.123.106:0/2076411767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f4880102640 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f4868009b00 tx=0x7f4868009e10 comp rx=0 
tx=0).stop 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 -- 192.168.123.106:0/2076411767 shutdown_connections 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 --2- 192.168.123.106:0/2076411767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f4880102640 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 -- 192.168.123.106:0/2076411767 >> 192.168.123.106:0/2076411767 conn(0x7f48800fd8d0 msgr2=0x7f48800ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 -- 192.168.123.106:0/2076411767 shutdown_connections 2026-03-10T06:17:30.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 -- 192.168.123.106:0/2076411767 wait complete. 
2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 Processor -- start 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.141+0000 7f4885842700 1 -- start start 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f4885842700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f4885842700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48801978f0 con 0x7f4880102230 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f487effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f487effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44444/0 (socket says 192.168.123.106:44444) 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f487effd700 1 -- 192.168.123.106:0/3511390787 learned_addr learned my addr 192.168.123.106:0/3511390787 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:30.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f487effd700 1 -- 192.168.123.106:0/3511390787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48680097e0 con 0x7f4880102230 2026-03-10T06:17:30.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.142+0000 7f487effd700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4868009fd0 tx=0x7f4868005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:30.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.143+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f486801d070 con 0x7f4880102230 2026-03-10T06:17:30.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.143+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4880197af0 con 0x7f4880102230 2026-03-10T06:17:30.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.143+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4880197f90 con 0x7f4880102230 2026-03-10T06:17:30.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.143+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4868022470 con 0x7f4880102230 2026-03-10T06:17:30.145 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.143+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f486800f460 con 0x7f4880102230 2026-03-10T06:17:30.145 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.144+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f4868022ae0 con 0x7f4880102230 2026-03-10T06:17:30.145 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.144+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4860005320 con 0x7f4880102230 2026-03-10T06:17:30.145 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.144+0000 7f4884840700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 0x7f486c03aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:30.145 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.144+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f486804c280 con 0x7f4880102230 2026-03-10T06:17:30.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.144+0000 7f487e7fc700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 0x7f486c03aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:30.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.145+0000 7f487e7fc700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 0x7f486c03aa00 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4870006fd0 tx=0x7f4870006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.147+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f486802a360 con 0x7f4880102230 2026-03-10T06:17:30.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.289+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4860005190 con 0x7f4880102230 2026-03-10T06:17:30.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.290+0000 7f4884840700 1 -- 192.168.123.106:0/3511390787 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4868027070 con 0x7f4880102230 2026-03-10T06:17:30.292 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:30.292 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 msgr2=0x7f486c03aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:30.294 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 0x7f486c03aa00 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4870006fd0 tx=0x7f4870006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 msgr2=0x7f48801973b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4868009fd0 tx=0x7f4868005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 shutdown_connections 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f486c038550 0x7f486c03aa00 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 --2- 192.168.123.106:0/3511390787 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4880102230 0x7f48801973b0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 >> 192.168.123.106:0/3511390787 conn(0x7f48800fd8d0 msgr2=0x7f48800fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 shutdown_connections 2026-03-10T06:17:30.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:30.293+0000 7f4885842700 1 -- 192.168.123.106:0/3511390787 wait complete. 2026-03-10T06:17:30.295 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:31.351 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:31.352 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:31.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:31 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3511390787' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:31.495 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:31.531 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:31.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.781+0000 7efdf6d0a700 1 -- 192.168.123.106:0/435998489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 msgr2=0x7efdf0102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:31.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.781+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/435998489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0102640 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7efde0009b00 tx=0x7efde0009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:31.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.782+0000 7efdf6d0a700 1 -- 192.168.123.106:0/435998489 shutdown_connections 
2026-03-10T06:17:31.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.782+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/435998489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0102640 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:31.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.782+0000 7efdf6d0a700 1 -- 192.168.123.106:0/435998489 >> 192.168.123.106:0/435998489 conn(0x7efdf00fd8d0 msgr2=0x7efdf00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.782+0000 7efdf6d0a700 1 -- 192.168.123.106:0/435998489 shutdown_connections 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.782+0000 7efdf6d0a700 1 -- 192.168.123.106:0/435998489 wait complete. 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf6d0a700 1 Processor -- start 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf6d0a700 1 -- start start 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf6d0a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:31.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf6d0a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdf01978c0 con 0x7efdf0102230 2026-03-10T06:17:31.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf4aa6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:17:31.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf4aa6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44460/0 (socket says 192.168.123.106:44460) 2026-03-10T06:17:31.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.783+0000 7efdf4aa6700 1 -- 192.168.123.106:0/2077997119 learned_addr learned my addr 192.168.123.106:0/2077997119 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:31.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.784+0000 7efdf4aa6700 1 -- 192.168.123.106:0/2077997119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efde00097e0 con 0x7efdf0102230 2026-03-10T06:17:31.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.784+0000 7efdf4aa6700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7efde0004750 tx=0x7efde0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:31.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.784+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efde001c070 con 0x7efdf0102230 2026-03-10T06:17:31.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.784+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efdf0197ac0 con 0x7efdf0102230 2026-03-10T06:17:31.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.785+0000 7efdf6d0a700 1 
-- 192.168.123.106:0/2077997119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efdf0197f60 con 0x7efdf0102230 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.785+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efde0021470 con 0x7efdf0102230 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.785+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efde000f460 con 0x7efdf0102230 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.786+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7efde000f690 con 0x7efdf0102230 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.786+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efdf0191080 con 0x7efdf0102230 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.786+0000 7efdedffb700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 0x7efdd803a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:31.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.786+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7efde004c310 con 0x7efdf0102230 2026-03-10T06:17:31.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.788+0000 7efdeffff700 1 --2- 192.168.123.106:0/2077997119 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 0x7efdd803a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:31.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.788+0000 7efdeffff700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 0x7efdd803a970 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7efde4006fd0 tx=0x7efde4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:31.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.789+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efde0029950 con 0x7efdf0102230 2026-03-10T06:17:31.941 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.939+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7efdf0062380 con 0x7efdf0102230 2026-03-10T06:17:31.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.942+0000 7efdedffb700 1 -- 192.168.123.106:0/2077997119 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7efde0026030 con 0x7efdf0102230 2026-03-10T06:17:31.944 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:31.944 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 msgr2=0x7efdd803a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 0x7efdd803a970 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7efde4006fd0 tx=0x7efde4006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 msgr2=0x7efdf0197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7efde0004750 tx=0x7efde0005dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 shutdown_connections 2026-03-10T06:17:31.946 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efdd80384c0 0x7efdd803a970 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 --2- 192.168.123.106:0/2077997119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdf0102230 0x7efdf0197380 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 >> 192.168.123.106:0/2077997119 conn(0x7efdf00fd8d0 msgr2=0x7efdf00fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 shutdown_connections 2026-03-10T06:17:31.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:31.945+0000 7efdf6d0a700 1 -- 192.168.123.106:0/2077997119 wait complete. 2026-03-10T06:17:31.947 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:32.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:32 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/2077997119' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:32.992 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:17:32.992 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:17:33.136 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:33.174 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.450+0000 7fa19365c700 1 -- 192.168.123.106:0/1253579298 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 msgr2=0x7fa18c106a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.450+0000 7fa183fff700 1 -- 192.168.123.106:0/1253579298 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa17c00bcf0 con 0x7fa18c106610
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.450+0000 7fa19365c700 1 --2- 192.168.123.106:0/1253579298 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c106a20 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fa17c009b00 tx=0x7fa17c009e10 comp rx=0 tx=0).stop
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- 192.168.123.106:0/1253579298 shutdown_connections
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 --2- 192.168.123.106:0/1253579298 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c106a20 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- 192.168.123.106:0/1253579298 >> 192.168.123.106:0/1253579298 conn(0x7fa18c075940 msgr2=0x7fa18c077d90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- 192.168.123.106:0/1253579298 shutdown_connections
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- 192.168.123.106:0/1253579298 wait complete.
2026-03-10T06:17:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 Processor -- start
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- start start
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.451+0000 7fa19365c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa18c19bd10 con 0x7fa18c106610
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1913f8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1913f8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44472/0 (socket says 192.168.123.106:44472)
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1913f8700 1 -- 192.168.123.106:0/3878914568 learned_addr learned my addr 192.168.123.106:0/3878914568 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1913f8700 1 -- 192.168.123.106:0/3878914568 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa17c0097e0 con 0x7fa18c106610
2026-03-10T06:17:33.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1913f8700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fa17c000c00 tx=0x7fa17c0047b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa17c01c070 con 0x7fa18c106610
2026-03-10T06:17:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.452+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa18c19bf10 con 0x7fa18c106610
2026-03-10T06:17:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.453+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa18c19c3b0 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.453+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa17c0056f0 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.453+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa17c00f460 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.453+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fa17c00f5c0 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.454+0000 7fa1827fc700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 0x7fa17803aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.454+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fa17c04d420 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.454+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa18c195460 con 0x7fa18c106610
2026-03-10T06:17:33.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.454+0000 7fa190bf7700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 0x7fa17803aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:33.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.457+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa17c026070 con 0x7fa18c106610
2026-03-10T06:17:33.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.457+0000 7fa190bf7700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 0x7fa17803aa00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa188006fd0 tx=0x7fa188006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:33.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.604+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa18c062380 con 0x7fa18c106610
2026-03-10T06:17:33.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.606+0000 7fa1827fc700 1 -- 192.168.123.106:0/3878914568 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa17c029720 con 0x7fa18c106610
2026-03-10T06:17:33.608 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:17:33.608 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T06:17:33.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.608+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 msgr2=0x7fa17803aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:33.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.608+0000 7fa19365c700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 0x7fa17803aa00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa188006fd0 tx=0x7fa188006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 msgr2=0x7fa18c19b7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fa17c000c00 tx=0x7fa17c0047b0 comp rx=0 tx=0).stop
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 shutdown_connections
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa178038550 0x7fa17803aa00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 --2- 192.168.123.106:0/3878914568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa18c106610 0x7fa18c19b7d0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 >> 192.168.123.106:0/3878914568 conn(0x7fa18c075940 msgr2=0x7fa18c0775a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 shutdown_connections
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:33.609+0000 7fa19365c700 1 -- 192.168.123.106:0/3878914568 wait complete.
2026-03-10T06:17:33.611 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-10T06:17:34.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:34 vm04 ceph-mon[51058]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:17:34.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:34 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3878914568' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T06:17:34.673 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T06:17:34.673 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:17:34.825 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:34.862 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.131+0000 7effa6bf5700 1 -- 192.168.123.106:0/1318348442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 msgr2=0x7effa0102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.131+0000 7effa6bf5700 1 --2- 192.168.123.106:0/1318348442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0102640 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7eff94009b00 tx=0x7eff94009e10 comp rx=0 tx=0).stop
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.131+0000 7effa6bf5700 1 -- 192.168.123.106:0/1318348442 shutdown_connections
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.131+0000 7effa6bf5700 1 --2- 192.168.123.106:0/1318348442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0102640 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.131+0000 7effa6bf5700 1 -- 192.168.123.106:0/1318348442 >> 192.168.123.106:0/1318348442 conn(0x7effa00fd8d0 msgr2=0x7effa00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.132+0000 7effa6bf5700 1 -- 192.168.123.106:0/1318348442 shutdown_connections
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.132+0000 7effa6bf5700 1 -- 192.168.123.106:0/1318348442 wait complete.
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.132+0000 7effa6bf5700 1 Processor -- start
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa6bf5700 1 -- start start
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa6bf5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:35.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa6bf5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effa01978c0 con 0x7effa0102230
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa4991700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa4991700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44496/0 (socket says 192.168.123.106:44496)
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa4991700 1 -- 192.168.123.106:0/3756159660 learned_addr learned my addr 192.168.123.106:0/3756159660 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.133+0000 7effa4991700 1 -- 192.168.123.106:0/3756159660 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7eff940097e0 con 0x7effa0102230
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.134+0000 7effa4991700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7eff94000c00 tx=0x7eff94004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.134+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7eff9401c070 con 0x7effa0102230
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.134+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7eff940053b0 con 0x7effa0102230
2026-03-10T06:17:35.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.134+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7effa0197ac0 con 0x7effa0102230
2026-03-10T06:17:35.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.135+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7eff9400f460 con 0x7effa0102230
2026-03-10T06:17:35.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.135+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7effa0197f60 con 0x7effa0102230
2026-03-10T06:17:35.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.135+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7eff94021470 con 0x7effa0102230
2026-03-10T06:17:35.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.136+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7effa0191080 con 0x7effa0102230
2026-03-10T06:17:35.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.136+0000 7eff9dffb700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 0x7eff8803a690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:35.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.136+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7eff940506b0 con 0x7effa0102230
2026-03-10T06:17:35.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.136+0000 7eff9ffff700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 0x7eff8803a690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:35.138 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.137+0000 7eff9ffff700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 0x7eff8803a690 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7eff90006fd0 tx=0x7eff90006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:35.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.140+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7eff94029b50 con 0x7effa0102230
2026-03-10T06:17:35.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.283+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7effa002cc30 con 0x7effa0102230
2026-03-10T06:17:35.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.285+0000 7eff9dffb700 1 -- 192.168.123.106:0/3756159660 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7eff94026020 con 0x7effa0102230
2026-03-10T06:17:35.287 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:17:35.287 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T06:17:35.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.287+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 msgr2=0x7eff8803a690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:35.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.287+0000 7effa6bf5700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 0x7eff8803a690 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7eff90006fd0 tx=0x7eff90006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:35.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.287+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 msgr2=0x7effa0197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:35.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.287+0000 7effa6bf5700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7eff94000c00 tx=0x7eff94004740 comp rx=0 tx=0).stop
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 shutdown_connections
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7eff880381e0 0x7eff8803a690 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 --2- 192.168.123.106:0/3756159660 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7effa0102230 0x7effa0197380 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 >> 192.168.123.106:0/3756159660 conn(0x7effa00fd8d0 msgr2=0x7effa00fe530 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 shutdown_connections
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:35.288+0000 7effa6bf5700 1 -- 192.168.123.106:0/3756159660 wait complete.
2026-03-10T06:17:35.290 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-10T06:17:36.331 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T06:17:36.331 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:17:36.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:36 vm04 ceph-mon[51058]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:17:36.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:36 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3756159660' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T06:17:36.482 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:36.519 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.765+0000 7f2e3588c700 1 -- 192.168.123.106:0/3343638618 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 msgr2=0x7f2e30102620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.765+0000 7f2e3588c700 1 --2- 192.168.123.106:0/3343638618 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e30102620 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f2e18009b00 tx=0x7f2e18009e10 comp rx=0 tx=0).stop
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 -- 192.168.123.106:0/3343638618 shutdown_connections
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 --2- 192.168.123.106:0/3343638618 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e30102620 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 -- 192.168.123.106:0/3343638618 >> 192.168.123.106:0/3343638618 conn(0x7f2e300fd8d0 msgr2=0x7f2e300ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 -- 192.168.123.106:0/3343638618 shutdown_connections
2026-03-10T06:17:36.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 -- 192.168.123.106:0/3343638618 wait complete.
2026-03-10T06:17:36.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.766+0000 7f2e3588c700 1 Processor -- start
2026-03-10T06:17:36.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e3588c700 1 -- start start
2026-03-10T06:17:36.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e3588c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:36.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e3588c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e30197910 con 0x7f2e30102210
2026-03-10T06:17:36.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e2effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e2effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:44512/0 (socket says 192.168.123.106:44512)
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.767+0000 7f2e2effd700 1 -- 192.168.123.106:0/360692378 learned_addr learned my addr 192.168.123.106:0/360692378 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.768+0000 7f2e2effd700 1 -- 192.168.123.106:0/360692378 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2e180097e0 con 0x7f2e30102210
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.768+0000 7f2e2effd700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2e18004750 tx=0x7f2e18005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.768+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2e1801c070 con 0x7f2e30102210
2026-03-10T06:17:36.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.768+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e30197b10 con 0x7f2e30102210
2026-03-10T06:17:36.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.768+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2e30197fb0 con 0x7f2e30102210
2026-03-10T06:17:36.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2e18021470 con 0x7f2e30102210
2026-03-10T06:17:36.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2e1800f460 con 0x7f2e30102210
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f2e1800f6d0 con 0x7f2e30102210
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2e10005320 con 0x7f2e30102210
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3488a700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 0x7f2e1c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.769+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f2e1804d4a0 con 0x7f2e30102210
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.770+0000 7f2e2e7fc700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 0x7f2e1c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:17:36.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.770+0000 7f2e2e7fc700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 0x7f2e1c03a9c0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2e20006fd0 tx=0x7f2e20006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:17:36.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.772+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2e18026070 con 0x7f2e30102210
2026-03-10T06:17:36.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.918+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2e10005190 con 0x7f2e30102210
2026-03-10T06:17:36.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.919+0000 7f2e3488a700 1 -- 192.168.123.106:0/360692378 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2e18029360 con 0x7f2e30102210
2026-03-10T06:17:36.921 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:17:36.921 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.921+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 msgr2=0x7f2e1c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.921+0000 7f2e3588c700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 0x7f2e1c03a9c0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2e20006fd0 tx=0x7f2e20006e40 comp rx=0 tx=0).stop
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.921+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 msgr2=0x7f2e301973d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.921+0000 7f2e3588c700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2e18004750 tx=0x7f2e18005dc0 comp rx=0 tx=0).stop
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 shutdown_connections
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2e1c038510 0x7f2e1c03a9c0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 --2- 192.168.123.106:0/360692378 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e30102210 0x7f2e301973d0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 >> 192.168.123.106:0/360692378 conn(0x7f2e300fd8d0 msgr2=0x7f2e300ff390 unknown :-1 s=STATE_NONE
l=0).mark_down 2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 shutdown_connections 2026-03-10T06:17:36.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:36.922+0000 7f2e3588c700 1 -- 192.168.123.106:0/360692378 wait complete. 2026-03-10T06:17:36.924 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:37.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:37 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/360692378' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:37.966 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:37.966 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:38.113 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:38.150 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.424+0000 7fc974cd3700 1 -- 192.168.123.106:0/1902711749 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 msgr2=0x7fc9700ff320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.424+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1902711749 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9700ff320 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fc958009b00 tx=0x7fc958009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 -- 192.168.123.106:0/1902711749 shutdown_connections 
2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1902711749 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9700ff320 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 -- 192.168.123.106:0/1902711749 >> 192.168.123.106:0/1902711749 conn(0x7fc9700fa4a0 msgr2=0x7fc9700fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 -- 192.168.123.106:0/1902711749 shutdown_connections 2026-03-10T06:17:38.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 -- 192.168.123.106:0/1902711749 wait complete. 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 Processor -- start 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.425+0000 7fc974cd3700 1 -- start start 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc974cd3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc974cd3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc970197900 con 0x7fc9700fef10 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc96e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc96e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55726/0 (socket says 192.168.123.106:55726) 2026-03-10T06:17:38.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:38 vm04 ceph-mon[51058]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc96e59c700 1 -- 192.168.123.106:0/1891083182 learned_addr learned my addr 192.168.123.106:0/1891083182 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:38.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc96e59c700 1 -- 192.168.123.106:0/1891083182 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9580097e0 con 0x7fc9700fef10 2026-03-10T06:17:38.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc96e59c700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fc958004f40 tx=0x7fc958005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:38.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.426+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc95801d070 con 0x7fc9700fef10 2026-03-10T06:17:38.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.427+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7fc970197b00 con 0x7fc9700fef10 2026-03-10T06:17:38.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.427+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc970197fa0 con 0x7fc9700fef10 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.427+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc958022470 con 0x7fc9700fef10 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.427+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc95800f460 con 0x7fc9700fef10 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.428+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc95800f650 con 0x7fc9700fef10 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.428+0000 7fc9677fe700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 0x7fc95c03a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.428+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc95804c170 con 0x7fc9700fef10 2026-03-10T06:17:38.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.428+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fc970190fe0 con 0x7fc9700fef10 2026-03-10T06:17:38.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.428+0000 7fc96dd9b700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 0x7fc95c03a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:38.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.429+0000 7fc96dd9b700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 0x7fc95c03a9f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fc960006fd0 tx=0x7fc960006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:38.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.433+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc958027070 con 0x7fc9700fef10 2026-03-10T06:17:38.574 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.572+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc97004f9e0 con 0x7fc9700fef10 2026-03-10T06:17:38.574 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.573+0000 7fc9677fe700 1 -- 192.168.123.106:0/1891083182 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc95800fe60 con 0x7fc9700fef10 2026-03-10T06:17:38.575 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:38.575 
INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:38.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.576+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 msgr2=0x7fc95c03a9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:38.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.576+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 0x7fc95c03a9f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fc960006fd0 tx=0x7fc960006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:38.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.576+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 msgr2=0x7fc9701973c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:38.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.576+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 secure :-1 s=READY pgs=122 cs=0 
l=1 rev1=1 crypto rx=0x7fc958004f40 tx=0x7fc958005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:38.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.576+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 shutdown_connections 2026-03-10T06:17:38.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.577+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc95c038540 0x7fc95c03a9f0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:38.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.577+0000 7fc974cd3700 1 --2- 192.168.123.106:0/1891083182 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9700fef10 0x7fc9701973c0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:38.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.577+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 >> 192.168.123.106:0/1891083182 conn(0x7fc9700fa4a0 msgr2=0x7fc9700fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:38.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.577+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 shutdown_connections 2026-03-10T06:17:38.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:38.577+0000 7fc974cd3700 1 -- 192.168.123.106:0/1891083182 wait complete. 2026-03-10T06:17:38.580 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:39.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1891083182' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:39.645 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:17:39.646 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:39.792 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:39.830 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.123+0000 7f9a949c8700 1 -- 192.168.123.106:0/3231088193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 msgr2=0x7f9a8c0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.123+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3231088193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c0731e0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f9a84009b00 tx=0x7f9a84009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 -- 192.168.123.106:0/3231088193 shutdown_connections 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3231088193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c0731e0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 -- 192.168.123.106:0/3231088193 >> 192.168.123.106:0/3231088193 conn(0x7f9a8c0fb5a0 msgr2=0x7f9a8c0fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 -- 192.168.123.106:0/3231088193 
shutdown_connections 2026-03-10T06:17:40.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 -- 192.168.123.106:0/3231088193 wait complete. 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.124+0000 7f9a949c8700 1 Processor -- start 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a949c8700 1 -- start start 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a949c8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a949c8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a84012070 con 0x7f9a8c074d80 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a92764700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a92764700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55736/0 (socket says 192.168.123.106:55736) 2026-03-10T06:17:40.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a92764700 1 -- 192.168.123.106:0/3915067665 learned_addr learned my addr 192.168.123.106:0/3915067665 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:40.127 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.125+0000 7f9a92764700 1 -- 192.168.123.106:0/3915067665 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a840097e0 con 0x7f9a8c074d80 2026-03-10T06:17:40.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.126+0000 7f9a92764700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f9a840055d0 tx=0x7f9a840056b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:40.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.126+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a8401d070 con 0x7f9a8c074d80 2026-03-10T06:17:40.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.126+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a8c19bc60 con 0x7f9a8c074d80 2026-03-10T06:17:40.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.126+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a8c19c080 con 0x7f9a8c074d80 2026-03-10T06:17:40.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9a84005dc0 con 0x7f9a8c074d80 2026-03-10T06:17:40.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a8400f460 con 
0x7f9a8c074d80 2026-03-10T06:17:40.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f9a8400f5c0 con 0x7f9a8c074d80 2026-03-10T06:17:40.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a7f7fe700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 0x7f9a7803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:40.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f9a84015350 con 0x7f9a8c074d80 2026-03-10T06:17:40.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a91f63700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 0x7f9a7803a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:40.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.127+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a70005320 con 0x7f9a8c074d80 2026-03-10T06:17:40.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.128+0000 7f9a91f63700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 0x7f9a7803a9b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9a80006fd0 tx=0x7f9a80006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:40.132 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.131+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9a84017630 con 0x7f9a8c074d80 2026-03-10T06:17:40.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.276+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9a70005190 con 0x7f9a8c074d80 2026-03-10T06:17:40.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.278+0000 7f9a7f7fe700 1 -- 192.168.123.106:0/3915067665 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9a84026030 con 0x7f9a8c074d80 2026-03-10T06:17:40.279 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:40.279 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.280+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 msgr2=0x7f9a7803a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.280+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 0x7f9a7803a9b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9a80006fd0 tx=0x7f9a80006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.280+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 msgr2=0x7f9a8c19b720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.280+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f9a840055d0 tx=0x7f9a840056b0 comp rx=0 tx=0).stop 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 shutdown_connections 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9a78038500 0x7f9a7803a9b0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 --2- 192.168.123.106:0/3915067665 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a8c074d80 0x7f9a8c19b720 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:40.282 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 >> 192.168.123.106:0/3915067665 conn(0x7f9a8c0fb5a0 msgr2=0x7f9a8c0fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 shutdown_connections 2026-03-10T06:17:40.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:40.281+0000 7f9a949c8700 1 -- 192.168.123.106:0/3915067665 wait complete. 2026-03-10T06:17:40.283 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:40 vm04 ceph-mon[51058]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:41.330 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:41.330 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:41 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/3915067665' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:41.474 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:41.517 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.782+0000 7f86334ae700 1 -- 192.168.123.106:0/1671356229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 msgr2=0x7f862c100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.782+0000 7f86334ae700 1 --2- 192.168.123.106:0/1671356229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c100420 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f861c009b00 tx=0x7f861c009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.783+0000 7f86334ae700 1 -- 192.168.123.106:0/1671356229 shutdown_connections 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.783+0000 7f86334ae700 1 --2- 192.168.123.106:0/1671356229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c100420 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.783+0000 7f86334ae700 1 -- 192.168.123.106:0/1671356229 >> 192.168.123.106:0/1671356229 conn(0x7f862c0fb5a0 msgr2=0x7f862c0fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.784+0000 7f86334ae700 1 -- 192.168.123.106:0/1671356229 shutdown_connections 2026-03-10T06:17:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.784+0000 7f86334ae700 1 -- 192.168.123.106:0/1671356229 
wait complete. 2026-03-10T06:17:41.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.784+0000 7f86334ae700 1 Processor -- start 2026-03-10T06:17:41.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f86334ae700 1 -- start start 2026-03-10T06:17:41.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f86334ae700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:41.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f86334ae700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f862c195700 con 0x7f862c100010 2026-03-10T06:17:41.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f863124a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:41.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f863124a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55758/0 (socket says 192.168.123.106:55758) 2026-03-10T06:17:41.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.785+0000 7f863124a700 1 -- 192.168.123.106:0/344918987 learned_addr learned my addr 192.168.123.106:0/344918987 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:41.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.786+0000 7f863124a700 1 -- 192.168.123.106:0/344918987 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f861c0097e0 con 0x7f862c100010 2026-03-10T06:17:41.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.786+0000 7f863124a700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f861c004750 tx=0x7f861c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:41.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.787+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f861c01c070 con 0x7f862c100010 2026-03-10T06:17:41.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.787+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f862c195900 con 0x7f862c100010 2026-03-10T06:17:41.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.787+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f862c195da0 con 0x7f862c100010 2026-03-10T06:17:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.787+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f861c021470 con 0x7f862c100010 2026-03-10T06:17:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.787+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f861c00f460 con 0x7f862c100010 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.788+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f861c00f6d0 con 0x7f862c100010 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.788+0000 7f86227fc700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 0x7f861803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.788+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f861c04d4a0 con 0x7f862c100010 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.788+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8610005320 con 0x7f862c100010 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.788+0000 7f8630a49700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 0x7f861803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.789+0000 7f8630a49700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 0x7f861803a9c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8628006fd0 tx=0x7f8628006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:41.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.791+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f861c026020 con 0x7f862c100010 2026-03-10T06:17:41.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.933+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8610005190 con 0x7f862c100010 2026-03-10T06:17:41.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.933+0000 7f86227fc700 1 -- 192.168.123.106:0/344918987 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f861c04b090 con 0x7f862c100010 2026-03-10T06:17:41.935 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:41.935 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:41.937 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 msgr2=0x7f861803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:41.937 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 0x7f861803a9c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8628006fd0 tx=0x7f8628006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:41.937 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 msgr2=0x7f862c1951c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:41.937 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f861c004750 tx=0x7f861c005dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:41.937 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 shutdown_connections 2026-03-10T06:17:41.938 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8618038510 0x7f861803a9c0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:41.938 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.936+0000 7f86334ae700 1 --2- 192.168.123.106:0/344918987 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f862c100010 0x7f862c1951c0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:41.938 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.937+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 >> 192.168.123.106:0/344918987 conn(0x7f862c0fb5a0 msgr2=0x7f862c0fc270 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:41.938 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.937+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 shutdown_connections 2026-03-10T06:17:41.938 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:41.937+0000 7f86334ae700 1 -- 192.168.123.106:0/344918987 wait complete. 2026-03-10T06:17:41.939 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:42.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:42 vm04 ceph-mon[51058]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:42 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/344918987' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:42.983 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:42.983 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:43.148 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:43.196 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.499+0000 7f9070842700 1 -- 192.168.123.106:0/2825151302 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 msgr2=0x7f9068100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.499+0000 7f9070842700 1 --2- 192.168.123.106:0/2825151302 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f9068100420 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f905c009b00 tx=0x7f905c009e10 comp rx=0 tx=0).stop 
2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 -- 192.168.123.106:0/2825151302 shutdown_connections 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 --2- 192.168.123.106:0/2825151302 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f9068100420 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 -- 192.168.123.106:0/2825151302 >> 192.168.123.106:0/2825151302 conn(0x7f90680fb5a0 msgr2=0x7f90680fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 -- 192.168.123.106:0/2825151302 shutdown_connections 2026-03-10T06:17:43.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 -- 192.168.123.106:0/2825151302 wait complete. 
2026-03-10T06:17:43.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.500+0000 7f9070842700 1 Processor -- start 2026-03-10T06:17:43.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f9070842700 1 -- start start 2026-03-10T06:17:43.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f9070842700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:43.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f9070842700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9068197900 con 0x7f9068100010 2026-03-10T06:17:43.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f906e5de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f906e5de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55776/0 (socket says 192.168.123.106:55776) 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f906e5de700 1 -- 192.168.123.106:0/518347578 learned_addr learned my addr 192.168.123.106:0/518347578 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.501+0000 7f906e5de700 1 -- 192.168.123.106:0/518347578 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f905c0097e0 con 0x7f9068100010 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f906e5de700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f905c004750 tx=0x7f905c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f905c01c070 con 0x7f9068100010 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9068197b00 con 0x7f9068100010 2026-03-10T06:17:43.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9068197fa0 con 0x7f9068100010 2026-03-10T06:17:43.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f905c021470 con 0x7f9068100010 2026-03-10T06:17:43.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.502+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f905c00f460 con 0x7f9068100010 2026-03-10T06:17:43.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.503+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f905c00f6d0 con 0x7f9068100010 2026-03-10T06:17:43.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.503+0000 7f905b7fe700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 0x7f905403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:43.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.503+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f905c04d4a0 con 0x7f9068100010 2026-03-10T06:17:43.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.504+0000 7f906dddd700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 0x7f905403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:43.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.504+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f904c005320 con 0x7f9068100010 2026-03-10T06:17:43.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.507+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f905c026070 con 0x7f9068100010 2026-03-10T06:17:43.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.508+0000 7f906dddd700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 0x7f905403a9c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9060006fd0 tx=0x7f9060006e40 comp rx=0 
tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:43.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.667+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f904c005190 con 0x7f9068100010 2026-03-10T06:17:43.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.668+0000 7f905b7fe700 1 -- 192.168.123.106:0/518347578 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f905c029360 con 0x7f9068100010 2026-03-10T06:17:43.670 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:43.670 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:43.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 msgr2=0x7f905403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:43.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 
7f9070842700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 0x7f905403a9c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9060006fd0 tx=0x7f9060006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:43.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 msgr2=0x7f90681973c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:43.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f905c004750 tx=0x7f905c005dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:43.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 shutdown_connections 2026-03-10T06:17:43.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9054038510 0x7f905403a9c0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:43.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 --2- 192.168.123.106:0/518347578 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9068100010 0x7f90681973c0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:43.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.671+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 >> 192.168.123.106:0/518347578 conn(0x7f90680fb5a0 msgr2=0x7f90680fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:43.673 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.672+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 shutdown_connections 2026-03-10T06:17:43.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:43.672+0000 7f9070842700 1 -- 192.168.123.106:0/518347578 wait complete. 2026-03-10T06:17:43.674 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:44.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:44 vm04 ceph-mon[51058]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:44.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:44 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/518347578' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:44.750 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:44.750 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:44.905 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:44.945 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.230+0000 7f560a7f8700 1 -- 192.168.123.106:0/3371013704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 msgr2=0x7f5604102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.230+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3371013704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604102e80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f55ec009b00 tx=0x7f55ec009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:45.232 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.231+0000 7f560a7f8700 1 -- 192.168.123.106:0/3371013704 shutdown_connections 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.231+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3371013704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604102e80 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.231+0000 7f560a7f8700 1 -- 192.168.123.106:0/3371013704 >> 192.168.123.106:0/3371013704 conn(0x7f56040fa4a0 msgr2=0x7f56040fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.231+0000 7f560a7f8700 1 -- 192.168.123.106:0/3371013704 shutdown_connections 2026-03-10T06:17:45.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.231+0000 7f560a7f8700 1 -- 192.168.123.106:0/3371013704 wait complete. 
2026-03-10T06:17:45.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f560a7f8700 1 Processor -- start 2026-03-10T06:17:45.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f560a7f8700 1 -- start start 2026-03-10T06:17:45.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f560a7f8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:45.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f560a7f8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5604195690 con 0x7f5604100aa0 2026-03-10T06:17:45.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f5603fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:45.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f5603fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55790/0 (socket says 192.168.123.106:55790) 2026-03-10T06:17:45.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.232+0000 7f5603fff700 1 -- 192.168.123.106:0/3008969130 learned_addr learned my addr 192.168.123.106:0/3008969130 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:45.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.233+0000 7f5603fff700 1 -- 192.168.123.106:0/3008969130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55ec0097e0 con 0x7f5604100aa0 2026-03-10T06:17:45.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.233+0000 7f5603fff700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f55ec004f40 tx=0x7f55ec005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:45.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.234+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f55ec01c070 con 0x7f5604100aa0 2026-03-10T06:17:45.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.234+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5604195890 con 0x7f5604100aa0 2026-03-10T06:17:45.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.234+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5604195d30 con 0x7f5604100aa0 2026-03-10T06:17:45.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.234+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f55ec0053b0 con 0x7f5604100aa0 2026-03-10T06:17:45.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.235+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f55ec00f460 con 0x7f5604100aa0 2026-03-10T06:17:45.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.235+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f55ec00f680 con 0x7f5604100aa0 2026-03-10T06:17:45.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.235+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55e4005320 con 0x7f5604100aa0 2026-03-10T06:17:45.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.235+0000 7f56017fa700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 0x7f55f003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:45.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.235+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f55ec04d4d0 con 0x7f5604100aa0 2026-03-10T06:17:45.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.236+0000 7f56037fe700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 0x7f55f003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:45.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.236+0000 7f56037fe700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 0x7f55f003a9c0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f55f4006fd0 tx=0x7f55f4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:45.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.239+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f55ec029930 con 0x7f5604100aa0 2026-03-10T06:17:45.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.418+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f55e4005190 con 0x7f5604100aa0 2026-03-10T06:17:45.420 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.419+0000 7f56017fa700 1 -- 192.168.123.106:0/3008969130 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f55ec026030 con 0x7f5604100aa0 2026-03-10T06:17:45.421 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:45.421 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.421+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 msgr2=0x7f55f003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:45.423 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.421+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 0x7f55f003a9c0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f55f4006fd0 tx=0x7f55f4006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.421+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 msgr2=0x7f5604195150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.421+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f55ec004f40 tx=0x7f55ec005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 shutdown_connections 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f55f0038510 0x7f55f003a9c0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 --2- 192.168.123.106:0/3008969130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5604100aa0 0x7f5604195150 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 >> 192.168.123.106:0/3008969130 conn(0x7f56040fa4a0 msgr2=0x7f56040fb170 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 shutdown_connections 2026-03-10T06:17:45.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:45.422+0000 7f560a7f8700 1 -- 192.168.123.106:0/3008969130 wait complete. 2026-03-10T06:17:45.424 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:46 vm04 ceph-mon[51058]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:46.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:46 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3008969130' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:46.488 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:46.488 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:46.643 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:46.687 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.969+0000 7f521e501700 1 -- 192.168.123.106:0/1062636621 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218102240 msgr2=0x7f5218102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.969+0000 7f521e501700 1 --2- 192.168.123.106:0/1062636621 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218102240 0x7f5218102650 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f5200009b00 tx=0x7f5200009e10 comp rx=0 tx=0).stop 
2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 -- 192.168.123.106:0/1062636621 shutdown_connections 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 --2- 192.168.123.106:0/1062636621 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218102240 0x7f5218102650 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 -- 192.168.123.106:0/1062636621 >> 192.168.123.106:0/1062636621 conn(0x7f52180fd8d0 msgr2=0x7f52180ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 -- 192.168.123.106:0/1062636621 shutdown_connections 2026-03-10T06:17:46.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 -- 192.168.123.106:0/1062636621 wait complete. 
2026-03-10T06:17:46.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.970+0000 7f521e501700 1 Processor -- start 2026-03-10T06:17:46.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f521e501700 1 -- start start 2026-03-10T06:17:46.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f521e501700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:46.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f521e501700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5218197f80 con 0x7f5218197630 2026-03-10T06:17:46.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f5217fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:46.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f5217fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:55808/0 (socket says 192.168.123.106:55808) 2026-03-10T06:17:46.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.971+0000 7f5217fff700 1 -- 192.168.123.106:0/2261013774 learned_addr learned my addr 192.168.123.106:0/2261013774 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:46.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.972+0000 7f5217fff700 1 -- 192.168.123.106:0/2261013774 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52000097e0 con 0x7f5218197630 2026-03-10T06:17:46.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.972+0000 7f5217fff700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f52181032a0 tx=0x7f5200004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:46.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.972+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f520001c070 con 0x7f5218197630 2026-03-10T06:17:46.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.972+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5218198180 con 0x7f5218197630 2026-03-10T06:17:46.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.973+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f521819ade0 con 0x7f5218197630 2026-03-10T06:17:46.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.973+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5200005000 con 0x7f5218197630 2026-03-10T06:17:46.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.973+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5200003940 con 0x7f5218197630 2026-03-10T06:17:46.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.974+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f5200003b20 con 0x7f5218197630 2026-03-10T06:17:46.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.974+0000 7f52157fa700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 0x7f520403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:46.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.974+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f520004cfb0 con 0x7f5218197630 2026-03-10T06:17:46.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.974+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f51f8005320 con 0x7f5218197630 2026-03-10T06:17:46.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.974+0000 7f52177fe700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 0x7f520403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:46.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.975+0000 7f52177fe700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 0x7f520403a9c0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f5208006fd0 tx=0x7f5208006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:46.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:46.978+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5200079720 con 0x7f5218197630 2026-03-10T06:17:47.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.134+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f51f8005190 con 0x7f5218197630 2026-03-10T06:17:47.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.135+0000 7f52157fa700 1 -- 192.168.123.106:0/2261013774 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f5200025070 con 0x7f5218197630 2026-03-10T06:17:47.137 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:47.137 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:47.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 msgr2=0x7f520403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:47.139 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 0x7f520403a9c0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f5208006fd0 tx=0x7f5208006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:47.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 msgr2=0x7f5218197a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:47.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f52181032a0 tx=0x7f5200004dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 shutdown_connections 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5204038510 0x7f520403a9c0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.138+0000 7f521e501700 1 --2- 192.168.123.106:0/2261013774 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5218197630 0x7f5218197a40 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.139+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 >> 192.168.123.106:0/2261013774 conn(0x7f52180fd8d0 msgr2=0x7f52180fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.139+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 shutdown_connections 2026-03-10T06:17:47.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:47.139+0000 7f521e501700 1 -- 192.168.123.106:0/2261013774 wait complete. 2026-03-10T06:17:47.141 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:47.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:47 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/2261013774' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:48.191 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:48.191 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:48.343 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:48.381 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:48.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:48 vm04 ceph-mon[51058]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:48.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.663+0000 7f15c66b6700 1 -- 192.168.123.106:0/2152794646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 msgr2=0x7f15c0102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:48.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.663+0000 7f15c66b6700 1 --2- 192.168.123.106:0/2152794646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0102650 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f15b4009b00 tx=0x7f15b4009e10 comp rx=0 tx=0).stop 
2026-03-10T06:17:48.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.664+0000 7f15c66b6700 1 -- 192.168.123.106:0/2152794646 shutdown_connections 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.664+0000 7f15c66b6700 1 --2- 192.168.123.106:0/2152794646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0102650 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.664+0000 7f15c66b6700 1 -- 192.168.123.106:0/2152794646 >> 192.168.123.106:0/2152794646 conn(0x7f15c00fd8d0 msgr2=0x7f15c00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.664+0000 7f15c66b6700 1 -- 192.168.123.106:0/2152794646 shutdown_connections 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.664+0000 7f15c66b6700 1 -- 192.168.123.106:0/2152794646 wait complete. 
2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15c66b6700 1 Processor -- start 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15c66b6700 1 -- start start 2026-03-10T06:17:48.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15c66b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15c66b6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15c0197990 con 0x7f15c0102240 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15bffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15bffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53672/0 (socket says 192.168.123.106:53672) 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.665+0000 7f15bffff700 1 -- 192.168.123.106:0/1811953370 learned_addr learned my addr 192.168.123.106:0/1811953370 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.666+0000 7f15bffff700 1 -- 192.168.123.106:0/1811953370 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15b40097e0 con 0x7f15c0102240 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.666+0000 7f15bffff700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f15b4004d40 tx=0x7f15b4004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:48.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.666+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15b401c070 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.666+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15c0197b90 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.666+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15c0198030 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15b40056f0 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15b4017440 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f15b40175a0 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f15c0191090 con 0x7f15c0102240 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15bd7fa700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 0x7f15a803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:48.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.667+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f15b404d150 con 0x7f15c0102240 2026-03-10T06:17:48.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.671+0000 7f15bf7fe700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 0x7f15a803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:48.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.671+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f15b4028be0 con 0x7f15c0102240 2026-03-10T06:17:48.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.671+0000 7f15bf7fe700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 0x7f15a803a9c0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto 
rx=0x7f15b0006fd0 tx=0x7f15b0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:48.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.823+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f15c0062380 con 0x7f15c0102240 2026-03-10T06:17:48.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.824+0000 7f15bd7fa700 1 -- 192.168.123.106:0/1811953370 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f15b4025030 con 0x7f15c0102240 2026-03-10T06:17:48.827 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:48.827 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:48.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.828+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 msgr2=0x7f15a803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:48.829 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.828+0000 7f15c66b6700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 0x7f15a803a9c0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f15b0006fd0 tx=0x7f15b0006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.828+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 msgr2=0x7f15c0197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.828+0000 7f15c66b6700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f15b4004d40 tx=0x7f15b4004e20 comp rx=0 tx=0).stop 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 shutdown_connections 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f15a8038510 0x7f15a803a9c0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 --2- 192.168.123.106:0/1811953370 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15c0102240 0x7f15c0197450 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:48.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 >> 192.168.123.106:0/1811953370 conn(0x7f15c00fd8d0 msgr2=0x7f15c00fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:48.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 shutdown_connections 2026-03-10T06:17:48.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:48.829+0000 7f15c66b6700 1 -- 192.168.123.106:0/1811953370 wait complete. 2026-03-10T06:17:48.832 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:49.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:49 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1811953370' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:49.878 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:49.879 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:50.038 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:50.081 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.361+0000 7fc26cbd3700 1 -- 192.168.123.106:0/3300701888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 msgr2=0x7fc268100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.361+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/3300701888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268100420 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fc250009b00 tx=0x7fc250009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 -- 192.168.123.106:0/3300701888 
shutdown_connections 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/3300701888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268100420 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 -- 192.168.123.106:0/3300701888 >> 192.168.123.106:0/3300701888 conn(0x7fc2680fb5a0 msgr2=0x7fc2680fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 -- 192.168.123.106:0/3300701888 shutdown_connections 2026-03-10T06:17:50.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 -- 192.168.123.106:0/3300701888 wait complete. 2026-03-10T06:17:50.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.363+0000 7fc26cbd3700 1 Processor -- start 2026-03-10T06:17:50.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26cbd3700 1 -- start start 2026-03-10T06:17:50.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26cbd3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:50.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26cbd3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc268075030 con 0x7fc268100010 2026-03-10T06:17:50.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:50.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53690/0 (socket says 192.168.123.106:53690) 2026-03-10T06:17:50.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.364+0000 7fc26659c700 1 -- 192.168.123.106:0/144040188 learned_addr learned my addr 192.168.123.106:0/144040188 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:50.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.365+0000 7fc26659c700 1 -- 192.168.123.106:0/144040188 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc2500097e0 con 0x7fc268100010 2026-03-10T06:17:50.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.365+0000 7fc26659c700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fc250004750 tx=0x7fc250005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:50.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.365+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc25001c070 con 0x7fc268100010 2026-03-10T06:17:50.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.366+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc268073140 con 0x7fc268100010 2026-03-10T06:17:50.367 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.366+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc2680735e0 con 0x7fc268100010 2026-03-10T06:17:50.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.366+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc250021470 con 0x7fc268100010 2026-03-10T06:17:50.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.366+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc25000f460 con 0x7fc268100010 2026-03-10T06:17:50.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.367+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc25000f6d0 con 0x7fc268100010 2026-03-10T06:17:50.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.367+0000 7fc25f7fe700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 0x7fc25403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:50.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.367+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc25004d4a0 con 0x7fc268100010 2026-03-10T06:17:50.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.367+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc26804fa50 con 0x7fc268100010 2026-03-10T06:17:50.369 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.367+0000 7fc265d9b700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 0x7fc25403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:50.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.368+0000 7fc265d9b700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 0x7fc25403a9c0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fc258006fd0 tx=0x7fc258006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:50.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.370+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc250026070 con 0x7fc268100010 2026-03-10T06:17:50.529 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:50.529 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:50.529 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.526+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc26818e840 con 0x7fc268100010 2026-03-10T06:17:50.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.527+0000 7fc25f7fe700 1 -- 192.168.123.106:0/144040188 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc250029720 con 0x7fc268100010 2026-03-10T06:17:50.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 msgr2=0x7fc25403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:50.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 0x7fc25403a9c0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fc258006fd0 tx=0x7fc258006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:50.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 msgr2=0x7fc268074af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:50.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fc250004750 tx=0x7fc250005dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 
-- 192.168.123.106:0/144040188 shutdown_connections 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc254038510 0x7fc25403a9c0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 --2- 192.168.123.106:0/144040188 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc268100010 0x7fc268074af0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.530+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 >> 192.168.123.106:0/144040188 conn(0x7fc2680fb5a0 msgr2=0x7fc2680fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.531+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 shutdown_connections 2026-03-10T06:17:50.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:50.531+0000 7fc26cbd3700 1 -- 192.168.123.106:0/144040188 wait complete. 2026-03-10T06:17:50.533 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:50.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:50 vm04 ceph-mon[51058]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:51.602 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:51.602 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:51 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/144040188' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:51.762 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:51.842 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:52.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.149+0000 7f87c84e3700 1 -- 192.168.123.106:0/151563848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 msgr2=0x7f87c0102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:52.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.149+0000 7f87c84e3700 1 --2- 192.168.123.106:0/151563848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0102e80 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f87b0009b00 tx=0x7f87b0009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:52.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.150+0000 7f87c84e3700 1 -- 192.168.123.106:0/151563848 shutdown_connections 2026-03-10T06:17:52.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.150+0000 7f87c84e3700 1 --2- 192.168.123.106:0/151563848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0102e80 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:52.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.150+0000 7f87c84e3700 1 -- 192.168.123.106:0/151563848 >> 192.168.123.106:0/151563848 conn(0x7f87c00fa4a0 msgr2=0x7f87c00fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:52.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.150+0000 7f87c84e3700 1 -- 192.168.123.106:0/151563848 shutdown_connections 2026-03-10T06:17:52.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.150+0000 7f87c84e3700 1 -- 192.168.123.106:0/151563848 wait 
complete. 2026-03-10T06:17:52.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c84e3700 1 Processor -- start 2026-03-10T06:17:52.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c84e3700 1 -- start start 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c84e3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c84e3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87c01978b0 con 0x7f87c0100aa0 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c627f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c627f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53702/0 (socket says 192.168.123.106:53702) 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.151+0000 7f87c627f700 1 -- 192.168.123.106:0/845924963 learned_addr learned my addr 192.168.123.106:0/845924963 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:52.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.152+0000 7f87c627f700 1 -- 192.168.123.106:0/845924963 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87b00097e0 con 0x7f87c0100aa0 2026-03-10T06:17:52.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.152+0000 7f87c627f700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f87b0004f40 tx=0x7f87b0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:52.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.152+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f87b001c070 con 0x7f87c0100aa0 2026-03-10T06:17:52.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.152+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f87c0197ab0 con 0x7f87c0100aa0 2026-03-10T06:17:52.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.152+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87c0197f50 con 0x7f87c0100aa0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.153+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87b00053b0 con 0x7f87c0100aa0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.153+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f87b000f460 con 0x7f87c0100aa0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.153+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f87b000f5c0 con 0x7f87c0100aa0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.153+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87c0191260 con 0x7f87c0100aa0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.153+0000 7f87b77fe700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 0x7f87ac03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.154+0000 7f87c5a7e700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 0x7f87ac03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:52.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.155+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f87b004d490 con 0x7f87c0100aa0 2026-03-10T06:17:52.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.155+0000 7f87c5a7e700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 0x7f87ac03a9c0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f87bc006fd0 tx=0x7f87bc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:52.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.157+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f87b0029950 con 0x7f87c0100aa0 2026-03-10T06:17:52.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.304+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f87c0062380 con 0x7f87c0100aa0 2026-03-10T06:17:52.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.305+0000 7f87b77fe700 1 -- 192.168.123.106:0/845924963 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f87b0026030 con 0x7f87c0100aa0 2026-03-10T06:17:52.307 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:52.307 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:52.309 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.308+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 msgr2=0x7f87ac03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:52.310 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.308+0000 7f87c84e3700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 0x7f87ac03a9c0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f87bc006fd0 tx=0x7f87bc006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.308+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 msgr2=0x7f87c0197370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.308+0000 7f87c84e3700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f87b0004f40 tx=0x7f87b0005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.309+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 shutdown_connections 2026-03-10T06:17:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.309+0000 7f87c84e3700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f87ac038510 0x7f87ac03a9c0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.309+0000 7f87c84e3700 1 --2- 192.168.123.106:0/845924963 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87c0100aa0 0x7f87c0197370 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.309+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 >> 192.168.123.106:0/845924963 conn(0x7f87c00fa4a0 msgr2=0x7f87c00fb170 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.309+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 shutdown_connections 2026-03-10T06:17:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:52.310+0000 7f87c84e3700 1 -- 192.168.123.106:0/845924963 wait complete. 2026-03-10T06:17:52.312 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:52.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:52 vm04 ceph-mon[51058]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:52.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:52 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/845924963' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:53.386 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:53.387 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:53.549 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:53.594 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.915+0000 7fb11aed4700 1 -- 192.168.123.106:0/4248466109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 msgr2=0x7fb114102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.915+0000 7fb1137fe700 1 -- 192.168.123.106:0/4248466109 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb10000bcf0 con 0x7fb114102240 2026-03-10T06:17:53.917 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.915+0000 7fb11aed4700 1 --2- 192.168.123.106:0/4248466109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114102650 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fb100009b00 tx=0x7fb100009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.916+0000 7fb11aed4700 1 -- 192.168.123.106:0/4248466109 shutdown_connections 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.916+0000 7fb11aed4700 1 --2- 192.168.123.106:0/4248466109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114102650 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.916+0000 7fb11aed4700 1 -- 192.168.123.106:0/4248466109 >> 192.168.123.106:0/4248466109 conn(0x7fb1140fd8d0 msgr2=0x7fb1140ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:53.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.916+0000 7fb11aed4700 1 -- 192.168.123.106:0/4248466109 shutdown_connections 2026-03-10T06:17:53.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.916+0000 7fb11aed4700 1 -- 192.168.123.106:0/4248466109 wait complete. 
2026-03-10T06:17:53.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb11aed4700 1 Processor -- start 2026-03-10T06:17:53.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb11aed4700 1 -- start start 2026-03-10T06:17:53.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb11aed4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:53.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb11aed4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb114197960 con 0x7fb114102240 2026-03-10T06:17:53.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb118c70700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:53.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb118c70700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53722/0 (socket says 192.168.123.106:53722) 2026-03-10T06:17:53.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.917+0000 7fb118c70700 1 -- 192.168.123.106:0/125327768 learned_addr learned my addr 192.168.123.106:0/125327768 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:53.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.918+0000 7fb118c70700 1 -- 192.168.123.106:0/125327768 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb1000097e0 con 0x7fb114102240 2026-03-10T06:17:53.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.918+0000 7fb118c70700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fb100009fd0 tx=0x7fb100004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:53.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.918+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb10001c070 con 0x7fb114102240 2026-03-10T06:17:53.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.918+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb114197b60 con 0x7fb114102240 2026-03-10T06:17:53.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.919+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb114198000 con 0x7fb114102240 2026-03-10T06:17:53.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.919+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb1000054e0 con 0x7fb114102240 2026-03-10T06:17:53.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.919+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb100003ae0 con 0x7fb114102240 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.920+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fb100003c40 con 0x7fb114102240 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.920+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb114191090 con 0x7fb114102240 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.920+0000 7fb111ffb700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 0x7fb10403aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.920+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb10004d0e0 con 0x7fb114102240 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.921+0000 7fb113fff700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 0x7fb10403aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:53.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.921+0000 7fb113fff700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 0x7fb10403aa00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fb108006fd0 tx=0x7fb108006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:53.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:53.924+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb100017440 con 0x7fb114102240 2026-03-10T06:17:54.081 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.078+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb114062380 con 0x7fb114102240 2026-03-10T06:17:54.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.081+0000 7fb111ffb700 1 -- 192.168.123.106:0/125327768 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb100025030 con 0x7fb114102240 2026-03-10T06:17:54.082 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:54.083 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.083+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 msgr2=0x7fb10403aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:54.085 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.083+0000 7fb11aed4700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 0x7fb10403aa00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fb108006fd0 tx=0x7fb108006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.083+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 msgr2=0x7fb114197420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.083+0000 7fb11aed4700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fb100009fd0 tx=0x7fb100004dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 shutdown_connections 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb104038550 0x7fb10403aa00 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 --2- 192.168.123.106:0/125327768 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb114102240 0x7fb114197420 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 >> 192.168.123.106:0/125327768 conn(0x7fb1140fd8d0 msgr2=0x7fb1140fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 shutdown_connections 2026-03-10T06:17:54.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:54.084+0000 7fb11aed4700 1 -- 192.168.123.106:0/125327768 wait complete. 2026-03-10T06:17:54.086 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:54.501 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:54 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/125327768' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:55.172 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:17:55.172 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:55.319 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:55.361 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:55.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.628+0000 7f788b624700 1 -- 192.168.123.106:0/146852387 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 msgr2=0x7f7884102f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:55.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.628+0000 7f788b624700 1 --2- 192.168.123.106:0/146852387 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884102f20 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f7874009b00 tx=0x7f7874009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:55.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 -- 192.168.123.106:0/146852387 shutdown_connections 2026-03-10T06:17:55.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 --2- 192.168.123.106:0/146852387 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884102f20 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:55.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 -- 192.168.123.106:0/146852387 >> 192.168.123.106:0/146852387 conn(0x7f78840fa4a0 msgr2=0x7f78840fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 -- 192.168.123.106:0/146852387 
shutdown_connections 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 -- 192.168.123.106:0/146852387 wait complete. 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.629+0000 7f788b624700 1 Processor -- start 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f788b624700 1 -- start start 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f788b624700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:55.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f788b624700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78841956d0 con 0x7f7884100b40 2026-03-10T06:17:55.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f78893c0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:55.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f78893c0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53744/0 (socket says 192.168.123.106:53744) 2026-03-10T06:17:55.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.630+0000 7f78893c0700 1 -- 192.168.123.106:0/1370505785 learned_addr learned my addr 192.168.123.106:0/1370505785 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:55.632 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.631+0000 7f78893c0700 1 -- 192.168.123.106:0/1370505785 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78740097e0 con 0x7f7884100b40 2026-03-10T06:17:55.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.631+0000 7f78893c0700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f7874004f40 tx=0x7f7874005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:55.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.631+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f787401c070 con 0x7f7884100b40 2026-03-10T06:17:55.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.631+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78841958d0 con 0x7f7884100b40 2026-03-10T06:17:55.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.631+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7884195d70 con 0x7f7884100b40 2026-03-10T06:17:55.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.632+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f78740053b0 con 0x7f7884100b40 2026-03-10T06:17:55.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.632+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f787400f460 con 
0x7f7884100b40 2026-03-10T06:17:55.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.632+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f788418ee60 con 0x7f7884100b40 2026-03-10T06:17:55.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.633+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f787400f5c0 con 0x7f7884100b40 2026-03-10T06:17:55.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.633+0000 7f787a7fc700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 0x7f787003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:55.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.633+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f787404d490 con 0x7f7884100b40 2026-03-10T06:17:55.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.635+0000 7f7888bbf700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 0x7f787003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:55.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.636+0000 7f7888bbf700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 0x7f787003a9c0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f7880006fd0 tx=0x7f7880006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:55.637 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.636+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7874029950 con 0x7f7884100b40 2026-03-10T06:17:55.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:55 vm04 ceph-mon[51058]: Deploying daemon prometheus.vm04 on vm04 2026-03-10T06:17:55.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.791+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f788402cc30 con 0x7f7884100b40 2026-03-10T06:17:55.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.794+0000 7f787a7fc700 1 -- 192.168.123.106:0/1370505785 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f7874026030 con 0x7f7884100b40 2026-03-10T06:17:55.795 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:55.795 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:55.798 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 msgr2=0x7f787003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 0x7f787003a9c0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f7880006fd0 tx=0x7f7880006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 msgr2=0x7f7884195190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f7874004f40 tx=0x7f7874005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 shutdown_connections 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7870038510 0x7f787003a9c0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 --2- 192.168.123.106:0/1370505785 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7884100b40 0x7f7884195190 unknown :-1 s=CLOSED 
pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.796+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 >> 192.168.123.106:0/1370505785 conn(0x7f78840fa4a0 msgr2=0x7f78840fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.797+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 shutdown_connections 2026-03-10T06:17:55.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:55.797+0000 7f788b624700 1 -- 192.168.123.106:0/1370505785 wait complete. 2026-03-10T06:17:55.799 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:56.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:56 vm04 ceph-mon[51058]: pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:56.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:56 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1370505785' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:56.876 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:17:56.876 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:57.023 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:57.061 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:57.543 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 -- 192.168.123.106:0/554343510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 msgr2=0x7f4ca0102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/554343510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0102e80 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f4c88009b00 tx=0x7f4c88009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 -- 192.168.123.106:0/554343510 shutdown_connections 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/554343510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0102e80 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 -- 192.168.123.106:0/554343510 >> 192.168.123.106:0/554343510 conn(0x7f4ca00fa4a0 msgr2=0x7f4ca00fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 -- 192.168.123.106:0/554343510 
shutdown_connections 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.542+0000 7f4ca52d3700 1 -- 192.168.123.106:0/554343510 wait complete. 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.543+0000 7f4ca52d3700 1 Processor -- start 2026-03-10T06:17:57.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.543+0000 7f4ca52d3700 1 -- start start 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.543+0000 7f4ca52d3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.543+0000 7f4ca52d3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ca01978a0 con 0x7f4ca0100aa0 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c9effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c9effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:53762/0 (socket says 192.168.123.106:53762) 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c9effd700 1 -- 192.168.123.106:0/2509586082 learned_addr learned my addr 192.168.123.106:0/2509586082 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:57.545 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c9effd700 1 -- 192.168.123.106:0/2509586082 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c880097e0 con 0x7f4ca0100aa0 2026-03-10T06:17:57.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c9effd700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f4c88004f40 tx=0x7f4c88005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:57.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.544+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4c8801c070 con 0x7f4ca0100aa0 2026-03-10T06:17:57.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.545+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4ca0197aa0 con 0x7f4ca0100aa0 2026-03-10T06:17:57.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.545+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4ca0197f40 con 0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c880053b0 con 0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4c8800f460 con 
0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f4c8800f680 con 0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ca0191260 con 0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4c97fff700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 0x7f4c8c03aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.546+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f4c8804d610 con 0x7f4ca0100aa0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.547+0000 7f4c9e7fc700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 0x7f4c8c03aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:57.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.547+0000 7f4c9e7fc700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 0x7f4c8c03aa10 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f4c90006fd0 tx=0x7f4c90006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:57.550 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.549+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4c88029b90 con 0x7f4ca0100aa0 2026-03-10T06:17:57.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.695+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4ca002cc30 con 0x7f4ca0100aa0 2026-03-10T06:17:57.698 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.696+0000 7f4c97fff700 1 -- 192.168.123.106:0/2509586082 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4c88026030 con 0x7f4ca0100aa0 2026-03-10T06:17:57.698 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:57.698 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 msgr2=0x7f4c8c03aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 0x7f4c8c03aa10 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f4c90006fd0 tx=0x7f4c90006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 msgr2=0x7f4ca0197360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f4c88004f40 tx=0x7f4c88005e70 comp rx=0 tx=0).stop 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 shutdown_connections 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c8c038560 0x7f4c8c03aa10 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:57.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 --2- 192.168.123.106:0/2509586082 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4ca0100aa0 0x7f4ca0197360 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:57.701 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 >> 192.168.123.106:0/2509586082 conn(0x7f4ca00fa4a0 msgr2=0x7f4ca00fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:57.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 shutdown_connections 2026-03-10T06:17:57.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:57.699+0000 7f4ca52d3700 1 -- 192.168.123.106:0/2509586082 wait complete. 2026-03-10T06:17:57.701 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:57.850 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:57 vm04 ceph-mon[51058]: pgmap v16: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:58.763 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:17:58.763 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:17:58.910 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/2509586082' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:17:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:58 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T06:17:58.946 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.203+0000 7fed03e99700 1 -- 192.168.123.106:0/443256474 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc102240 msgr2=0x7fecfc102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.203+0000 7fed03e99700 1 --2- 192.168.123.106:0/443256474 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc102240 0x7fecfc102650 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fecf0009b00 tx=0x7fecf0009e10 comp rx=0 tx=0).stop 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.204+0000 7fed03e99700 1 -- 192.168.123.106:0/443256474 shutdown_connections 2026-03-10T06:17:59.206 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.204+0000 7fed03e99700 1 --2- 192.168.123.106:0/443256474 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc102240 0x7fecfc102650 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.204+0000 7fed03e99700 1 -- 192.168.123.106:0/443256474 >> 192.168.123.106:0/443256474 conn(0x7fecfc0fd8d0 msgr2=0x7fecfc0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.205+0000 7fed03e99700 1 -- 192.168.123.106:0/443256474 shutdown_connections 2026-03-10T06:17:59.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.205+0000 7fed03e99700 1 -- 192.168.123.106:0/443256474 wait complete. 2026-03-10T06:17:59.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.205+0000 7fed03e99700 1 Processor -- start 2026-03-10T06:17:59.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed03e99700 1 -- start start 2026-03-10T06:17:59.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed03e99700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:59.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed03e99700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fecfc197f90 con 0x7fecfc197640 2026-03-10T06:17:59.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed01c35700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:17:59.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed01c35700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:56982/0 (socket says 192.168.123.106:56982) 2026-03-10T06:17:59.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.206+0000 7fed01c35700 1 -- 192.168.123.106:0/2782604412 learned_addr learned my addr 192.168.123.106:0/2782604412 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:17:59.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.207+0000 7fed01c35700 1 -- 192.168.123.106:0/2782604412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fecf00097e0 con 0x7fecfc197640 2026-03-10T06:17:59.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.207+0000 7fed01c35700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fecf0009fd0 tx=0x7fecf0004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:59.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.207+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fecf001d070 con 0x7fecfc197640 2026-03-10T06:17:59.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.208+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fecfc198190 con 0x7fecfc197640 2026-03-10T06:17:59.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.208+0000 7fed03e99700 1 -- 
192.168.123.106:0/2782604412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fecfc19adf0 con 0x7fecfc197640 2026-03-10T06:17:59.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.208+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fecf00056f0 con 0x7fecfc197640 2026-03-10T06:17:59.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.208+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fecf0022ca0 con 0x7fecfc197640 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.209+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fecf000f480 con 0x7fecfc197640 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.209+0000 7feceeffd700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 0x7fece803aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.209+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fecf004d3d0 con 0x7fecfc197640 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.209+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fece0005320 con 0x7fecfc197640 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.209+0000 7fed01434700 1 --2- 192.168.123.106:0/2782604412 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 0x7fece803aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:17:59.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.210+0000 7fed01434700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 0x7fece803aa00 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fecf8006fd0 tx=0x7fecf8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:17:59.215 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.213+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fecf0027070 con 0x7fecfc197640 2026-03-10T06:17:59.361 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.358+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fece0005190 con 0x7fecfc197640 2026-03-10T06:17:59.362 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.361+0000 7feceeffd700 1 -- 192.168.123.106:0/2782604412 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fecf0020dd0 con 0x7fecfc197640 2026-03-10T06:17:59.363 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:17:59.363 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 msgr2=0x7fece803aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 0x7fece803aa00 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fecf8006fd0 tx=0x7fecf8006e40 comp rx=0 tx=0).stop 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 msgr2=0x7fecfc197a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fecf0009fd0 tx=0x7fecf0004dc0 comp rx=0 tx=0).stop 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 shutdown_connections 2026-03-10T06:17:59.365 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fece8038550 0x7fece803aa00 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:59.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 --2- 192.168.123.106:0/2782604412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fecfc197640 0x7fecfc197a50 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:17:59.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.364+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 >> 192.168.123.106:0/2782604412 conn(0x7fecfc0fd8d0 msgr2=0x7fecfc0fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:17:59.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.365+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 shutdown_connections 2026-03-10T06:17:59.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:17:59.365+0000 7fed03e99700 1 -- 192.168.123.106:0/2782604412 wait complete. 2026-03-10T06:17:59.367 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:17:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:59 vm04 ceph-mon[51058]: pgmap v17: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:17:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:59 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/2782604412' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:17:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:59 vm04 ceph-mon[51058]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T06:17:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:17:59 vm04 ceph-mon[51058]: mgrmap e14: vm04.exdvdb(active, since 46s) 2026-03-10T06:18:00.432 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:18:00.432 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:00.583 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:00.628 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.896+0000 7f9e0b359700 1 -- 192.168.123.106:0/2348481979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 msgr2=0x7f9e040fed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.896+0000 7f9e0b359700 1 --2- 192.168.123.106:0/2348481979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e040fed20 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f9df4009b00 tx=0x7f9df4009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.897+0000 7f9e0b359700 1 -- 192.168.123.106:0/2348481979 shutdown_connections 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.897+0000 7f9e0b359700 1 --2- 192.168.123.106:0/2348481979 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e040fed20 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.897+0000 7f9e0b359700 1 -- 192.168.123.106:0/2348481979 >> 192.168.123.106:0/2348481979 conn(0x7f9e040fa4a0 msgr2=0x7f9e040fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.897+0000 7f9e0b359700 1 -- 192.168.123.106:0/2348481979 shutdown_connections 2026-03-10T06:18:00.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.897+0000 7f9e0b359700 1 -- 192.168.123.106:0/2348481979 wait complete. 2026-03-10T06:18:00.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e0b359700 1 Processor -- start 2026-03-10T06:18:00.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e0b359700 1 -- start start 2026-03-10T06:18:00.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e0b359700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:00.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e0b359700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e041978a0 con 0x7f9e040fe910 2026-03-10T06:18:00.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e090f5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e090f5700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:57010/0 (socket says 192.168.123.106:57010) 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e090f5700 1 -- 192.168.123.106:0/437344505 learned_addr learned my addr 192.168.123.106:0/437344505 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.898+0000 7f9e090f5700 1 -- 192.168.123.106:0/437344505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9df40097e0 con 0x7f9e040fe910 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9e090f5700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f9df4004f40 tx=0x7f9df4005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9df401c070 con 0x7f9e040fe910 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e04197aa0 con 0x7f9e040fe910 2026-03-10T06:18:00.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7f9e04197f40 con 0x7f9e040fe910 2026-03-10T06:18:00.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9df40053b0 con 0x7f9e040fe910 2026-03-10T06:18:00.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.899+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9df400f460 con 0x7f9e040fe910 2026-03-10T06:18:00.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.900+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f9df400f5e0 con 0x7f9e040fe910 2026-03-10T06:18:00.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.900+0000 7f9dfa7fc700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 0x7f9df003aa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:00.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.900+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f9df404d4f0 con 0x7f9e040fe910 2026-03-10T06:18:00.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.901+0000 7f9e088f4700 1 -- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 msgr2=0x7f9df003aa40 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:18:00.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.901+0000 7f9e088f4700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 0x7f9df003aa40 unknown 
:-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:18:00.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.901+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9de8005320 con 0x7f9e040fe910 2026-03-10T06:18:00.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:00.906+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9df4026070 con 0x7f9e040fe910 2026-03-10T06:18:01.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.052+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9de8005190 con 0x7f9e040fe910 2026-03-10T06:18:01.054 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.052+0000 7f9dfa7fc700 1 -- 192.168.123.106:0/437344505 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9df4017490 con 0x7f9e040fe910 2026-03-10T06:18:01.055 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:01.055 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:01.057 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.056+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 msgr2=0x7f9df003aa40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:18:01.057 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.056+0000 7f9e0b359700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 0x7f9df003aa40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:01.057 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.056+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 msgr2=0x7f9e04197360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.056+0000 7f9e0b359700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f9df4004f40 tx=0x7f9df4005e70 comp rx=0 tx=0).stop 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.056+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 shutdown_connections 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.057+0000 7f9e0b359700 1 --2- 
192.168.123.106:0/437344505 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9df0038590 0x7f9df003aa40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.057+0000 7f9e0b359700 1 --2- 192.168.123.106:0/437344505 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9e040fe910 0x7f9e04197360 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.057+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 >> 192.168.123.106:0/437344505 conn(0x7f9e040fa4a0 msgr2=0x7f9e040fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.057+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 shutdown_connections 2026-03-10T06:18:01.058 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:01.057+0000 7f9e0b359700 1 -- 192.168.123.106:0/437344505 wait complete. 2026-03-10T06:18:01.059 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:01.599 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:01 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/437344505' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:02.135 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:18:02.135 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:02.290 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:02.328 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.597+0000 7f8c984e0700 1 -- 192.168.123.106:0/745100040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 msgr2=0x7f8c90102e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.597+0000 7f8c984e0700 1 --2- 192.168.123.106:0/745100040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90102e90 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f8c80009b00 tx=0x7f8c80009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 -- 192.168.123.106:0/745100040 shutdown_connections 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 --2- 192.168.123.106:0/745100040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90102e90 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 -- 192.168.123.106:0/745100040 >> 192.168.123.106:0/745100040 conn(0x7f8c900fa4f0 msgr2=0x7f8c900fc900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 -- 192.168.123.106:0/745100040 
shutdown_connections 2026-03-10T06:18:02.599 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 -- 192.168.123.106:0/745100040 wait complete. 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 Processor -- start 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.598+0000 7f8c984e0700 1 -- start start 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c984e0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c984e0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c80012070 con 0x7f8c90100ab0 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c9627c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c9627c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:57026/0 (socket says 192.168.123.106:57026) 2026-03-10T06:18:02.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c9627c700 1 -- 192.168.123.106:0/506116644 learned_addr learned my addr 192.168.123.106:0/506116644 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:02.600 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c9627c700 1 -- 192.168.123.106:0/506116644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c800097e0 con 0x7f8c90100ab0 2026-03-10T06:18:02.601 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.599+0000 7f8c9627c700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f8c800055d0 tx=0x7f8c800056b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:02.601 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.600+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c8001d070 con 0x7f8c90100ab0 2026-03-10T06:18:02.601 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.600+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c90197890 con 0x7f8c90100ab0 2026-03-10T06:18:02.601 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.600+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c90197cb0 con 0x7f8c90100ab0 2026-03-10T06:18:02.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.600+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8c80005dc0 con 0x7f8c90100ab0 2026-03-10T06:18:02.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.600+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c8000f460 con 0x7f8c90100ab0 
2026-03-10T06:18:02.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f8c8000f680 con 0x7f8c90100ab0 2026-03-10T06:18:02.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c9004fa50 con 0x7f8c90100ab0 2026-03-10T06:18:02.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c877fe700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 0x7f8c7c03aa50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:02.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8c8004d4c0 con 0x7f8c90100ab0 2026-03-10T06:18:02.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c95a7b700 1 -- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 msgr2=0x7f8c7c03aa50 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.104:6800/2 2026-03-10T06:18:02.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.601+0000 7f8c95a7b700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 0x7f8c7c03aa50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:18:02.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.604+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8c80017440 con 0x7f8c90100ab0 2026-03-10T06:18:02.751 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.749+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8c9002cea0 con 0x7f8c90100ab0 2026-03-10T06:18:02.751 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.750+0000 7f8c877fe700 1 -- 192.168.123.106:0/506116644 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8c80026030 con 0x7f8c90100ab0 2026-03-10T06:18:02.751 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:02.751 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 msgr2=0x7f8c7c03aa50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:18:02.753 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 0x7f8c7c03aa50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 msgr2=0x7f8c90197350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f8c800055d0 tx=0x7f8c800056b0 comp rx=0 tx=0).stop 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 shutdown_connections 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8c7c0385a0 0x7f8c7c03aa50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 --2- 192.168.123.106:0/506116644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8c90100ab0 0x7f8c90197350 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 >> 192.168.123.106:0/506116644 conn(0x7f8c900fa4f0 msgr2=0x7f8c900fb150 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.752+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 shutdown_connections 2026-03-10T06:18:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:02.753+0000 7f8c984e0700 1 -- 192.168.123.106:0/506116644 wait complete. 2026-03-10T06:18:02.755 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:02.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:02 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/506116644' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:03.796 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:18:03.796 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:03.943 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:03.984 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.258+0000 7f05b4a40700 1 -- 192.168.123.106:0/1058065083 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 msgr2=0x7f05b0102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.258+0000 7f05b4a40700 1 --2- 192.168.123.106:0/1058065083 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0102640 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f05a0009b00 tx=0x7f05a0009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.260+0000 7f05b4a40700 1 -- 192.168.123.106:0/1058065083 shutdown_connections 
2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.260+0000 7f05b4a40700 1 --2- 192.168.123.106:0/1058065083 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0102640 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.260+0000 7f05b4a40700 1 -- 192.168.123.106:0/1058065083 >> 192.168.123.106:0/1058065083 conn(0x7f05b00fd8d0 msgr2=0x7f05b00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.260+0000 7f05b4a40700 1 -- 192.168.123.106:0/1058065083 shutdown_connections 2026-03-10T06:18:04.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.260+0000 7f05b4a40700 1 -- 192.168.123.106:0/1058065083 wait complete. 2026-03-10T06:18:04.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05b4a40700 1 Processor -- start 2026-03-10T06:18:04.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05b4a40700 1 -- start start 2026-03-10T06:18:04.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05b4a40700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:04.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05b4a40700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05b01978c0 con 0x7f05b0102230 2026-03-10T06:18:04.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05ae59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05ae59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:57044/0 (socket says 192.168.123.106:57044) 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.261+0000 7f05ae59c700 1 -- 192.168.123.106:0/880248481 learned_addr learned my addr 192.168.123.106:0/880248481 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f05ae59c700 1 -- 192.168.123.106:0/880248481 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05a00097e0 con 0x7f05b0102230 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f05ae59c700 1 --2- 192.168.123.106:0/880248481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f05a0004750 tx=0x7f05a0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05a001c070 con 0x7f05b0102230 2026-03-10T06:18:04.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f05b4a40700 1 -- 192.168.123.106:0/880248481 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05b0197ac0 con 0x7f05b0102230 2026-03-10T06:18:04.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 
7f05b4a40700 1 -- 192.168.123.106:0/880248481 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05b0197f60 con 0x7f05b0102230 2026-03-10T06:18:04.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f05a0021470 con 0x7f05b0102230 2026-03-10T06:18:04.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.262+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05a000f460 con 0x7f05b0102230 2026-03-10T06:18:04.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.263+0000 7f05b4a40700 1 -- 192.168.123.106:0/880248481 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0590005320 con 0x7f05b0102230 2026-03-10T06:18:04.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.263+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 15) v1 ==== 44873+0+0 (secure 0 0 0) 0x7f05a0021ac0 con 0x7f05b0102230 2026-03-10T06:18:04.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.264+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f05a004bfd0 con 0x7f05b0102230 2026-03-10T06:18:04.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.266+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f05a0029360 con 0x7f05b0102230 2026-03-10T06:18:04.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.418+0000 7f05b4a40700 1 -- 
192.168.123.106:0/880248481 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0590005190 con 0x7f05b0102230 2026-03-10T06:18:04.420 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.418+0000 7f059f7fe700 1 -- 192.168.123.106:0/880248481 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f05a0026030 con 0x7f05b0102230 2026-03-10T06:18:04.420 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:04.420 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:04.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.421+0000 7f059d7fa700 1 -- 192.168.123.106:0/880248481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 msgr2=0x7f05b0197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:04.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.421+0000 7f059d7fa700 1 --2- 192.168.123.106:0/880248481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto 
rx=0x7f05a0004750 tx=0x7f05a0005dc0 comp rx=0 tx=0).stop 2026-03-10T06:18:04.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.421+0000 7f059d7fa700 1 -- 192.168.123.106:0/880248481 shutdown_connections 2026-03-10T06:18:04.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.421+0000 7f059d7fa700 1 --2- 192.168.123.106:0/880248481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f05b0102230 0x7f05b0197380 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:04.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.421+0000 7f059d7fa700 1 -- 192.168.123.106:0/880248481 >> 192.168.123.106:0/880248481 conn(0x7f05b00fd8d0 msgr2=0x7f05b00fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:04.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.424+0000 7f059d7fa700 1 -- 192.168.123.106:0/880248481 shutdown_connections 2026-03-10T06:18:04.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:04.424+0000 7f059d7fa700 1 -- 192.168.123.106:0/880248481 wait complete. 
2026-03-10T06:18:04.426 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: Activating manager daemon vm04.exdvdb 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: mgrmap e15: vm04.exdvdb(active, starting, since 0.00468297s) 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:04 vm04 ceph-mon[51058]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:18:04.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:18:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:18:05.281 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/880248481' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:05.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:05.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:05.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:05 vm04 ceph-mon[51058]: mgrmap e16: vm04.exdvdb(active, since 1.01037s) 2026-03-10T06:18:05.486 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:18:05.487 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:05.655 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:05.700 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T06:18:06.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.015+0000 7f888b551700 1 -- 192.168.123.106:0/1860294247 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 msgr2=0x7f8884079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.015+0000 7f888b551700 1 --2- 192.168.123.106:0/1860294247 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f8884079eb0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f8880009b00 tx=0x7f8880009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.015+0000 7f888b551700 1 -- 192.168.123.106:0/1860294247 shutdown_connections 2026-03-10T06:18:06.018 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 --2- 192.168.123.106:0/1860294247 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f8884079eb0 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 -- 192.168.123.106:0/1860294247 >> 192.168.123.106:0/1860294247 conn(0x7f88840756e0 msgr2=0x7f8884077b10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 -- 192.168.123.106:0/1860294247 shutdown_connections 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 -- 192.168.123.106:0/1860294247 wait complete. 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 Processor -- start 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 -- start start 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.016+0000 7f888b551700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888b551700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88841a4310 con 0x7f8884079aa0 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888a54f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888a54f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:57052/0 (socket says 192.168.123.106:57052) 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888a54f700 1 -- 192.168.123.106:0/1272952014 learned_addr learned my addr 192.168.123.106:0/1272952014 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:06.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888a54f700 1 -- 192.168.123.106:0/1272952014 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f88800097e0 con 0x7f8884079aa0 2026-03-10T06:18:06.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f888a54f700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f8880004f40 tx=0x7f8880004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.017+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f888001c070 con 0x7f8884079aa0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.018+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f88841a4510 con 0x7f8884079aa0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.018+0000 7f888b551700 1 -- 
192.168.123.106:0/1272952014 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88841a49b0 con 0x7f8884079aa0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.018+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f88800053f0 con 0x7f8884079aa0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.018+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f888000f550 con 0x7f8884079aa0 2026-03-10T06:18:06.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.018+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8868005320 con 0x7f8884079aa0 2026-03-10T06:18:06.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.019+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 16) v1 ==== 45000+0+0 (secure 0 0 0) 0x7f8880005560 con 0x7f8884079aa0 2026-03-10T06:18:06.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.019+0000 7f887b7fe700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 0x7f887003a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:06.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.020+0000 7f8889d4e700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 0x7f887003a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:06.023 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.021+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f888000ec10 con 0x7f8884079aa0 2026-03-10T06:18:06.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.021+0000 7f8889d4e700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 0x7f887003a860 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f887c00ad30 tx=0x7f887c0093f0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:06.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.024+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f888000ee20 con 0x7f8884079aa0 2026-03-10T06:18:06.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.186+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8868005190 con 0x7f8884079aa0 2026-03-10T06:18:06.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.187+0000 7f887b7fe700 1 -- 192.168.123.106:0/1272952014 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8880026020 con 0x7f8884079aa0 2026-03-10T06:18:06.189 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:06.189 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.189+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 msgr2=0x7f887003a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.189+0000 7f888b551700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 0x7f887003a860 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f887c00ad30 tx=0x7f887c0093f0 comp rx=0 tx=0).stop 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.189+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 msgr2=0x7f88841a3dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.189+0000 7f888b551700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f8880004f40 tx=0x7f8880004740 comp rx=0 tx=0).stop 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 shutdown_connections 2026-03-10T06:18:06.191 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f88700383b0 0x7f887003a860 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 --2- 192.168.123.106:0/1272952014 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8884079aa0 0x7f88841a3dd0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 >> 192.168.123.106:0/1272952014 conn(0x7f88840756e0 msgr2=0x7f88840783a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 shutdown_connections 2026-03-10T06:18:06.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:06.190+0000 7f888b551700 1 -- 192.168.123.106:0/1272952014 wait complete. 
2026-03-10T06:18:06.195 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: [10/Mar/2026:06:18:04] ENGINE Bus STARTING 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: [10/Mar/2026:06:18:04] ENGINE Serving on https://192.168.123.104:7150 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: [10/Mar/2026:06:18:04] ENGINE Serving on http://192.168.123.104:8765 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: [10/Mar/2026:06:18:04] ENGINE Bus STARTED 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:06 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:07.262 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:18:07.262 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:18:07.416 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1272952014' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: mgrmap e17: vm04.exdvdb(active, since 2s)
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:18:07.715 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.714+0000 7f959f5ea700 1 -- 192.168.123.106:0/2038122568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95980715f0 msgr2=0x7f95980719e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:07.715 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.714+0000 7f959f5ea700 1 --2- 192.168.123.106:0/2038122568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95980715f0 0x7f95980719e0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f9594009b00 tx=0x7f9594009e10 comp rx=0 tx=0).stop
2026-03-10T06:18:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.714+0000 7f959f5ea700 1 -- 192.168.123.106:0/2038122568 shutdown_connections
2026-03-10T06:18:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.714+0000 7f959f5ea700 1 --2- 192.168.123.106:0/2038122568 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95980715f0 0x7f95980719e0 secure :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f9594009b00 tx=0x7f9594009e10 comp rx=0 tx=0).stop
2026-03-10T06:18:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.714+0000 7f959f5ea700 1 -- 192.168.123.106:0/2038122568 >> 192.168.123.106:0/2038122568 conn(0x7f959806cf00 msgr2=0x7f959806f350 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:07.717 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.715+0000 7f959f5ea700 1 -- 192.168.123.106:0/2038122568 shutdown_connections
2026-03-10T06:18:07.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.716+0000 7f959f5ea700 1 -- 192.168.123.106:0/2038122568 wait complete.
2026-03-10T06:18:07.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.716+0000 7f959f5ea700 1 Processor -- start
2026-03-10T06:18:07.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.716+0000 7f959f5ea700 1 -- start start
2026-03-10T06:18:07.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.716+0000 7f959f5ea700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:07.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.716+0000 7f959f5ea700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9594012070 con 0x7f95981aca70
2026-03-10T06:18:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.717+0000 7f959d386700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959d386700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:46780/0 (socket says 192.168.123.106:46780)
2026-03-10T06:18:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959d386700 1 -- 192.168.123.106:0/285565054 learned_addr learned my addr 192.168.123.106:0/285565054 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:18:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959d386700 1 -- 192.168.123.106:0/285565054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95940097e0 con 0x7f95981aca70
2026-03-10T06:18:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959d386700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f9594005950 tx=0x7f9594005650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f959401d070 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95981ad3c0 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.718+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95981afef0 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.719+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f959400b810 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.719+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f959400f460 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.719+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9598110980 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.720+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f959400f5c0 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.720+0000 7f958e7fc700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 0x7f958403aaf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.720+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f959404d3c0 con 0x7f95981aca70
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.720+0000 7f959cb85700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 0x7f958403aaf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:07.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.721+0000 7f959cb85700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 0x7f958403aaf0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9588006fd0 tx=0x7f9588006e40 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:07.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.723+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9594029950 con 0x7f95981aca70
2026-03-10T06:18:07.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.892+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9598062380 con 0x7f95981aca70
2026-03-10T06:18:07.894 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:18:07.894 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T06:18:07.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.893+0000 7f958e7fc700 1 -- 192.168.123.106:0/285565054 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9594026030 con 0x7f95981aca70
2026-03-10T06:18:07.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 msgr2=0x7f958403aaf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:07.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 0x7f958403aaf0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9588006fd0 tx=0x7f9588006e40 comp rx=0 tx=0).stop
2026-03-10T06:18:07.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 msgr2=0x7f95981ace80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:07.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f9594005950 tx=0x7f9594005650 comp rx=0 tx=0).stop
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 shutdown_connections
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9584038640 0x7f958403aaf0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 --2- 192.168.123.106:0/285565054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f95981aca70 0x7f95981ace80 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 >> 192.168.123.106:0/285565054 conn(0x7f959806cf00 msgr2=0x7f959806dbd0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.896+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 shutdown_connections
2026-03-10T06:18:07.898 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:07.897+0000 7f959f5ea700 1 -- 192.168.123.106:0/285565054 wait complete.
2026-03-10T06:18:07.899 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.conf
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T06:18:08.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/285565054' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T06:18:08.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:08.954 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T06:18:08.954 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:18:09.265 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:09 vm04 ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:18:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:09 vm04 ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:18:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:09 vm04 ceph-mon[51058]: Deploying daemon ceph-exporter.vm06 on vm06
2026-03-10T06:18:09.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.769+0000 7f8795833700 1 -- 192.168.123.106:0/2541421917 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 msgr2=0x7f87900734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:09.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.769+0000 7f8795833700 1 --2- 192.168.123.106:0/2541421917 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f87900734c0 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f878800b3a0 tx=0x7f878800b6b0 comp rx=0 tx=0).stop
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.770+0000 7f8795833700 1 -- 192.168.123.106:0/2541421917 shutdown_connections
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.770+0000 7f8795833700 1 --2- 192.168.123.106:0/2541421917 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f87900734c0 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.770+0000 7f8795833700 1 -- 192.168.123.106:0/2541421917 >> 192.168.123.106:0/2541421917 conn(0x7f87900fb9d0 msgr2=0x7f87900fdde0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 -- 192.168.123.106:0/2541421917 shutdown_connections
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 -- 192.168.123.106:0/2541421917 wait complete.
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 Processor -- start
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 -- start start
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8795833700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8790070430 con 0x7f87900730f0
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.772+0000 7f8794831700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8794831700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:46802/0 (socket says 192.168.123.106:46802)
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8794831700 1 -- 192.168.123.106:0/3019110123 learned_addr learned my addr 192.168.123.106:0/3019110123 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8794831700 1 -- 192.168.123.106:0/3019110123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f878800b050 con 0x7f87900730f0
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8794831700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f878800bb30 tx=0x7f8788009420 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:09.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f878800e040 con 0x7f87900730f0
2026-03-10T06:18:09.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8790070630 con 0x7f87900730f0
2026-03-10T06:18:09.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.773+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8790070a50 con 0x7f87900730f0
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.774+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8788012c30 con 0x7f87900730f0
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.774+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f878801dda0 con 0x7f87900730f0
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.774+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87901033a0 con 0x7f87900730f0
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.775+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f8788027460 con 0x7f87900730f0
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.775+0000 7f878dffb700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 0x7f877803aab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:09.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.775+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f878804c920 con 0x7f87900730f0
2026-03-10T06:18:09.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.776+0000 7f878ffff700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 0x7f877803aab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:09.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.776+0000 7f878ffff700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 0x7f877803aab0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f8780006fd0 tx=0x7f8780006e40 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:09.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.779+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f878801b930 con 0x7f87900730f0
2026-03-10T06:18:09.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.929+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8790062380 con 0x7f87900730f0
2026-03-10T06:18:09.931 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.930+0000 7f878dffb700 1 -- 192.168.123.106:0/3019110123 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8788017030 con 0x7f87900730f0
2026-03-10T06:18:09.932 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:18:09.932 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 msgr2=0x7f877803aab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 0x7f877803aab0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f8780006fd0 tx=0x7f8780006e40 comp rx=0 tx=0).stop
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 msgr2=0x7f879006fef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f878800bb30 tx=0x7f8788009420 comp rx=0 tx=0).stop
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 shutdown_connections
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f8778038600 0x7f877803aab0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 --2- 192.168.123.106:0/3019110123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87900730f0 0x7f879006fef0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:09.934 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 >> 192.168.123.106:0/3019110123 conn(0x7f87900fb9d0 msgr2=0x7f87900fc630 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:09.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 shutdown_connections
2026-03-10T06:18:09.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:09.933+0000 7f8795833700 1 -- 192.168.123.106:0/3019110123 wait complete.
2026-03-10T06:18:09.935 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-10T06:18:10.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:10.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:10 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3019110123' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T06:18:10.984 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T06:18:10.984 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json
2026-03-10T06:18:11.170 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.488+0000 7f4ca1eca700 1 -- 192.168.123.106:0/1446540771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 msgr2=0x7f4c9c102870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.488+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/1446540771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c102870 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f4c84009b00 tx=0x7f4c84009e10 comp rx=0 tx=0).stop
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.489+0000 7f4ca1eca700 1 -- 192.168.123.106:0/1446540771 shutdown_connections
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.489+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/1446540771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c102870 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.489+0000 7f4ca1eca700 1 -- 192.168.123.106:0/1446540771 >> 192.168.123.106:0/1446540771 conn(0x7f4c9c0fdd60 msgr2=0x7f4c9c100170 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.489+0000 7f4ca1eca700 1 -- 192.168.123.106:0/1446540771 shutdown_connections
2026-03-10T06:18:11.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.490+0000 7f4ca1eca700 1 -- 192.168.123.106:0/1446540771 wait complete.
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.490+0000 7f4ca1eca700 1 Processor -- start
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.490+0000 7f4ca1eca700 1 -- start start
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4ca1eca700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4ca1eca700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c9c197c20 con 0x7f4c9c1024a0
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4c9b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4c9b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:46832/0 (socket says 192.168.123.106:46832)
2026-03-10T06:18:11.492 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4c9b7fe700 1 -- 192.168.123.106:0/3926665551 learned_addr learned my addr 192.168.123.106:0/3926665551 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:18:11.493 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.491+0000 7f4c9b7fe700 1 -- 192.168.123.106:0/3926665551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c840097e0 con 0x7f4c9c1024a0
2026-03-10T06:18:11.493 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.492+0000 7f4c9b7fe700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f4c84006010 tx=0x7f4c84004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:11.493 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.492+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4c8401c070 con 0x7f4c9c1024a0
2026-03-10T06:18:11.493 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.492+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c9c197e20 con 0x7f4c9c1024a0
2026-03-10T06:18:11.493 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.492+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c9c198240 con 0x7f4c9c1024a0
2026-03-10T06:18:11.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c84021470 con 0x7f4c9c1024a0
2026-03-10T06:18:11.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4c8400f460 con 
0x7f4c9c1024a0 2026-03-10T06:18:11.494 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f4c8400f6d0 con 0x7f4c9c1024a0 2026-03-10T06:18:11.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c98ff9700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 0x7f4c8803aa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:11.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c9affd700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 0x7f4c8803aa60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:11.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.493+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f4c840203d0 con 0x7f4c9c1024a0 2026-03-10T06:18:11.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.494+0000 7f4c9affd700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 0x7f4c8803aa60 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f4c8c006fd0 tx=0x7f4c8c006e40 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:11.495 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.494+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c7c005320 con 0x7f4c9c1024a0 2026-03-10T06:18:11.498 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.497+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4c84017780 con 0x7f4c9c1024a0 2026-03-10T06:18:11.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.649+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4c7c005190 con 0x7f4c9c1024a0 2026-03-10T06:18:11.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.650+0000 7f4c98ff9700 1 -- 192.168.123.106:0/3926665551 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4c84017780 con 0x7f4c9c1024a0 2026-03-10T06:18:11.651 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:11.651 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.652+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 msgr2=0x7f4c8803aa60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.652+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 0x7f4c8803aa60 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f4c8c006fd0 tx=0x7f4c8c006e40 comp rx=0 tx=0).stop 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.652+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 msgr2=0x7f4c9c1976e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f4c84006010 tx=0x7f4c84004dc0 comp rx=0 tx=0).stop 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 shutdown_connections 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4c880385b0 0x7f4c8803aa60 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 --2- 192.168.123.106:0/3926665551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c1024a0 0x7f4c9c1976e0 unknown :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:11.654 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 >> 192.168.123.106:0/3926665551 conn(0x7f4c9c0fdd60 msgr2=0x7f4c9c106680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 shutdown_connections 2026-03-10T06:18:11.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:11.653+0000 7f4ca1eca700 1 -- 192.168.123.106:0/3926665551 wait complete. 2026-03-10T06:18:11.655 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:11 vm04 ceph-mon[51058]: Deploying daemon crash.vm06 on vm06 2026-03-10T06:18:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:11 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:11 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:11 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:11 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:12.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:12 vm04 ceph-mon[51058]: Deploying daemon node-exporter.vm06 on vm06 2026-03-10T06:18:12.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:12 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3926665551' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:12.701 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:18:12.701 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:12.893 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 -- 192.168.123.106:0/1646683437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 msgr2=0x7f24dc10edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 --2- 192.168.123.106:0/1646683437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc10edb0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7f24cc009b00 tx=0x7f24cc009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 -- 192.168.123.106:0/1646683437 shutdown_connections 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 --2- 192.168.123.106:0/1646683437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc10edb0 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 -- 192.168.123.106:0/1646683437 >> 192.168.123.106:0/1646683437 conn(0x7f24dc06c5e0 msgr2=0x7f24dc06c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 -- 192.168.123.106:0/1646683437 shutdown_connections 2026-03-10T06:18:13.207 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.205+0000 7f24e1fbe700 1 -- 192.168.123.106:0/1646683437 wait complete. 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.206+0000 7f24e1fbe700 1 Processor -- start 2026-03-10T06:18:13.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.206+0000 7f24e1fbe700 1 -- start start 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.206+0000 7f24e1fbe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.206+0000 7f24e1fbe700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24dc1a47e0 con 0x7f24dc06d9f0 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.207+0000 7f24db7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.207+0000 7f24db7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:46854/0 (socket says 192.168.123.106:46854) 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.207+0000 7f24db7fe700 1 -- 192.168.123.106:0/455275937 learned_addr learned my addr 192.168.123.106:0/455275937 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:13.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.207+0000 
7f24db7fe700 1 -- 192.168.123.106:0/455275937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24cc0097e0 con 0x7f24dc06d9f0 2026-03-10T06:18:13.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.208+0000 7f24db7fe700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f24cc006010 tx=0x7f24cc004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:13.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.208+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24cc01c070 con 0x7f24dc06d9f0 2026-03-10T06:18:13.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.208+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f24cc021470 con 0x7f24dc06d9f0 2026-03-10T06:18:13.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.209+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24cc00f460 con 0x7f24dc06d9f0 2026-03-10T06:18:13.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.210+0000 7f24e1fbe700 1 -- 192.168.123.106:0/455275937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24dc1a49e0 con 0x7f24dc06d9f0 2026-03-10T06:18:13.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.210+0000 7f24e1fbe700 1 -- 192.168.123.106:0/455275937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24dc1a4e80 con 0x7f24dc06d9f0 2026-03-10T06:18:13.211 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.210+0000 7f24e1fbe700 1 -- 192.168.123.106:0/455275937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f24dc04f9e0 con 0x7f24dc06d9f0 2026-03-10T06:18:13.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.210+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f24cc0215e0 con 0x7f24dc06d9f0 2026-03-10T06:18:13.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.211+0000 7f24d8ff9700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 0x7f24c403a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:13.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.211+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f24cc04c620 con 0x7f24dc06d9f0 2026-03-10T06:18:13.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.211+0000 7f24daffd700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 0x7f24c403a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:13.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.211+0000 7f24daffd700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 0x7f24c403a730 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f24d0006fd0 tx=0x7f24d0006e40 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:13.216 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.214+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f24cc026030 con 0x7f24dc06d9f0 2026-03-10T06:18:13.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.378+0000 7f24e1fbe700 1 -- 192.168.123.106:0/455275937 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f24dc10e120 con 0x7f24dc06d9f0 2026-03-10T06:18:13.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.379+0000 7f24d8ff9700 1 -- 192.168.123.106:0/455275937 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f24cc026020 con 0x7f24dc06d9f0 2026-03-10T06:18:13.380 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:13.381 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:13.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 msgr2=0x7f24c403a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:13.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 0x7f24c403a730 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f24d0006fd0 tx=0x7f24d0006e40 comp rx=0 tx=0).stop 2026-03-10T06:18:13.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 msgr2=0x7f24dc1a42a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:13.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f24cc006010 tx=0x7f24cc004dc0 comp rx=0 tx=0).stop 2026-03-10T06:18:13.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 shutdown_connections 2026-03-10T06:18:13.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f24c4038280 0x7f24c403a730 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:13.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 --2- 192.168.123.106:0/455275937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f24dc06d9f0 0x7f24dc1a42a0 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:13.384 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.382+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 >> 192.168.123.106:0/455275937 conn(0x7f24dc06c5e0 msgr2=0x7f24dc10b880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:13.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.383+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 shutdown_connections 2026-03-10T06:18:13.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:13.383+0000 7f24c27fc700 1 -- 192.168.123.106:0/455275937 wait complete. 2026-03-10T06:18:13.385 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:13.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:13 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/455275937' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:14.442 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T06:18:14.442 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:14.665 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:18:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: Deploying daemon mgr.vm06.wwotdr on vm06 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:18:14.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:14 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 -- 192.168.123.106:0/657735839 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c06d750 msgr2=0x7fe00c107d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 --2- 192.168.123.106:0/657735839 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c06d750 0x7fe00c107d50 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7fe008007780 tx=0x7fe00800c050 comp rx=0 tx=0).stop 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 -- 192.168.123.106:0/657735839 shutdown_connections 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 --2- 192.168.123.106:0/657735839 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c06d750 0x7fe00c107d50 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 -- 192.168.123.106:0/657735839 >> 192.168.123.106:0/657735839 conn(0x7fe00c06c430 msgr2=0x7fe00c06c830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 -- 
192.168.123.106:0/657735839 shutdown_connections 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.118+0000 7fe014318700 1 -- 192.168.123.106:0/657735839 wait complete. 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.119+0000 7fe014318700 1 Processor -- start 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.119+0000 7fe014318700 1 -- start start 2026-03-10T06:18:15.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.119+0000 7fe014318700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:15.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.119+0000 7fe014318700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe008003680 con 0x7fe00c07cb40 2026-03-10T06:18:15.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.120+0000 7fe0120b4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:15.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.120+0000 7fe0120b4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:46878/0 (socket says 192.168.123.106:46878) 2026-03-10T06:18:15.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.120+0000 7fe0120b4700 1 -- 192.168.123.106:0/456846376 learned_addr learned my addr 192.168.123.106:0/456846376 (peer_addr_for_me v2:192.168.123.106:0/0) 
2026-03-10T06:18:15.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.122+0000 7fe0120b4700 1 -- 192.168.123.106:0/456846376 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe008007430 con 0x7fe00c07cb40 2026-03-10T06:18:15.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.122+0000 7fe0120b4700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fe008005df0 tx=0x7fe00800c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:15.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.122+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe00800f050 con 0x7fe00c07cb40 2026-03-10T06:18:15.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.122+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe00c083be0 con 0x7fe00c07cb40 2026-03-10T06:18:15.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.122+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe00c07d680 con 0x7fe00c07cb40 2026-03-10T06:18:15.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.123+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe00800cbd0 con 0x7fe00c07cb40 2026-03-10T06:18:15.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.123+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7fe0080084a0 con 0x7fe00c07cb40 2026-03-10T06:18:15.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe00c04efc0 con 0x7fe00c07cb40 2026-03-10T06:18:15.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fe00801a040 con 0x7fe00c07cb40 2026-03-10T06:18:15.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe0037fe700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 0x7fdff803ab40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:15.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fe00804b540 con 0x7fe00c07cb40 2026-03-10T06:18:15.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe0118b3700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 0x7fdff803ab40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:15.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.125+0000 7fe0118b3700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 0x7fdff803ab40 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fe00400ad30 tx=0x7fe0040093f0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:15.129 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.128+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe0080243b0 con 0x7fe00c07cb40 2026-03-10T06:18:15.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.351+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fe00c062380 con 0x7fe00c07cb40 2026-03-10T06:18:15.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.352+0000 7fe0037fe700 1 -- 192.168.123.106:0/456846376 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fe008020300 con 0x7fe00c07cb40 2026-03-10T06:18:15.354 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:15.354 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:16:42.736031Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T06:18:15.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.355+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 msgr2=0x7fdff803ab40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.355+0000 7fe014318700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 0x7fdff803ab40 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fe00400ad30 tx=0x7fe0040093f0 comp rx=0 tx=0).stop 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.355+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 msgr2=0x7fe00c07cf10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.355+0000 7fe014318700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fe008005df0 tx=0x7fe00800c7b0 comp rx=0 tx=0).stop 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.355+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 shutdown_connections 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.356+0000 7fe014318700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdff8038690 0x7fdff803ab40 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.356+0000 7fe014318700 1 --2- 192.168.123.106:0/456846376 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe00c07cb40 0x7fe00c07cf10 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:15.357 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.356+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 >> 192.168.123.106:0/456846376 conn(0x7fe00c06c430 msgr2=0x7fe00c06fab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.356+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 shutdown_connections 2026-03-10T06:18:15.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:15.356+0000 7fe014318700 1 -- 192.168.123.106:0/456846376 wait complete. 2026-03-10T06:18:15.358 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-10T06:18:16.090 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 systemd[1]: Started Ceph mon.vm06 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:18:16.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T06:18:16.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable), process ceph-mon, pid 2 2026-03-10T06:18:16.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pidfile_write: ignore empty --pid-file 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: load: jerasure load: lrc 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: RocksDB version: 7.9.2 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Git sha 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Compile date 2023-08-03 19:21:13 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: DB SUMMARY 2026-03-10T06:18:16.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: DB Session ID: ZK2CLL5KRALYS0ITYOYG 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: CURRENT file: CURRENT 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm06/store.db dir, Total Num: 0, files: 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm06/store.db: 000004.log size: 511 ; 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.error_if_exists: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.create_if_missing: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.paranoid_checks: 1 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.env: 0x559cdef39720 2026-03-10T06:18:16.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.info_log: 0x559ce0a2f340 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.statistics: (nil) 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.use_fsync: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_log_file_size: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_fallocate: 1 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.use_direct_reads: 0 
2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.db_log_dir: 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.wal_dir: 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T06:18:16.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.write_buffer_manager: 0x559cdfca03c0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.unordered_write: 0 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: 
Options.row_cache: None 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.wal_filter: None 2026-03-10T06:18:16.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.two_write_queues: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.wal_compression: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.atomic_flush: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.log_readahead_size: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 
ceph-mon[58974]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_background_jobs: 2 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_background_compactions: -1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_subcompactions: 1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: 
rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_open_files: -1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_background_flushes: -1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Compression algorithms supported: 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kZSTD supported: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kXpressCompression supported: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T06:18:16.370 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kZlibCompression supported: 1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kSnappyCompression supported: 1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kLZ4Compression supported: 1 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: kBZip2Compression supported: 0 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000005 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T06:18:16.370 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.merge_operator: 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_filter: None 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559ce0a2f440) 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_top_level_index_and_filter: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_type: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_index_type: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_shortening: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: checksum: 4 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: no_block_cache: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache: 0x559cdfd41350 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_name: BinnedLRUCache 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_options: 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: capacity : 536870912 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_shard_bits : 4 
2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: strict_capacity_limit : 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: high_pri_pool_ratio: 0.000 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_compressed: (nil) 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: persistent_cache: (nil) 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size: 4096 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size_deviation: 10 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_restart_interval: 16 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_block_restart_interval: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_block_size: 4096 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: partition_filters: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: use_delta_encoding: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: filter_policy: bloomfilter 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: whole_key_filtering: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: verify_compression: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: read_amp_bytes_per_bit: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: format_version: 5 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_index_compression: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_align: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_auto_readahead_size: 262144 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: prepopulate_block_cache: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout: initial_auto_readahead_size: 8192 2026-03-10T06:18:16.371 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression: NoCompression 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.num_levels: 7 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T06:18:16.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T06:18:16.371 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:18:16.372 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 
2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: 
Options.table_properties_collectors: 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.inplace_update_support: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.bloom_locality: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.max_successive_merges: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T06:18:16.372 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.ttl: 2592000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T06:18:16.373 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enable_blob_files: false 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.min_blob_size: 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/version_set.cc:5566] Recovered from 
manifest file:/var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d7f668b6-f670-46ee-a723-f9534d755530 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123496092375, "job": 1, "event": "recovery_started", "wal_files": [4]} 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123496096725, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773123496, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d7f668b6-f670-46ee-a723-f9534d755530", "db_session_id": "ZK2CLL5KRALYS0ITYOYG", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123496096791, "job": 1, "event": "recovery_finished"} 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm06/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559cdfdde000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: DB pointer 0x559cdfdca000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06 does not exist in monmap, will attempt to join an existing cluster 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: using public_addr v2:192.168.123.106:0/0 -> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** DB Stats ** 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 
2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L0 1/0 1.58 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Sum 1/0 1.58 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Total Files): 
cumulative 0, interval 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval compaction: 0.00 GB write, 0.16 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T06:18:16.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache BinnedLRUCache@0x559cdfd41350#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 7e-06 secs_since: 0 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%) 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: starting mon.vm06 rank -1 at public addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] at bind addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon_data /var/lib/ceph/mon/ceph-vm06 fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(???) e0 preinit fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).mds e1 new map 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).mds e1 print_map 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: e1 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: legacy client fscid: -1 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout: No filesystems configured 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e4 e4: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e5 e5: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e5 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/845924963' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/125327768' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Deploying daemon prometheus.vm04 on vm04 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/1370505785' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pgmap v16: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/2509586082' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: pgmap v17: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/2782604412' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14164 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mgrmap e14: vm04.exdvdb(active, since 46s) 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/437344505' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/506116644' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Activating manager daemon vm04.exdvdb 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mgrmap e15: vm04.exdvdb(active, starting, since 0.00468297s) 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:18:16.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": 
"vm04.exdvdb"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/880248481' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mgrmap e16: vm04.exdvdb(active, since 1.01037s) 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: [10/Mar/2026:06:18:04] ENGINE Bus STARTING 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: [10/Mar/2026:06:18:04] ENGINE Serving on https://192.168.123.104:7150 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: [10/Mar/2026:06:18:04] ENGINE Serving on http://192.168.123.104:8765 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: [10/Mar/2026:06:18:04] ENGINE Bus STARTED 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/1272952014' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mgrmap e17: vm04.exdvdb(active, since 2s) 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:16.375 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/285565054' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:18:16.375 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Deploying daemon ceph-exporter.vm06 on vm06 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.375 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/3019110123' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Deploying daemon crash.vm06 on vm06 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Deploying daemon node-exporter.vm06 on vm06 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/3926665551' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/455275937' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: Deploying daemon mgr.vm06.wwotdr on vm06 
2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:16.376 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:16 vm06 ceph-mon[58974]: mon.vm06@-1(synchronizing).paxosservice(auth 1..8) refresh upgraded, format 0 -> 3 2026-03-10T06:18:16.435 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T06:18:16.435 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mon dump -f json 2026-03-10T06:18:16.602 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.296+0000 7f69372f3700 1 -- 192.168.123.106:0/4200523367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f693006eaa0 msgr2=0x7f691c005610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.296+0000 7f69372f3700 1 --2- 192.168.123.106:0/4200523367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f693006eaa0 0x7f691c005610 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f692c005fa0 tx=0x7f692c00ff70 comp rx=0 tx=0).stop 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.296+0000 7f69372f3700 1 -- 192.168.123.106:0/4200523367 shutdown_connections 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.296+0000 7f69372f3700 1 --2- 192.168.123.106:0/4200523367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f693006eaa0 0x7f691c005610 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.296+0000 7f69372f3700 1 -- 192.168.123.106:0/4200523367 >> 192.168.123.106:0/4200523367 conn(0x7f6930074fa0 msgr2=0x7f69300753a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.297+0000 7f69372f3700 1 -- 192.168.123.106:0/4200523367 shutdown_connections 2026-03-10T06:18:21.300 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.297+0000 7f69372f3700 1 -- 192.168.123.106:0/4200523367 wait complete. 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 Processor -- start 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 -- start start 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f693011d490 con 0x7f6930117370 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f69372f3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69301b7420 con 0x7f693011d030 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693488e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693488e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.106:38486/0 (socket says 192.168.123.106:38486) 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693488e700 1 -- 192.168.123.106:0/1762146804 learned_addr learned my addr 192.168.123.106:0/1762146804 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693508f700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693508f700 1 -- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 msgr2=0x7f6930116e30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 13 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693508f700 1 -- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 msgr2=0x7f6930116e30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693508f700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.298+0000 7f693508f700 1 --2- 192.168.123.106:0/1762146804 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f693488e700 1 -- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 msgr2=0x7f6930116e30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f693488e700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f693488e700 1 -- 192.168.123.106:0/1762146804 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f692c00f970 con 0x7f6930117370 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f693488e700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f692800bf40 tx=0x7f692800bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f692800c9d0 con 0x7f6930117370 2026-03-10T06:18:21.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f69372f3700 1 -- 192.168.123.106:0/1762146804 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 
0x7f69301b7560 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f69372f3700 1 -- 192.168.123.106:0/1762146804 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f69301b7a80 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f69280092e0 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.299+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6928007600 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.300+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f6928007760 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.301+0000 7f69372f3700 1 -- 192.168.123.106:0/1762146804 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f693011ce40 con 0x7f6930117370 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.301+0000 7f69267fc700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 0x7f691406ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.301+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f692808a0c0 con 0x7f6930117370 
2026-03-10T06:18:21.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.303+0000 7f693508f700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 0x7f691406ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.303+0000 7f693508f700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 0x7f691406ecd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f692c00f470 tx=0x7f692c00f3e0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:21.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.304+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6928056a40 con 0x7f6930117370 2026-03-10T06:18:21.454 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: mon.vm04 calling monitor election 2026-03-10T06:18:21.454 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:18:21.454 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 
ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: mon.vm06 calling monitor election 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: monmap e2: 2 mons at {vm04=[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0]} removed_ranks: {} 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: fsmap 2026-03-10T06:18:21.455 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: mgrmap e17: vm04.exdvdb(active, since 17s) 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: overall HEALTH_OK 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.? 
192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.455 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:21 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.453+0000 7f69372f3700 1 -- 192.168.123.106:0/1762146804 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f693004f9e0 con 0x7f6930117370 2026-03-10T06:18:21.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.454+0000 7f69267fc700 1 -- 192.168.123.106:0/1762146804 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7f692805a060 con 0x7f6930117370 2026-03-10T06:18:21.456 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:21.456 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":2,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","modified":"2026-03-10T06:18:16.127480Z","created":"2026-03-10T06:16:42.736031Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-10T06:18:21.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 msgr2=0x7f691406ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:21.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 0x7f691406ecd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f692c00f470 tx=0x7f692c00f3e0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 msgr2=0x7f69301177e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f692800bf40 tx=0x7f692800bf70 
comp rx=0 tx=0).stop 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 shutdown_connections 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f691406c820 0x7f691406ecd0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f693011d030 0x7f6930116e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 --2- 192.168.123.106:0/1762146804 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6930117370 0x7f69301177e0 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.458+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 >> 192.168.123.106:0/1762146804 conn(0x7f6930074fa0 msgr2=0x7f693006d600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:21.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.462+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 shutdown_connections 2026-03-10T06:18:21.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:21.462+0000 7f691bfff700 1 -- 192.168.123.106:0/1762146804 wait complete. 2026-03-10T06:18:21.463 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 2 2026-03-10T06:18:21.531 INFO:tasks.cephadm:Generating final ceph.conf file... 
2026-03-10T06:18:21.532 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph config generate-minimal-conf 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: mon.vm04 calling monitor election 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: mon.vm06 calling monitor election 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 
ceph-mon[51058]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: monmap e2: 2 mons at {vm04=[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0],vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0]} removed_ranks: {} 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: fsmap 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: mgrmap e17: vm04.exdvdb(active, since 17s) 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.? 
192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: overall HEALTH_OK 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.? 192.168.123.106:0/512624975' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.555 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:21 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:21.683 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.956+0000 7fa4987fc700 1 -- 192.168.123.104:0/3244291511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490106150 msgr2=0x7fa490106520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.956+0000 7fa4987fc700 1 --2- 
192.168.123.104:0/3244291511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490106150 0x7fa490106520 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7fa484009b00 tx=0x7fa484009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.957+0000 7fa4987fc700 1 -- 192.168.123.104:0/3244291511 shutdown_connections 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.957+0000 7fa4987fc700 1 --2- 192.168.123.104:0/3244291511 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490106150 0x7fa490106520 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.957+0000 7fa4987fc700 1 -- 192.168.123.104:0/3244291511 >> 192.168.123.104:0/3244291511 conn(0x7fa4900f9d90 msgr2=0x7fa4900fc1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.957+0000 7fa4987fc700 1 -- 192.168.123.104:0/3244291511 shutdown_connections 2026-03-10T06:18:21.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.958+0000 7fa4987fc700 1 -- 192.168.123.104:0/3244291511 wait complete. 
2026-03-10T06:18:21.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.958+0000 7fa4987fc700 1 Processor -- start 2026-03-10T06:18:21.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.958+0000 7fa4987fc700 1 -- start start 2026-03-10T06:18:21.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.958+0000 7fa4987fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.958+0000 7fa4987fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 0x7fa490197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa4987fc700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa490198530 con 0x7fa490193d70 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa4987fc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4901986a0 con 0x7fa490106150 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:37190/0 (socket says 192.168.123.104:37190) 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 -- 192.168.123.104:0/2570431031 learned_addr learned my addr 192.168.123.104:0/2570431031 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa495d97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 0x7fa490197ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 -- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 msgr2=0x7fa490193830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 -- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 msgr2=0x7fa490193830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa496598700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault 
waiting 0.200000 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa495d97700 1 -- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 msgr2=0x7fa490193830 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa495d97700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.959+0000 7fa495d97700 1 -- 192.168.123.104:0/2570431031 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa4840097e0 con 0x7fa490193d70 2026-03-10T06:18:21.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa495d97700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 0x7fa490197ff0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fa48c00b700 tx=0x7fa48c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa48c010820 con 0x7fa490193d70 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa48c010e60 con 0x7fa490193d70 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== 
mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa48c017570 con 0x7fa490193d70 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa490198980 con 0x7fa490193d70 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.960+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa490198d90 con 0x7fa490193d70 2026-03-10T06:18:21.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.961+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa490193640 con 0x7fa490193d70 2026-03-10T06:18:21.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.965+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fa48c00f3c0 con 0x7fa490193d70 2026-03-10T06:18:21.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.965+0000 7fa4837fe700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 0x7fa47c06ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:21.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.965+0000 7fa496598700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 0x7fa47c06ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:21.967 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.965+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fa48c08a7a0 con 0x7fa490193d70 2026-03-10T06:18:21.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.966+0000 7fa496598700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 0x7fa47c06ecd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa484006010 tx=0x7fa48400b540 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:21.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:21.966+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa48c056070 con 0x7fa490193d70 2026-03-10T06:18:22.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.093+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fa49004f9e0 con 0x7fa490193d70 2026-03-10T06:18:22.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.095+0000 7fa4837fe700 1 -- 192.168.123.104:0/2570431031 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7fa48c014100 con 0x7fa490193d70 2026-03-10T06:18:22.097 INFO:teuthology.orchestra.run.vm04.stdout:# minimal ceph.conf for 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:18:22.097 INFO:teuthology.orchestra.run.vm04.stdout:[global] 2026-03-10T06:18:22.097 INFO:teuthology.orchestra.run.vm04.stdout: fsid = 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:18:22.097 INFO:teuthology.orchestra.run.vm04.stdout: mon_host = 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 msgr2=0x7fa47c06ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 0x7fa47c06ecd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa484006010 tx=0x7fa48400b540 comp rx=0 tx=0).stop 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 msgr2=0x7fa490197ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 0x7fa490197ff0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fa48c00b700 tx=0x7fa48c00bac0 comp rx=0 tx=0).stop 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 shutdown_connections 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa47c06c820 0x7fa47c06ecd0 secure :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa484006010 tx=0x7fa48400b540 comp rx=0 tx=0).stop 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 
7fa4987fc700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa490106150 0x7fa490193830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 --2- 192.168.123.104:0/2570431031 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa490193d70 0x7fa490197ff0 secure :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fa48c00b700 tx=0x7fa48c00bac0 comp rx=0 tx=0).stop 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.098+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 >> 192.168.123.104:0/2570431031 conn(0x7fa4900f9d90 msgr2=0x7fa4901045b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.099+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 shutdown_connections 2026-03-10T06:18:22.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:22.099+0000 7fa4987fc700 1 -- 192.168.123.104:0/2570431031 wait complete. 2026-03-10T06:18:22.172 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 2026-03-10T06:18:22.172 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:18:22.172 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.conf 2026-03-10T06:18:22.249 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:18:22.249 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:18:22.321 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:18:22.321 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.conf 2026-03-10T06:18:22.354 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:18:22.354 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:18:22.425 INFO:tasks.cephadm:Deploying OSDs... 
2026-03-10T06:18:22.425 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:18:22.425 DEBUG:teuthology.orchestra.run.vm04:> dd if=/scratch_devs of=/dev/stdout 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: mgrmap e18: vm04.exdvdb(active, since 17s), standbys: vm06.wwotdr 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1762146804' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2570431031' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:22.446 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T06:18:22.446 DEBUG:teuthology.orchestra.run.vm04:> ls /dev/[sv]d? 2026-03-10T06:18:22.506 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vda 2026-03-10T06:18:22.506 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdb 2026-03-10T06:18:22.506 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdc 2026-03-10T06:18:22.506 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdd 2026-03-10T06:18:22.506 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vde 2026-03-10T06:18:22.506 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-10T06:18:22.506 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-10T06:18:22.507 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdb 2026-03-10T06:18:22.570 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdb 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 221 Links: 1 Device type: fc,10 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 06:17:17.399683924 +0000 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 06:17:17.288683743 +0000 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout:Change: 
2026-03-10 06:17:17.288683743 +0000 2026-03-10T06:18:22.571 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 06:12:22.231000000 +0000 2026-03-10T06:18:22.571 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: mgrmap e18: vm04.exdvdb(active, since 17s), standbys: vm06.wwotdr 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/1762146804' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2570431031' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:22.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:18:22.638 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-10T06:18:22.638 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-10T06:18:22.638 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000248105 s, 2.1 MB/s 2026-03-10T06:18:22.639 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-10T06:18:22.699 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdc 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdc 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,20 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 06:17:17.457684019 +0000 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 06:17:17.296683756 +0000 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 06:17:17.296683756 +0000 2026-03-10T06:18:22.760 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 06:12:22.252000000 +0000 2026-03-10T06:18:22.760 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-10T06:18:22.837 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-10T06:18:22.837 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-10T06:18:22.837 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.00016552 s, 3.1 MB/s 2026-03-10T06:18:22.838 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-10T06:18:22.915 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdd 2026-03-10T06:18:22.973 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdd 2026-03-10T06:18:22.973 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T06:18:22.973 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-10T06:18:22.973 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T06:18:22.973 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T06:18:22.974 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 06:17:17.512684109 +0000 2026-03-10T06:18:22.974 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 06:17:17.290683747 +0000 2026-03-10T06:18:22.974 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 06:17:17.290683747 +0000 2026-03-10T06:18:22.974 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 06:12:22.260000000 +0000 2026-03-10T06:18:22.974 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-10T06:18:23.043 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-10T06:18:23.043 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-10T06:18:23.043 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000226513 s, 2.3 MB/s 2026-03-10T06:18:23.044 DEBUG:teuthology.orchestra.run.vm04:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-10T06:18:23.111 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vde 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vde 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 06:17:17.581684221 +0000 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 06:17:17.300683763 +0000 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 06:17:17.300683763 +0000 2026-03-10T06:18:23.170 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 06:12:22.268000000 +0000 2026-03-10T06:18:23.170 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-10T06:18:23.259 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in 2026-03-10T06:18:23.259 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out 2026-03-10T06:18:23.259 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000135563 s, 3.8 MB/s 2026-03-10T06:18:23.260 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-10T06:18:23.277 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:18:23.277 DEBUG:teuthology.orchestra.run.vm06:> dd if=/scratch_devs of=/dev/stdout 2026-03-10T06:18:23.295 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T06:18:23.295 DEBUG:teuthology.orchestra.run.vm06:> ls /dev/[sv]d? 
2026-03-10T06:18:23.355 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vda
2026-03-10T06:18:23.355 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdb
2026-03-10T06:18:23.355 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdc
2026-03-10T06:18:23.355 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdd
2026-03-10T06:18:23.355 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vde
2026-03-10T06:18:23.355 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T06:18:23.355 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T06:18:23.355 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdb
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdb
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 223 Links: 1 Device type: fc,10
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-10 06:18:06.400990058 +0000
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-10 06:18:06.262989934 +0000
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-10 06:18:06.262989934 +0000
2026-03-10T06:18:23.415 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-10 06:11:56.245000000 +0000
2026-03-10T06:18:23.416 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.conf
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:18:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:23 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.conf
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:18:23.479 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:23 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:23.480 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-10T06:18:23.480 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-10T06:18:23.480 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 9.9656e-05 s, 5.1 MB/s
2026-03-10T06:18:23.482 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T06:18:23.540 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdc
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdc
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,20
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-10 06:18:06.467990119 +0000
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-10 06:18:06.271989942 +0000
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-10 06:18:06.271989942 +0000
2026-03-10T06:18:23.601 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-10 06:11:56.249000000 +0000
2026-03-10T06:18:23.601 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T06:18:23.670 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-10T06:18:23.670 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-10T06:18:23.670 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000145513 s, 3.5 MB/s
2026-03-10T06:18:23.672 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T06:18:23.728 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdd
2026-03-10T06:18:23.787 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdd
2026-03-10T06:18:23.787 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T06:18:23.787 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 245 Links: 1 Device type: fc,30
2026-03-10T06:18:23.787 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T06:18:23.788 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T06:18:23.788 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-10 06:18:06.538990183 +0000
2026-03-10T06:18:23.788 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-10 06:18:06.267989938 +0000
2026-03-10T06:18:23.788 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-10 06:18:06.267989938 +0000
2026-03-10T06:18:23.788 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-10 06:11:56.253000000 +0000
2026-03-10T06:18:23.788 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T06:18:23.857 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-10T06:18:23.857 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-10T06:18:23.857 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000160601 s, 3.2 MB/s
2026-03-10T06:18:23.859 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T06:18:23.918 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vde
2026-03-10T06:18:23.977 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vde
2026-03-10T06:18:23.977 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-10 06:18:06.610990248 +0000
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-10 06:18:06.272989943 +0000
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-10 06:18:06.272989943 +0000
2026-03-10T06:18:23.978 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-10 06:11:56.259000000 +0000
2026-03-10T06:18:23.978 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T06:18:24.043 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-10T06:18:24.043 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-10T06:18:24.043 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000192431 s, 2.7 MB/s
2026-03-10T06:18:24.045 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T06:18:24.102 INFO:tasks.cephadm:Deploying osd.0 on vm04 with /dev/vde...
2026-03-10T06:18:24.102 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vde
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: Reconfiguring mon.vm04 (unknown last config time)...
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: Reconfiguring daemon mon.vm04 on vm04
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: Reconfiguring mgr.vm04.exdvdb (unknown last config time)...
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: Reconfiguring daemon mgr.vm04.exdvdb on vm04
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T06:18:24.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.278 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: Reconfiguring mon.vm04 (unknown last config time)...
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: Reconfiguring daemon mon.vm04 on vm04
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: Reconfiguring mgr.vm04.exdvdb (unknown last config time)...
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: Reconfiguring daemon mgr.vm04.exdvdb on vm04
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T06:18:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T06:18:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:24.901 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:18:24.914 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm04:/dev/vde
2026-03-10T06:18:25.160 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: Reconfiguring ceph-exporter.vm04 (monmap changed)...
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: Reconfiguring daemon ceph-exporter.vm04 on vm04
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: Reconfiguring crash.vm04 (monmap changed)...
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: Reconfiguring daemon crash.vm04 on vm04
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:25.415 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:25 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.596+0000 7f7f60357700 1 -- 192.168.123.104:0/3520209268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58100570 msgr2=0x7f7f58100980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.596+0000 7f7f60357700 1 --2- 192.168.123.104:0/3520209268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58100570 0x7f7f58100980 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f7f54009b00 tx=0x7f7f54009e10 comp rx=0 tx=0).stop
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.597+0000 7f7f60357700 1 -- 192.168.123.104:0/3520209268 shutdown_connections
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.597+0000 7f7f60357700 1 --2- 192.168.123.104:0/3520209268 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58101770 0x7f7f58101bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.597+0000 7f7f60357700 1 --2- 192.168.123.104:0/3520209268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58100570 0x7f7f58100980 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:25.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.597+0000 7f7f60357700 1 -- 192.168.123.104:0/3520209268 >> 192.168.123.104:0/3520209268 conn(0x7f7f580fbb00 msgr2=0x7f7f580fdf50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:25.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.599+0000 7f7f60357700 1 -- 192.168.123.104:0/3520209268 shutdown_connections
2026-03-10T06:18:25.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.599+0000 7f7f60357700 1 -- 192.168.123.104:0/3520209268 wait complete.
2026-03-10T06:18:25.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.599+0000 7f7f60357700 1 Processor -- start
2026-03-10T06:18:25.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.599+0000 7f7f60357700 1 -- start start
2026-03-10T06:18:25.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f60357700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:25.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f60357700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 0x7f7f58196350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:36448/0 (socket says 192.168.123.104:36448)
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 -- 192.168.123.104:0/2306007039 learned_addr learned my addr 192.168.123.104:0/2306007039 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f58196970 con 0x7f7f58101770
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f58196ab0 con 0x7f7f58100570
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5d8f2700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 0x7f7f58196350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 -- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 msgr2=0x7f7f58195e10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 -- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 msgr2=0x7f7f58195e10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.600+0000 7f7f5e0f3700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000
2026-03-10T06:18:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.601+0000 7f7f5d8f2700 1 -- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 msgr2=0x7f7f58195e10 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T06:18:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.601+0000 7f7f5d8f2700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.601+0000 7f7f5d8f2700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f540097e0 con 0x7f7f58101770
2026-03-10T06:18:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.601+0000 7f7f5d8f2700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 0x7f7f58196350 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f7f4800d8d0 tx=0x7f7f4800dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.602+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f480049e0 con 0x7f7f58101770
2026-03-10T06:18:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.602+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f4800b5c0 con 0x7f7f58101770
2026-03-10T06:18:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.602+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f48009c40 con 0x7f7f58101770
2026-03-10T06:18:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.602+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f5819b560 con 0x7f7f58101770
2026-03-10T06:18:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.602+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f5819ba30 con 0x7f7f58101770
2026-03-10T06:18:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.604+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f7f48004b40 con 0x7f7f58101770
2026-03-10T06:18:25.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.604+0000 7f7f4f7fe700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 0x7f7f4406e8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:18:25.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.604+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f7f4808a0f0 con 0x7f7f58101770
2026-03-10T06:18:25.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.604+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f5804ea50 con 0x7f7f58101770
2026-03-10T06:18:25.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.606+0000 7f7f5e0f3700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 0x7f7f4406e8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:18:25.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.606+0000 7f7f5e0f3700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 0x7f7f4406e8e0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f7f54006010 tx=0x7f7f54005c20 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:18:25.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.607+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7f48017080 con 0x7f7f58101770
2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: Reconfiguring ceph-exporter.vm04 (monmap changed)...
2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: Reconfiguring crash.vm04 (monmap changed)... 2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: Reconfiguring daemon crash.vm04 on vm04 2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:25 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:25.727 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:25.724+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f581060c0 con 0x7f7f4406c430 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: Reconfiguring alertmanager.vm04 (dependencies changed)... 
2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: Reconfiguring daemon alertmanager.vm04 on vm04 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:26.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: Reconfiguring alertmanager.vm04 (dependencies changed)... 
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: Reconfiguring daemon alertmanager.vm04 on vm04
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:26.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: Reconfiguring grafana.vm04 (dependencies changed)...
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: Reconfiguring daemon grafana.vm04 on vm04
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:18:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: Reconfiguring grafana.vm04 (dependencies changed)...
2026-03-10T06:18:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: Reconfiguring daemon grafana.vm04 on vm04
2026-03-10T06:18:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: Reconfiguring prometheus.vm04 (dependencies changed)...
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: Reconfiguring daemon prometheus.vm04 on vm04
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2601857690' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24cf9fc9-b995-47ea-a145-3fd48dc1ed14"}]: dispatch
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2601857690' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "24cf9fc9-b995-47ea-a145-3fd48dc1ed14"}]': finished
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:18:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:28 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3741550685' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: Reconfiguring prometheus.vm04 (dependencies changed)...
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: Reconfiguring daemon prometheus.vm04 on vm04
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2601857690' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24cf9fc9-b995-47ea-a145-3fd48dc1ed14"}]: dispatch
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2601857690' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "24cf9fc9-b995-47ea-a145-3fd48dc1ed14"}]': finished
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:18:28.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:28 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3741550685' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T06:18:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:29 vm04 ceph-mon[51058]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:29 vm06 ceph-mon[58974]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:31.602 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:31 vm04 ceph-mon[51058]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:31 vm06 ceph-mon[58974]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T06:18:33.007 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T06:18:33.008 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: Deploying daemon osd.0 on vm04
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: Reconfiguring daemon crash.vm06 on vm06
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:18:33.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:33 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: Deploying daemon osd.0 on vm04
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: Reconfiguring daemon crash.vm06 on vm06
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:18:33.359 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:33 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: Reconfiguring mgr.vm06.wwotdr (monmap changed)...
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: Reconfiguring daemon mgr.vm06.wwotdr on vm06
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-10T06:18:34.236 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: Reconfiguring daemon mon.vm06 on vm06
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm04.local:9093"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm04.local:3000"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm04.local:9095"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:18:34.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:34 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: Reconfiguring mgr.vm06.wwotdr (monmap changed)...
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: Reconfiguring daemon mgr.vm06.wwotdr on vm06
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: Reconfiguring daemon mon.vm06 on vm06
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm04.local:9093"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm04.local:3000"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm04.local:9095"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:18:34.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:34 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:18:35.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T06:18:35.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm04.local:9093"}]: dispatch
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm04.local:3000"}]: dispatch
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm04.local:9095"}]: dispatch
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:35 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 0 on host 'vm04'
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.704+0000 7f7f4f7fe700 1 -- 192.168.123.104:0/2306007039 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f7f581060c0 con 0x7f7f4406c430
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.706+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 msgr2=0x7f7f4406e8e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.706+0000 7f7f60357700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 0x7f7f4406e8e0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f7f54006010 tx=0x7f7f54005c20 comp rx=0 tx=0).stop
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.706+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 msgr2=0x7f7f58196350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.706+0000 7f7f60357700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 0x7f7f58196350 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f7f4800d8d0 tx=0x7f7f4800dc90 comp rx=0 tx=0).stop
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 shutdown_connections
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7f4406c430 0x7f7f4406e8e0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f58100570 0x7f7f58195e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 --2- 192.168.123.104:0/2306007039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f58101770 0x7f7f58196350 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:18:35.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 >> 192.168.123.104:0/2306007039 conn(0x7f7f580fbb00 msgr2=0x7f7f581049a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:18:35.709 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 shutdown_connections
2026-03-10T06:18:35.709 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:35.707+0000 7f7f60357700 1 -- 192.168.123.104:0/2306007039 wait complete.
2026-03-10T06:18:35.788 DEBUG:teuthology.orchestra.run.vm04:osd.0> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service
2026-03-10T06:18:35.789 INFO:tasks.cephadm:Deploying osd.1 on vm04 with /dev/vdd...
2026-03-10T06:18:35.789 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vdd
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm04.local:9093"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm04.local:3000"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm04.local:9095"}]: dispatch
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:35 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.062 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:18:36.141 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:18:35 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:18:35.897+0000 7fb5700cc640 -1 osd.0 0 log_to_monitors true
2026-03-10T06:18:36.403 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.403 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.403 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:36.403 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.404 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:36 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:36 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:36.702 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:18:36.728 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm04:/dev/vdd 2026-03-10T06:18:36.928 
INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:18:37.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.277+0000 7f34c81ab700 1 -- 192.168.123.104:0/441089950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0072470 msgr2=0x7f34c010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:37.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.277+0000 7f34c81ab700 1 --2- 192.168.123.104:0/441089950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0072470 0x7f34c010beb0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f34b800b3a0 tx=0x7f34b800b6b0 comp rx=0 tx=0).stop 2026-03-10T06:18:37.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.278+0000 7f34c81ab700 1 -- 192.168.123.104:0/441089950 shutdown_connections 2026-03-10T06:18:37.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.278+0000 7f34c81ab700 1 --2- 192.168.123.104:0/441089950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0072470 0x7f34c010beb0 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:37.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.278+0000 7f34c81ab700 1 --2- 192.168.123.104:0/441089950 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0071a90 0x7f34c0071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:37.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.278+0000 7f34c81ab700 1 -- 192.168.123.104:0/441089950 >> 192.168.123.104:0/441089950 conn(0x7f34c006d1a0 msgr2=0x7f34c006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:37.282 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.280+0000 7f34c81ab700 1 -- 192.168.123.104:0/441089950 shutdown_connections 2026-03-10T06:18:37.282 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.280+0000 7f34c81ab700 1 -- 192.168.123.104:0/441089950 wait complete. 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 Processor -- start 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 -- start start 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0072470 0x7f34c0116fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34c01175d0 con 0x7f34c0071a90 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c81ab700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34c01b2820 con 0x7f34c0072470 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c5f47700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c5f47700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:60508/0 (socket says 192.168.123.104:60508) 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c5f47700 1 -- 192.168.123.104:0/2025873183 learned_addr learned my addr 192.168.123.104:0/2025873183 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:18:37.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.281+0000 7f34c5746700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0072470 0x7f34c0116fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c5f47700 1 -- 192.168.123.104:0/2025873183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0072470 msgr2=0x7f34c0116fb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c5f47700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0072470 0x7f34c0116fb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c5f47700 1 -- 192.168.123.104:0/2025873183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34b800b050 con 0x7f34c0071a90 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c5f47700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto 
rx=0x7f34b000b700 tx=0x7f34b000ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34b00107c0 con 0x7f34c0071a90 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f34b0010e00 con 0x7f34c0071a90 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34b000f360 con 0x7f34c0071a90 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c81ab700 1 -- 192.168.123.104:0/2025873183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34c01b2a20 con 0x7f34c0071a90 2026-03-10T06:18:37.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.282+0000 7f34c81ab700 1 -- 192.168.123.104:0/2025873183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f34c01b2f70 con 0x7f34c0071a90 2026-03-10T06:18:37.286 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.284+0000 7f34c81ab700 1 -- 192.168.123.104:0/2025873183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f34c0110c20 con 0x7f34c0071a90 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.285+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 
0x7f34b0017360 con 0x7f34c0071a90 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.285+0000 7f34b6ffd700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 0x7f34ac06ebd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.285+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(7..7 src has 1..7) v4 ==== 1404+0+0 (secure 0 0 0) 0x7f34b0089f60 con 0x7f34c0071a90 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.287+0000 7f34c5746700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 0x7f34ac06ebd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.287+0000 7f34c5746700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 0x7f34ac06ebd0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f34c0117b10 tx=0x7f34b800bf90 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:37.289 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.287+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f34b00591f0 con 0x7f34c0071a90 2026-03-10T06:18:37.434 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:37.431+0000 7f34c81ab700 1 -- 192.168.123.104:0/2025873183 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": 
"vm04:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f34c0061190 con 0x7f34ac06c720 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:37 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 
2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:37 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:37.956 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:18:37 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:18:37.714+0000 7fb566745700 -1 osd.0 0 waiting for initial osdmap 2026-03-10T06:18:37.956 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:18:37 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:18:37.739+0000 7fb561538700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": 
"json"}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: osdmap e8: 1 total, 0 up, 1 in 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:38.678 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:38.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:38 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: osdmap 
e8: 1 total, 0 up, 1 in 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:38 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: purged_snaps scrub starts 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: purged_snaps scrub ok 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:18:39 vm04 ceph-mon[51058]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: Detected new or changed devices on vm04 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: osd.0 [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] boot 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: osdmap e9: 1 total, 1 up, 1 in 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2087590177' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "38852b8c-2ea4-46c8-a734-cf521893e9b5"}]: dispatch 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2087590177' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "38852b8c-2ea4-46c8-a734-cf521893e9b5"}]': finished 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:39.554 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:39 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: purged_snaps scrub starts 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: purged_snaps scrub ok 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: Detected new or changed devices on vm04 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: osd.0 
[v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] boot 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: osdmap e9: 1 total, 1 up, 1 in 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2087590177' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "38852b8c-2ea4-46c8-a734-cf521893e9b5"}]: dispatch 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2087590177' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "38852b8c-2ea4-46c8-a734-cf521893e9b5"}]': finished 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:39 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/733603783' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:40 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/733603783' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:40 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:41.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:41 vm04 ceph-mon[51058]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:41 vm06 ceph-mon[58974]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:43.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:43 vm04 ceph-mon[51058]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:43.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:43 
vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:18:43.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:43 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:43 vm06 ceph-mon[58974]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:18:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:44.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:44 vm04 ceph-mon[51058]: Deploying daemon osd.1 on vm04 2026-03-10T06:18:44.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:44 vm06 ceph-mon[58974]: Deploying daemon osd.1 on vm04 2026-03-10T06:18:45.553 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:45 vm04 ceph-mon[51058]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:45.553 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:45 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:45.553 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:45 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:45.553 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:45 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 
2026-03-10T06:18:45.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:45 vm06 ceph-mon[58974]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:45.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:45 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:45.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:45 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:45.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:45 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 1 on host 'vm04' 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.126+0000 7f34b6ffd700 1 -- 192.168.123.104:0/2025873183 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f34c0061190 con 0x7f34ac06c720 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 msgr2=0x7f34ac06ebd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 0x7f34ac06ebd0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f34c0117b10 tx=0x7f34b800bf90 comp rx=0 tx=0).stop 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 msgr2=0x7f34c0116a70 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f34b000b700 tx=0x7f34b000ba10 comp rx=0 tx=0).stop 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 shutdown_connections 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f34ac06c720 0x7f34ac06ebd0 secure :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f34c0117b10 tx=0x7f34b800bf90 comp rx=0 tx=0).stop 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34c0071a90 0x7f34c0116a70 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 --2- 192.168.123.104:0/2025873183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c0072470 0x7f34c0116fb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.128+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 >> 192.168.123.104:0/2025873183 conn(0x7f34c006d1a0 msgr2=0x7f34c010b4d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:46.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.129+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 shutdown_connections 2026-03-10T06:18:46.131 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:46.129+0000 7f34b4ff9700 1 -- 192.168.123.104:0/2025873183 wait complete. 2026-03-10T06:18:46.207 DEBUG:teuthology.orchestra.run.vm04:osd.1> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.1.service 2026-03-10T06:18:46.208 INFO:tasks.cephadm:Deploying osd.2 on vm04 with /dev/vdc... 2026-03-10T06:18:46.208 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vdc 2026-03-10T06:18:46.434 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:18:46.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:46 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:46.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:46 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:46.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:46 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:46.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:46 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:46.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:46 vm04 ceph-mon[51058]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:47.086 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:18:47.101 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm04:/dev/vdc 2026-03-10T06:18:47.117 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:46 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:46 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:46 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:46 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:46 vm06 ceph-mon[58974]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:47.326 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 -- 192.168.123.104:0/2869355924 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 msgr2=0x7f5b7c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/2869355924 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c10be90 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f5b7400b3a0 tx=0x7f5b7400b6b0 comp rx=0 tx=0).stop 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 -- 192.168.123.104:0/2869355924 shutdown_connections 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/2869355924 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c10be90 
unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/2869355924 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 0x7f5b7c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.667+0000 7f5b7b59e700 1 -- 192.168.123.104:0/2869355924 >> 192.168.123.104:0/2869355924 conn(0x7f5b7c06d1a0 msgr2=0x7f5b7c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:47.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 -- 192.168.123.104:0/2869355924 shutdown_connections 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 -- 192.168.123.104:0/2869355924 wait complete. 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 Processor -- start 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 -- start start 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 0x7f5b7c1af9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7f5b7c1b24e0 con 0x7f5b7c072440 2026-03-10T06:18:47.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.669+0000 7f5b7b59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5b7c1b2620 con 0x7f5b7c071a60 2026-03-10T06:18:47.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:47.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b7a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 0x7f5b7c1af9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:47.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44382/0 (socket says 192.168.123.104:44382) 2026-03-10T06:18:47.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 -- 192.168.123.104:0/3544750828 learned_addr learned my addr 192.168.123.104:0/3544750828 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:18:47.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 -- 192.168.123.104:0/3544750828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 msgr2=0x7f5b7c1af9c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:47.672 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 0x7f5b7c1af9c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:47.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 -- 192.168.123.104:0/3544750828 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5b7400b050 con 0x7f5b7c072440 2026-03-10T06:18:47.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b79d9b700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f5b74000f80 tx=0x7f5b74008fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:47.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5b7400e050 con 0x7f5b7c072440 2026-03-10T06:18:47.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.670+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5b74003f00 con 0x7f5b7c072440 2026-03-10T06:18:47.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.671+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5b7401dad0 con 0x7f5b7c072440 2026-03-10T06:18:47.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.672+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f5b7c1b2870 con 0x7f5b7c072440 2026-03-10T06:18:47.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.672+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5b7c1b2d40 con 0x7f5b7c072440 2026-03-10T06:18:47.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.672+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5b7c111660 con 0x7f5b7c072440 2026-03-10T06:18:47.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.674+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f5b74019040 con 0x7f5b7c072440 2026-03-10T06:18:47.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.675+0000 7f5b6b7fe700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 0x7f5b6406e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:47.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.675+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(11..11 src has 1..11) v4 ==== 1915+0+0 (secure 0 0 0) 0x7f5b7408c5d0 con 0x7f5b7c072440 2026-03-10T06:18:47.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.675+0000 7f5b7a59c700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 0x7f5b6406e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:47.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.675+0000 7f5b7a59c700 1 --2- 
192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 0x7f5b6406e930 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f5b7c072f50 tx=0x7f5b7000b410 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:47.686 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.676+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5b7405b660 con 0x7f5b7c072440 2026-03-10T06:18:47.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:47.796+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f5b7c061190 con 0x7f5b6406c480 2026-03-10T06:18:47.801 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:18:47 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 2026-03-10T06:18:47.682+0000 7f68e79bd640 -1 osd.1 0 log_to_monitors true 2026-03-10T06:18:48.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:47 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:48.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:47 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:48.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:47 vm04 ceph-mon[51058]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T06:18:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:47 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:48.117 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:47 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:47 vm06 ceph-mon[58974]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='client.14314 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: Detected new or changed devices on vm04 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='osd.1 
[v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: osdmap e12: 2 total, 1 up, 2 in 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:48 vm06 
ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='client.14314 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: Detected new or changed devices on vm04 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: osdmap e12: 2 
total, 1 up, 2 in 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:49.124 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:18:48 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 
2026-03-10T06:18:48.830+0000 7f68dc833700 -1 osd.1 0 waiting for initial osdmap 2026-03-10T06:18:49.124 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:18:48 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 2026-03-10T06:18:48.837+0000 7f68d8628700 -1 osd.1 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:18:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3271770134' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "7fc62f1e-5fa4-44db-82a0-4b766c28a491"}]: dispatch 2026-03-10T06:18:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/3271770134' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "7fc62f1e-5fa4-44db-82a0-4b766c28a491"}]': finished 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: osdmap e13: 3 total, 1 up, 3 in 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:49 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/1081051653' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/3271770134' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "7fc62f1e-5fa4-44db-82a0-4b766c28a491"}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3271770134' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "7fc62f1e-5fa4-44db-82a0-4b766c28a491"}]': finished 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: osdmap e13: 3 total, 1 up, 3 in 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:18:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:49 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/1081051653' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:18:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: purged_snaps scrub starts 2026-03-10T06:18:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: purged_snaps scrub ok 2026-03-10T06:18:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] boot 2026-03-10T06:18:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 
ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:50 vm06 ceph-mon[58974]: pgmap v25: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: purged_snaps scrub starts 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: purged_snaps scrub ok 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: osd.1 [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] boot 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:18:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:51.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:50 vm04 ceph-mon[51058]: pgmap v25: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:52.457 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:52 vm04 ceph-mon[51058]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T06:18:52.457 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:52.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:52 vm06 ceph-mon[58974]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T06:18:52.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:53.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:53 vm04 ceph-mon[51058]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:53 vm06 ceph-mon[58974]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:54 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:18:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:54 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:54 vm04 ceph-mon[51058]: 
Deploying daemon osd.2 on vm04 2026-03-10T06:18:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:54 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:18:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:54 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:54 vm06 ceph-mon[58974]: Deploying daemon osd.2 on vm04 2026-03-10T06:18:55.471 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:55 vm04 ceph-mon[51058]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:55.471 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:55.471 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:55.471 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:55 vm06 ceph-mon[58974]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 2 on host 'vm04' 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.141+0000 7f5b6b7fe700 1 -- 192.168.123.104:0/3544750828 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f5b7c061190 con 0x7f5b6406c480 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 msgr2=0x7f5b6406e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 0x7f5b6406e930 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f5b7c072f50 tx=0x7f5b7000b410 comp rx=0 tx=0).stop 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 msgr2=0x7f5b7c1b1f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f5b74000f80 tx=0x7f5b74008fa0 comp rx=0 tx=0).stop 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 shutdown_connections 2026-03-10T06:18:56.147 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5b6406c480 0x7f5b6406e930 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5b7c071a60 0x7f5b7c1af9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 --2- 192.168.123.104:0/3544750828 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5b7c072440 0x7f5b7c1b1f10 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 >> 192.168.123.104:0/3544750828 conn(0x7f5b7c06d1a0 msgr2=0x7f5b7c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 shutdown_connections 2026-03-10T06:18:56.147 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:18:56.144+0000 7f5b7b59e700 1 -- 192.168.123.104:0/3544750828 wait complete. 2026-03-10T06:18:56.218 DEBUG:teuthology.orchestra.run.vm04:osd.2> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.2.service 2026-03-10T06:18:56.221 INFO:tasks.cephadm:Deploying osd.3 on vm06 with /dev/vde... 
2026-03-10T06:18:56.221 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vde 2026-03-10T06:18:56.358 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:18:56.909 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:18:56.925 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm06:/dev/vde 2026-03-10T06:18:57.086 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:18:57.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:57 vm06 ceph-mon[58974]: pgmap v29: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:57.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 -- 192.168.123.106:0/3358187741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90103980 msgr2=0x7fcd90103dd0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:57.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 --2- 192.168.123.106:0/3358187741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90103980 0x7fcd90103dd0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fcd80009b00 tx=0x7fcd80009e10 comp rx=0 tx=0).stop 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 -- 192.168.123.106:0/3358187741 shutdown_connections 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 --2- 192.168.123.106:0/3358187741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90103980 0x7fcd90103dd0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 --2- 192.168.123.106:0/3358187741 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcd90102780 0x7fcd90102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.354+0000 7fcd95143700 1 -- 192.168.123.106:0/3358187741 >> 192.168.123.106:0/3358187741 conn(0x7fcd900fdd50 msgr2=0x7fcd90100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.355+0000 7fcd95143700 1 -- 192.168.123.106:0/3358187741 shutdown_connections 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.356+0000 7fcd95143700 1 -- 192.168.123.106:0/3358187741 wait complete. 
2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.356+0000 7fcd95143700 1 Processor -- start 2026-03-10T06:18:57.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.356+0000 7fcd95143700 1 -- start start 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.356+0000 7fcd95143700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd95143700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcd90103980 0x7fcd901985d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd95143700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd90198bf0 con 0x7fcd90103980 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd95143700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd90198d30 con 0x7fcd90102780 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:34366/0 (socket says 192.168.123.106:34366) 2026-03-10T06:18:57.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 -- 192.168.123.106:0/4166184084 learned_addr learned my addr 192.168.123.106:0/4166184084 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:18:57.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 -- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcd90103980 msgr2=0x7fcd901985d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:18:57.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcd90103980 0x7fcd901985d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:18:57.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 -- 192.168.123.106:0/4166184084 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd800097e0 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.357+0000 7fcd8ed9d700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fcd7800b700 tx=0x7fcd7800bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.358+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd78010820 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.358+0000 7fcd95143700 1 -- 
192.168.123.106:0/4166184084 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd9019d7e0 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.358+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd9019dd30 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.358+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcd78010e60 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.358+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd78010820 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.359+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fcd7800d360 con 0x7fcd90102780 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.359+0000 7fcd87fff700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 0x7fcd7c077370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:18:57.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.359+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(15..15 src has 1..15) v4 ==== 2347+0+0 (secure 0 0 0) 0x7fcd7800fe40 con 0x7fcd90102780 2026-03-10T06:18:57.361 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.360+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd70005320 con 0x7fcd90102780 2026-03-10T06:18:57.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.363+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcd78056ff0 con 0x7fcd90102780 2026-03-10T06:18:57.365 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:57 vm04 ceph-mon[51058]: pgmap v29: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:57.365 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.365 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.365 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.365 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:57.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.367+0000 7fcd8e59c700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 0x7fcd7c077370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:18:57.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.369+0000 7fcd8e59c700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 0x7fcd7c077370 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fcd80009ad0 tx=0x7fcd80009f90 
comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:18:57.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:18:57.484+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fcd70000bf0 con 0x7fcd7c074ec0 2026-03-10T06:18:57.906 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:18:57 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:18:57.639+0000 7ff463eda640 -1 osd.2 0 log_to_monitors true 2026-03-10T06:18:58.099 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='client.24135 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 
ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:58.100 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='client.24135 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 
192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:58.928 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:18:58 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:18:58.452+0000 7ff458d50700 -1 osd.2 0 waiting for initial osdmap 2026-03-10T06:18:58.928 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:18:58 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:18:58.462+0000 
7ff455346700 -1 osd.2 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:18:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: Detected new or changed devices on vm04 2026-03-10T06:18:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: pgmap v30: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: osdmap e16: 3 total, 2 up, 3 in 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/2334992547' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]': finished 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: osdmap e17: 4 total, 2 up, 4 in 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:18:59 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/2649111941' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: Detected new or changed devices on vm04 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: pgmap v30: 0 pgs: ; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: osdmap e16: 3 total, 2 up, 3 in 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/2334992547' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "343c7178-1f64-4726-9c13-d1d348b25384"}]': finished 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: osdmap e17: 4 total, 2 up, 4 in 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:18:59.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:18:59 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/2649111941' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:19:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:00 vm06 ceph-mon[58974]: osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] boot 2026-03-10T06:19:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:00 vm06 ceph-mon[58974]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T06:19:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:00 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:19:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:00 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:00 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T06:19:01.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:00 vm04 ceph-mon[51058]: osd.2 [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] boot 2026-03-10T06:19:01.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:00 vm04 ceph-mon[51058]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T06:19:01.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:00 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:19:01.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:00 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:01.177 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:00 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: purged_snaps scrub starts 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: purged_snaps scrub ok 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:01.972 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:01 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T06:19:02.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: purged_snaps scrub starts 2026-03-10T06:19:02.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: purged_snaps scrub ok 2026-03-10T06:19:02.178 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:02.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T06:19:02.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T06:19:02.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:02.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:01 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T06:19:03.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:02 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T06:19:03.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:02 vm06 ceph-mon[58974]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T06:19:03.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:02 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:03.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:02 vm06 ceph-mon[58974]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:03.150 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:02 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T06:19:03.150 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:02 vm04 ceph-mon[51058]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T06:19:03.150 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:02 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:03.150 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:02 vm04 ceph-mon[51058]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:03.151 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:19:02 vm04 sudo[83827]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vde 2026-03-10T06:19:03.151 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:19:02 vm04 sudo[83827]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T06:19:03.151 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:19:02 vm04 sudo[83827]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T06:19:03.151 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:19:02 vm04 sudo[83827]: pam_unix(sudo:session): session closed for user root 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83830]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdd 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83830]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83830]: pam_unix(sudo:session): session opened for user 
root by (uid=0) 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83830]: pam_unix(sudo:session): session closed for user root 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83833]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdc 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83833]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83833]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T06:19:03.428 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83833]: pam_unix(sudo:session): session closed for user root 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83836]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83836]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83836]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 sudo[83836]: pam_unix(sudo:session): session closed for user root 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: Deploying daemon osd.3 on vm06 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:19:03.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:03 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:19:04.019 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 sudo[64620]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 
vm06 sudo[64620]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 sudo[64620]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 sudo[64620]: pam_unix(sudo:session): session closed for user root 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: Deploying daemon osd.3 on vm06 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:19:04.020 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:19:04.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:03 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:19:04.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:04.783+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcd7800d360 con 0x7fcd90102780 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: pgmap v39: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.031 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:04 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:05.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: pgmap v39: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:04 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:05.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.672+0000 7fcd87fff700 1 -- 192.168.123.106:0/4166184084 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fcd70000bf0 con 0x7fcd7c074ec0 2026-03-10T06:19:05.673 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 3 on host 'vm06' 2026-03-10T06:19:05.676 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 msgr2=0x7fcd7c077370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 0x7fcd7c077370 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fcd80009ad0 tx=0x7fcd80009f90 comp rx=0 tx=0).stop 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 msgr2=0x7fcd90198090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fcd7800b700 tx=0x7fcd7800bac0 comp rx=0 tx=0).stop 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 shutdown_connections 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcd7c074ec0 0x7fcd7c077370 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcd90102780 0x7fcd90198090 unknown :-1 s=CLOSED pgs=5 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 --2- 192.168.123.106:0/4166184084 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcd90103980 0x7fcd901985d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:05.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.675+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 >> 192.168.123.106:0/4166184084 conn(0x7fcd900fdd50 msgr2=0x7fcd90106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:05.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.678+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 shutdown_connections 2026-03-10T06:19:05.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:05.678+0000 7fcd95143700 1 -- 192.168.123.106:0/4166184084 wait complete. 2026-03-10T06:19:05.738 DEBUG:teuthology.orchestra.run.vm06:osd.3> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service 2026-03-10T06:19:05.742 INFO:tasks.cephadm:Deploying osd.4 on vm06 with /dev/vdd... 
2026-03-10T06:19:05.742 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vdd 2026-03-10T06:19:05.813 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:05 vm06 ceph-mon[58974]: mgrmap e19: vm04.exdvdb(active, since 60s), standbys: vm06.wwotdr 2026-03-10T06:19:05.813 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:05 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.813 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:05 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.813 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:05 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.813 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:05 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:05.955 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:19:06.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:05 vm04 ceph-mon[51058]: mgrmap e19: vm04.exdvdb(active, since 60s), standbys: vm06.wwotdr 2026-03-10T06:19:06.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:06.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:06.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:06.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:19:05 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:06.510 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:19:06.524 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm06:/dev/vdd 2026-03-10T06:19:06.843 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:19:07.003 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:06 vm06 ceph-mon[58974]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:07.003 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:19:06 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 2026-03-10T06:19:06.779+0000 7fa1bd291640 -1 osd.3 0 log_to_monitors true 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.130+0000 7f4601891700 1 -- 192.168.123.106:0/3056388181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc1080e0 msgr2=0x7f45fc1084f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.130+0000 7f4601891700 1 --2- 192.168.123.106:0/3056388181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc1080e0 0x7f45fc1084f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f45ec008790 tx=0x7f45ec008aa0 comp rx=0 tx=0).stop 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.132+0000 7f4601891700 1 -- 192.168.123.106:0/3056388181 shutdown_connections 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.132+0000 7f4601891700 1 --2- 192.168.123.106:0/3056388181 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f45fc071960 0x7f45fc071dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.132+0000 7f4601891700 1 --2- 192.168.123.106:0/3056388181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc1080e0 0x7f45fc1084f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:07.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.132+0000 7f4601891700 1 -- 192.168.123.106:0/3056388181 >> 192.168.123.106:0/3056388181 conn(0x7f45fc06d3e0 msgr2=0x7f45fc06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:07.133 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.132+0000 7f4601891700 1 -- 192.168.123.106:0/3056388181 shutdown_connections 2026-03-10T06:19:07.133 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.133+0000 7f4601891700 1 -- 192.168.123.106:0/3056388181 wait complete. 
2026-03-10T06:19:07.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.133+0000 7f4601891700 1 Processor -- start 2026-03-10T06:19:07.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f4601891700 1 -- start start 2026-03-10T06:19:07.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f4601891700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:07.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f45faffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:07.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f45faffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:50720/0 (socket says 192.168.123.106:50720) 2026-03-10T06:19:07.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f4601891700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 0x7f45fc081ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:07.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.134+0000 7f45faffd700 1 -- 192.168.123.106:0/2508070586 learned_addr learned my addr 192.168.123.106:0/2508070586 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:19:07.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f45fa7fc700 1 --2- 192.168.123.106:0/2508070586 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 0x7f45fc081ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:07.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45fc0821c0 con 0x7f45fc1080e0 2026-03-10T06:19:07.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45fc082300 con 0x7f45fc071960 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f45faffd700 1 -- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 msgr2=0x7f45fc081ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f45faffd700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 0x7f45fc081ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f45faffd700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45ec008440 con 0x7f45fc071960 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.135+0000 7f45faffd700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f45ec013040 tx=0x7f45ec00a3b0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.136+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45ec00a790 con 0x7f45fc071960 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.136+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f45ec00a8f0 con 0x7f45fc071960 2026-03-10T06:19:07.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.136+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45ec01b7c0 con 0x7f45fc071960 2026-03-10T06:19:07.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.136+0000 7f45fa7fc700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 0x7f45fc081ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:19:07.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.137+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45fc082d30 con 0x7f45fc071960 2026-03-10T06:19:07.138 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.137+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f45fc083200 con 0x7f45fc071960 2026-03-10T06:19:07.138 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.138+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f45ec01b920 con 0x7f45fc071960 2026-03-10T06:19:07.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.138+0000 7f460088f700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 0x7f45e406e990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:07.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.138+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(21..21 src has 1..21) v4 ==== 3165+0+0 (secure 0 0 0) 0x7f45ec08c220 con 0x7f45fc071960 2026-03-10T06:19:07.139 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.138+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f45fc066e40 con 0x7f45fc071960 2026-03-10T06:19:07.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.139+0000 7f45fa7fc700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 0x7f45e406e990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:07.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.142+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f45ec05ac70 con 0x7f45fc071960 2026-03-10T06:19:07.142 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.142+0000 7f45fa7fc700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 0x7f45e406e990 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f45fc107e10 tx=0x7f45f400f040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:07.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:06 vm04 ceph-mon[51058]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:07.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:07.275+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f45fc083c00 con 0x7f45e406c4e0 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='client.24161 -' 
entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: Detected new or changed devices on vm06 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:07 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:08.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='client.24161 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: Detected new or changed devices on vm06 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:07 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:08.220 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:19:08 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 2026-03-10T06:19:08.216+0000 7fa1b390a700 -1 osd.3 0 waiting for initial osdmap 2026-03-10T06:19:08.599 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:19:08 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 2026-03-10T06:19:08.229+0000 7fa1abef8700 -1 osd.3 23 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: osdmap e22: 4 total, 3 up, 4 in 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush 
create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: pgmap v42: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/1746692486' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]': finished 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: osdmap e23: 5 total, 3 up, 5 in 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:08 vm06 ceph-mon[58974]: from='client.? 
192.168.123.106:0/4120933775' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: osdmap e22: 4 total, 3 up, 4 in 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: pgmap v42: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/1746692486' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0e68b450-e783-4ec6-99f7-0610ac3453d1"}]': finished 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: osdmap e23: 5 total, 3 up, 5 in 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:08 vm04 ceph-mon[51058]: from='client.? 
192.168.123.106:0/4120933775' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: purged_snaps scrub starts 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: purged_snaps scrub ok 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] boot 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:10.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:10 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:10.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 ceph-mon[51058]: purged_snaps scrub starts 2026-03-10T06:19:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 ceph-mon[51058]: purged_snaps scrub ok 2026-03-10T06:19:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 ceph-mon[51058]: osd.3 [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] boot 2026-03-10T06:19:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 ceph-mon[51058]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T06:19:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:19:10.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:10 vm04 
ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:11 vm06 ceph-mon[58974]: pgmap v45: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T06:19:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:11 vm06 ceph-mon[58974]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T06:19:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:11 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:11 vm06 ceph-mon[58974]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T06:19:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:11 vm04 ceph-mon[51058]: pgmap v45: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T06:19:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:11 vm04 ceph-mon[51058]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T06:19:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:11 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:11 vm04 ceph-mon[51058]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T06:19:12.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:12 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:12.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:12 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:19:13.417 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:13 vm06 ceph-mon[58974]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T06:19:13.417 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:13 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:19:13.417 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:13 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:13 vm04 ceph-mon[51058]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T06:19:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:13 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:19:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:13 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:14.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:14 vm06 ceph-mon[58974]: Deploying daemon osd.4 on vm06 2026-03-10T06:19:14.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:14 vm04 ceph-mon[51058]: Deploying daemon osd.4 on vm06 2026-03-10T06:19:15.258 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 4 on host 'vm06' 2026-03-10T06:19:15.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.255+0000 7f460088f700 1 -- 192.168.123.106:0/2508070586 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f45fc083c00 con 0x7f45e406c4e0 2026-03-10T06:19:15.258 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 msgr2=0x7f45e406e990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:15.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 0x7f45e406e990 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f45fc107e10 tx=0x7f45f400f040 comp rx=0 tx=0).stop 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 msgr2=0x7f45fc081660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f45ec013040 tx=0x7f45ec00a3b0 comp rx=0 tx=0).stop 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 shutdown_connections 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f45e406c4e0 0x7f45e406e990 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f45fc071960 0x7f45fc081660 unknown :-1 s=CLOSED pgs=15 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 --2- 192.168.123.106:0/2508070586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f45fc1080e0 0x7f45fc081ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 >> 192.168.123.106:0/2508070586 conn(0x7f45fc06d3e0 msgr2=0x7f45fc074f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 shutdown_connections 2026-03-10T06:19:15.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:15.258+0000 7f4601891700 1 -- 192.168.123.106:0/2508070586 wait complete. 2026-03-10T06:19:15.313 DEBUG:teuthology.orchestra.run.vm06:osd.4> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service 2026-03-10T06:19:15.318 INFO:tasks.cephadm:Deploying osd.5 on vm06 with /dev/vdc... 
2026-03-10T06:19:15.318 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- lvm zap /dev/vdc
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: pgmap v49: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:15 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.519 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: pgmap v49: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:15.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:15 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:16.116 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:19:16.129 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph orch daemon add osd vm06:/dev/vdc
2026-03-10T06:19:16.306 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config
2026-03-10T06:19:16.677 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:16.677 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:16 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:16.678 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:16 vm06 ceph-mon[58974]: from='osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T06:19:16.678 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:16 vm06 ceph-mon[58974]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T06:19:16.678 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:19:16 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:19:16.414+0000 7f9b8768d640 -1 osd.4 0 log_to_monitors true
2026-03-10T06:19:16.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.676+0000 7f2d22813700 1 -- 192.168.123.106:0/454230754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c074d80 msgr2=0x7f2d1c0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:16.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.676+0000 7f2d22813700 1 --2- 192.168.123.106:0/454230754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c074d80 0x7f2d1c0731e0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f2d18009b00 tx=0x7f2d18009e10 comp rx=0 tx=0).stop
2026-03-10T06:19:16.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.678+0000 7f2d22813700 1 -- 192.168.123.106:0/454230754 shutdown_connections
2026-03-10T06:19:16.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.678+0000 7f2d22813700 1 --2- 192.168.123.106:0/454230754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:16.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.678+0000 7f2d22813700 1 --2- 192.168.123.106:0/454230754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c074d80 0x7f2d1c0731e0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:16.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.678+0000 7f2d22813700 1 -- 192.168.123.106:0/454230754 >> 192.168.123.106:0/454230754 conn(0x7f2d1c0fb850 msgr2=0x7f2d1c0fdc80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:16.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.679+0000 7f2d22813700 1 -- 192.168.123.106:0/454230754 shutdown_connections
2026-03-10T06:19:16.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.680+0000 7f2d22813700 1 -- 192.168.123.106:0/454230754 wait complete.
2026-03-10T06:19:16.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.680+0000 7f2d22813700 1 Processor -- start
2026-03-10T06:19:16.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.680+0000 7f2d22813700 1 -- start start
2026-03-10T06:19:16.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d22813700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:16.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d22813700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c074d80 0x7f2d1c19c6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:16.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d22813700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d1c19cd10 con 0x7f2d1c074d80
2026-03-10T06:19:16.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d22813700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d1c19ce50 con 0x7f2d1c0737b0
2026-03-10T06:19:16.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d21811700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:16.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d21811700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:33636/0 (socket says 192.168.123.106:33636)
2026-03-10T06:19:16.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.681+0000 7f2d21811700 1 -- 192.168.123.106:0/964298054 learned_addr learned my addr 192.168.123.106:0/964298054 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-10T06:19:16.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.682+0000 7f2d21010700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c074d80 0x7f2d1c19c6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:16.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.682+0000 7f2d21811700 1 -- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c074d80 msgr2=0x7f2d1c19c6f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:16.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.682+0000 7f2d21811700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c074d80 0x7f2d1c19c6f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:16.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.682+0000 7f2d21811700 1 -- 192.168.123.106:0/964298054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d180097e0 con 0x7f2d1c0737b0
2026-03-10T06:19:16.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.683+0000 7f2d21811700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2d18005230 tx=0x7f2d18004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:16.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.683+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d1801d070 con 0x7f2d1c0737b0
2026-03-10T06:19:16.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.684+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d1800bc50 con 0x7f2d1c0737b0
2026-03-10T06:19:16.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.684+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d1800f780 con 0x7f2d1c0737b0
2026-03-10T06:19:16.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.684+0000 7f2d22813700 1 -- 192.168.123.106:0/964298054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d1c1a18a0 con 0x7f2d1c0737b0
2026-03-10T06:19:16.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.684+0000 7f2d22813700 1 -- 192.168.123.106:0/964298054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d1c1a1d10 con 0x7f2d1c0737b0
2026-03-10T06:19:16.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.686+0000 7f2d22813700 1 -- 192.168.123.106:0/964298054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d1c04ea50 con 0x7f2d1c0737b0
2026-03-10T06:19:16.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.686+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f2d1800f8e0 con 0x7f2d1c0737b0
2026-03-10T06:19:16.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.687+0000 7f2d12ffd700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 0x7f2d0806e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:16.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.687+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(27..27 src has 1..27) v4 ==== 3718+0+0 (secure 0 0 0) 0x7f2d1808c900 con 0x7f2d1c0737b0
2026-03-10T06:19:16.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.690+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2d1805b1a0 con 0x7f2d1c0737b0
2026-03-10T06:19:16.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.690+0000 7f2d21010700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 0x7f2d0806e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:16.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.691+0000 7f2d21010700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 0x7f2d0806e980 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f2d0c009c00 tx=0x7f2d0c009380 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:16.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:16.833+0000 7f2d22813700 1 -- 192.168.123.106:0/964298054 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f2d1c103350 con 0x7f2d0806c4d0
2026-03-10T06:19:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:16 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:16 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:16 vm04 ceph-mon[51058]: from='osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T06:19:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:16 vm04 ceph-mon[51058]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T06:19:17.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 65 KiB/s, 0 objects/s recovering
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: osdmap e27: 5 total, 4 up, 5 in
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:17.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:17 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.868 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:19:17 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:19:17.496+0000 7f9b7dd06700 -1 osd.4 0 waiting for initial osdmap
2026-03-10T06:19:17.868 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:19:17 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:19:17.513+0000 7f9b762f4700 -1 osd.4 28 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 65 KiB/s, 0 objects/s recovering
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: osdmap e27: 5 total, 4 up, 5 in
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:17 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='client.24179 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: Detected new or changed devices on vm06
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: osdmap e28: 5 total, 4 up, 5 in
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/3233842973' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] boot
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]': finished
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: osdmap e29: 6 total, 5 up, 6 in
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:19:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:18 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/354614758' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T06:19:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='client.24179 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:19:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: Detected new or changed devices on vm06
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: osdmap e28: 5 total, 4 up, 5 in
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/3233842973' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: osd.4 [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] boot
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "014751cc-c4ec-4d41-86ea-4b72c6f87e86"}]': finished
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: osdmap e29: 6 total, 5 up, 6 in
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:19:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:18 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/354614758' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T06:19:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: purged_snaps scrub starts
2026-03-10T06:19:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: purged_snaps scrub ok
2026-03-10T06:19:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering
2026-03-10T06:19:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: osdmap e30: 6 total, 5 up, 6 in
2026-03-10T06:19:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:19:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:19 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: purged_snaps scrub starts
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: purged_snaps scrub ok
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: osdmap e30: 6 total, 5 up, 6 in
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:19:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:19 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:19:21.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:21 vm06 ceph-mon[58974]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:21 vm04 ceph-mon[51058]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:22.331 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T06:19:22.331 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:22 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:22.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T06:19:22.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:22 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:23 vm06 ceph-mon[58974]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:23 vm06 ceph-mon[58974]: Deploying daemon osd.5 on vm06
2026-03-10T06:19:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:23 vm04 ceph-mon[51058]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:23 vm04 ceph-mon[51058]: Deploying daemon osd.5 on vm06
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 5 on host 'vm06'
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.013+0000 7f2d12ffd700 1 -- 192.168.123.106:0/964298054 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f2d1c103350 con 0x7f2d0806c4d0
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 msgr2=0x7f2d0806e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 0x7f2d0806e980 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f2d0c009c00 tx=0x7f2d0c009380 comp rx=0 tx=0).stop
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 msgr2=0x7f2d1c19c1b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2d18005230 tx=0x7f2d18004c30 comp rx=0 tx=0).stop
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 shutdown_connections
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2d0806c4d0 0x7f2d0806e980 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d1c0737b0 0x7f2d1c19c1b0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 --2- 192.168.123.106:0/964298054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d1c074d80 0x7f2d1c19c6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 >> 192.168.123.106:0/964298054 conn(0x7f2d1c0fb850 msgr2=0x7f2d1c101c30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 shutdown_connections
2026-03-10T06:19:25.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:25.016+0000 7f2d10f39700 1 -- 192.168.123.106:0/964298054 wait complete.
2026-03-10T06:19:25.110 DEBUG:teuthology.orchestra.run.vm06:osd.5> sudo journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.5.service
2026-03-10T06:19:25.112 INFO:tasks.cephadm:Waiting for 6 OSDs to come up...
2026-03-10T06:19:25.112 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd stat -f json
2026-03-10T06:19:25.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:25.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:25.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:24 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:25.282 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:24 vm04 ceph-mon[51058]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:25.294 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:19:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:24 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:24 vm06 ceph-mon[58974]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T06:19:25.602
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.599+0000 7f85d17c9700 1 -- 192.168.123.104:0/3169247504 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 msgr2=0x7f85cc101b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.599+0000 7f85d17c9700 1 --2- 192.168.123.104:0/3169247504 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc101b60 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f85bc009b00 tx=0x7f85bc009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.600+0000 7f85d17c9700 1 -- 192.168.123.104:0/3169247504 shutdown_connections 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.600+0000 7f85d17c9700 1 --2- 192.168.123.104:0/3169247504 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc101b60 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.600+0000 7f85d17c9700 1 --2- 192.168.123.104:0/3169247504 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 0x7f85cc100920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.600+0000 7f85d17c9700 1 -- 192.168.123.104:0/3169247504 >> 192.168.123.104:0/3169247504 conn(0x7f85cc0fba80 msgr2=0x7f85cc0fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.601+0000 7f85d17c9700 1 -- 192.168.123.104:0/3169247504 shutdown_connections 2026-03-10T06:19:25.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.601+0000 7f85d17c9700 1 -- 192.168.123.104:0/3169247504 wait complete. 
2026-03-10T06:19:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.601+0000 7f85d17c9700 1 Processor -- start 2026-03-10T06:19:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.601+0000 7f85d17c9700 1 -- start start 2026-03-10T06:19:25.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85d17c9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 0x7f85cc195dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85d17c9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85d17c9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85cc196930 con 0x7f85cc101710 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85d17c9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85cc196a70 con 0x7f85cc100510 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:57082/0 (socket says 192.168.123.104:57082) 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 -- 192.168.123.104:0/537780964 learned_addr learned my addr 192.168.123.104:0/537780964 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85caffd700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 0x7f85cc195dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 -- 192.168.123.104:0/537780964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 msgr2=0x7f85cc195dd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 0x7f85cc195dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.602+0000 7f85ca7fc700 1 -- 192.168.123.104:0/537780964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85bc0097e0 con 0x7f85cc101710 2026-03-10T06:19:25.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85ca7fc700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f85bc0049c0 tx=0x7f85bc004aa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85bc01d070 con 0x7f85cc101710 2026-03-10T06:19:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f85bc00bc50 con 0x7f85cc101710 2026-03-10T06:19:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85bc00f970 con 0x7f85cc101710 2026-03-10T06:19:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85cc072f80 con 0x7f85cc101710 2026-03-10T06:19:25.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.603+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85cc073470 con 0x7f85cc101710 2026-03-10T06:19:25.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.605+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f85cc1091f0 con 0x7f85cc101710 2026-03-10T06:19:25.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.605+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f85bc022c70 con 0x7f85cc101710 2026-03-10T06:19:25.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.606+0000 
7f85c3fff700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 0x7f85b806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:25.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.606+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f85bc08cc10 con 0x7f85cc101710 2026-03-10T06:19:25.609 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.607+0000 7f85caffd700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 0x7f85b806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:25.609 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.607+0000 7f85caffd700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 0x7f85b806ec00 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f85cc101570 tx=0x7f85b4005cf0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:25.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.609+0000 7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f85bc05b310 con 0x7f85cc101710 2026-03-10T06:19:25.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.727+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f85cc1093e0 con 0x7f85cc101710 2026-03-10T06:19:25.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.728+0000 
7f85c3fff700 1 -- 192.168.123.104:0/537780964 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v30) v1 ==== 74+0+130 (secure 0 0 0) 0x7f85bc05aea0 con 0x7f85cc101710 2026-03-10T06:19:25.730 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 msgr2=0x7f85b806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 0x7f85b806ec00 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f85cc101570 tx=0x7f85b4005cf0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 msgr2=0x7f85cc196310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f85bc0049c0 tx=0x7f85bc004aa0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 shutdown_connections 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.731+0000 7f85d17c9700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f85b806c750 0x7f85b806ec00 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.732+0000 7f85d17c9700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85cc100510 0x7f85cc195dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.732+0000 7f85d17c9700 1 --2- 192.168.123.104:0/537780964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85cc101710 0x7f85cc196310 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:25.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.732+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 >> 192.168.123.104:0/537780964 conn(0x7f85cc0fba80 msgr2=0x7f85cc104940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:25.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.732+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 shutdown_connections 2026-03-10T06:19:25.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:25.732+0000 7f85d17c9700 1 -- 192.168.123.104:0/537780964 wait complete. 
2026-03-10T06:19:25.814 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":30,"num_osds":6,"num_up_osds":5,"osd_up_since":1773123557,"num_in_osds":6,"osd_in_since":1773123557,"num_remapped_pgs":0} 2026-03-10T06:19:26.038 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.038 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.038 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.038 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:26 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.038 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:26 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/537780964' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:26.294 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.294 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.294 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.294 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:26 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:26.294 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:26 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/537780964' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:26.294 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:19:26 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:19:26.141+0000 7f33e7b16640 -1 osd.5 0 log_to_monitors true 2026-03-10T06:19:26.816 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd stat -f json 2026-03-10T06:19:26.994 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: Detected new or changed devices on vm06 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:19:27.100 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:27.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:27 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 -- 192.168.123.104:0/3718718509 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 msgr2=0x7fd23c102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 --2- 192.168.123.104:0/3718718509 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c102b70 secure :-1 s=READY 
pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7fd22c009b00 tx=0x7fd22c009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 -- 192.168.123.104:0/3718718509 shutdown_connections 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 --2- 192.168.123.104:0/3718718509 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd23c103960 0x7fd23c103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 --2- 192.168.123.104:0/3718718509 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c102b70 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.273+0000 7fd2442fa700 1 -- 192.168.123.104:0/3718718509 >> 192.168.123.104:0/3718718509 conn(0x7fd23c0fdcf0 msgr2=0x7fd23c100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 -- 192.168.123.104:0/3718718509 shutdown_connections 2026-03-10T06:19:27.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 -- 192.168.123.104:0/3718718509 wait complete. 
2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 Processor -- start 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 -- start start 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd23c103960 0x7fd23c198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd23c198af0 con 0x7fd23c102760 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd2442fa700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd23c198c30 con 0x7fd23c103960 2026-03-10T06:19:27.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.274+0000 7fd242096700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:57106/0 (socket says 192.168.123.104:57106) 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 -- 192.168.123.104:0/2480621135 learned_addr learned my addr 192.168.123.104:0/2480621135 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 -- 192.168.123.104:0/2480621135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd23c103960 msgr2=0x7fd23c198560 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd23c103960 0x7fd23c198560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 -- 192.168.123.104:0/2480621135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd22c0097e0 con 0x7fd23c102760 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd242096700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fd22c004a30 tx=0x7fd22c004b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd22c01d070 con 0x7fd23c102760 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd2442fa700 1 -- 
192.168.123.104:0/2480621135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd23c19d690 con 0x7fd23c102760 2026-03-10T06:19:27.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.275+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd23c19db50 con 0x7fd23c102760 2026-03-10T06:19:27.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.276+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd22c00bcd0 con 0x7fd23c102760 2026-03-10T06:19:27.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.276+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd22c0218c0 con 0x7fd23c102760 2026-03-10T06:19:27.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.277+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd22c02b430 con 0x7fd23c102760 2026-03-10T06:19:27.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.277+0000 7fd2337fe700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 0x7fd22806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:27.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.277+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4150+0+0 (secure 0 0 0) 0x7fd22c08d9c0 con 0x7fd23c102760 2026-03-10T06:19:27.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.277+0000 7fd241895700 1 --2- 192.168.123.104:0/2480621135 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 0x7fd22806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:27.282 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.278+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd220005320 con 0x7fd23c102760 2026-03-10T06:19:27.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.281+0000 7fd241895700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 0x7fd22806ec00 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fd23800a9b0 tx=0x7fd238005c90 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:27.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.281+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd22c05c0b0 con 0x7fd23c102760 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: Detected new or changed devices on vm06 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 
192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:27 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:27.392 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.390+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fd220005190 con 0x7fd23c102760 2026-03-10T06:19:27.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.391+0000 7fd2337fe700 1 -- 192.168.123.104:0/2480621135 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7fd22c026070 con 0x7fd23c102760 2026-03-10T06:19:27.394 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:27.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.394+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 msgr2=0x7fd22806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:27.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.394+0000 7fd2442fa700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 0x7fd22806ec00 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fd23800a9b0 tx=0x7fd238005c90 comp rx=0 tx=0).stop 2026-03-10T06:19:27.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 msgr2=0x7fd23c198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fd22c004a30 tx=0x7fd22c004b10 comp rx=0 tx=0).stop 2026-03-10T06:19:27.397 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 shutdown_connections 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd22806c750 0x7fd22806ec00 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd23c102760 0x7fd23c198020 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.395+0000 7fd2442fa700 1 --2- 192.168.123.104:0/2480621135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd23c103960 0x7fd23c198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.396+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 >> 192.168.123.104:0/2480621135 conn(0x7fd23c0fdcf0 msgr2=0x7fd23c106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:27.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.396+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 shutdown_connections 2026-03-10T06:19:27.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:27.396+0000 7fd2442fa700 1 -- 192.168.123.104:0/2480621135 wait complete. 
2026-03-10T06:19:27.447 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773123557,"num_in_osds":6,"osd_in_since":1773123557,"num_remapped_pgs":0} 2026-03-10T06:19:28.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:19:28 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:19:28.056+0000 7f33dc98c700 -1 osd.5 0 waiting for initial osdmap 2026-03-10T06:19:28.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:19:28 vm06 ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:19:28.075+0000 7f33d8f82700 -1 osd.5 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:19:28.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:28 vm06 ceph-mon[58974]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T06:19:28.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:28 vm06 ceph-mon[58974]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T06:19:28.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:28 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:28.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:28 vm06 ceph-mon[58974]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:28.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:28 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/2480621135' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:28 vm04 ceph-mon[51058]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T06:19:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:28 vm04 ceph-mon[51058]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T06:19:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:28 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:28 vm04 ceph-mon[51058]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:19:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:28 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2480621135' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:28.448 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd stat -f json 2026-03-10T06:19:28.598 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.859+0000 7f598d5fe700 1 -- 192.168.123.104:0/2994715318 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 msgr2=0x7f5988103160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.859+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2994715318 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988103160 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f5970009b00 tx=0x7f5970009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.860+0000 7f598d5fe700 1 -- 192.168.123.104:0/2994715318 shutdown_connections 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.860+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2994715318 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.860+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2994715318 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988103160 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.862 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.860+0000 7f598d5fe700 1 -- 192.168.123.104:0/2994715318 >> 192.168.123.104:0/2994715318 conn(0x7f59880faa70 msgr2=0x7f59880fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.860+0000 7f598d5fe700 1 -- 192.168.123.104:0/2994715318 shutdown_connections 2026-03-10T06:19:28.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 -- 192.168.123.104:0/2994715318 wait complete. 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 Processor -- start 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 -- start start 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988197fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5988198b40 con 0x7f5988069160 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.861+0000 7f598d5fe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5988198c80 con 0x7f59881036a0 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:34970/0 (socket says 192.168.123.104:34970) 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 -- 192.168.123.104:0/2493909015 learned_addr learned my addr 192.168.123.104:0/2493909015 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:28.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f5986ffd700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988197fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 -- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 msgr2=0x7f5988197fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988197fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 -- 
192.168.123.104:0/2493909015 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59700097e0 con 0x7f59881036a0 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f5986ffd700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f59867fc700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f597800d8d0 tx=0x7f597800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.862+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5978009940 con 0x7f59881036a0 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.863+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5978010460 con 0x7f59881036a0 2026-03-10T06:19:28.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.863+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f598819d730 con 0x7f59881036a0 2026-03-10T06:19:28.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.863+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f597800f5d0 con 0x7f59881036a0 
2026-03-10T06:19:28.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.863+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f598819dc80 con 0x7f59881036a0 2026-03-10T06:19:28.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.864+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59880fc670 con 0x7f59881036a0 2026-03-10T06:19:28.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.864+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5978009aa0 con 0x7f59881036a0 2026-03-10T06:19:28.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.865+0000 7f597ffff700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 0x7f597406e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:28.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.865+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4166+0+0 (secure 0 0 0) 0x7f597808aed0 con 0x7f59881036a0 2026-03-10T06:19:28.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.865+0000 7f5986ffd700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 0x7f597406e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:28.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.865+0000 7f5986ffd700 1 --2- 192.168.123.104:0/2493909015 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 0x7f597406e930 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5970006010 tx=0x7f5970005dc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:28.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.867+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5978059220 con 0x7f59881036a0 2026-03-10T06:19:28.976 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.974+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f5988066e40 con 0x7f59881036a0 2026-03-10T06:19:28.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.975+0000 7f597ffff700 1 -- 192.168.123.104:0/2493909015 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v32) v1 ==== 74+0+130 (secure 0 0 0) 0x7f5978059040 con 0x7f59881036a0 2026-03-10T06:19:28.977 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.978+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 msgr2=0x7f597406e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.978+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 0x7f597406e930 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5970006010 tx=0x7f5970005dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.978+0000 
7f598d5fe700 1 -- 192.168.123.104:0/2493909015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 msgr2=0x7f5988198520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.978+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f597800d8d0 tx=0x7f597800dc90 comp rx=0 tx=0).stop 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 shutdown_connections 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f597406c480 0x7f597406e930 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5988069160 0x7f5988197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 --2- 192.168.123.104:0/2493909015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59881036a0 0x7f5988198520 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:28.980 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 >> 192.168.123.104:0/2493909015 conn(0x7f59880faa70 msgr2=0x7f5988104340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:28.981 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 shutdown_connections 2026-03-10T06:19:28.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:28.979+0000 7f598d5fe700 1 -- 192.168.123.104:0/2493909015 wait complete. 2026-03-10T06:19:29.027 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":32,"num_osds":6,"num_up_osds":5,"osd_up_since":1773123557,"num_in_osds":6,"osd_in_since":1773123557,"num_remapped_pgs":0} 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: osdmap e32: 6 total, 5 up, 6 in 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T06:19:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:29 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/2493909015' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: from='osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744]' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: osdmap e32: 6 total, 5 up, 6 in 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T06:19:29.368 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:29 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2493909015' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:30.028 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd stat -f json 2026-03-10T06:19:30.197 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:30.347 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: purged_snaps scrub starts 2026-03-10T06:19:30.347 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: purged_snaps scrub ok 2026-03-10T06:19:30.347 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:30.348 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] boot 2026-03-10T06:19:30.348 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T06:19:30.348 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:30 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: purged_snaps scrub starts 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: purged_snaps scrub ok 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 
5}]: dispatch 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: osd.5 [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] boot 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T06:19:30.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:30 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:19:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.461+0000 7f0f2c032700 1 -- 192.168.123.104:0/2931456320 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240fe3c0 msgr2=0x7f0f240fe7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.461+0000 7f0f2c032700 1 --2- 192.168.123.104:0/2931456320 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240fe3c0 0x7f0f240fe7d0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f0f18009b00 tx=0x7f0f18009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 -- 192.168.123.104:0/2931456320 shutdown_connections 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 --2- 192.168.123.104:0/2931456320 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240ff500 0x7f0f240ff970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 --2- 192.168.123.104:0/2931456320 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240fe3c0 0x7f0f240fe7d0 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 -- 192.168.123.104:0/2931456320 >> 192.168.123.104:0/2931456320 conn(0x7f0f240f9ce0 msgr2=0x7f0f240fc130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 -- 192.168.123.104:0/2931456320 shutdown_connections 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.462+0000 7f0f2c032700 1 -- 192.168.123.104:0/2931456320 wait complete. 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 Processor -- start 2026-03-10T06:19:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 -- start start 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 0x7f0f24197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 0x7f0f24198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f24198a90 con 0x7f0f240ff500 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f2c032700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f24198bd0 con 0x7f0f240fe3c0 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f29dce700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 0x7f0f24197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f295cd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 0x7f0f24198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f29dce700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 0x7f0f24197fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:34980/0 (socket says 192.168.123.104:34980) 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.463+0000 7f0f29dce700 1 -- 192.168.123.104:0/1852436323 learned_addr learned my addr 192.168.123.104:0/1852436323 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f295cd700 1 -- 192.168.123.104:0/1852436323 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 msgr2=0x7f0f24197fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f295cd700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 0x7f0f24197fc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f295cd700 1 -- 192.168.123.104:0/1852436323 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f180097e0 con 0x7f0f240ff500 2026-03-10T06:19:30.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f295cd700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 0x7f0f24198500 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f0f2000c8f0 tx=0x7f0f2000cc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:30.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f200043f0 con 0x7f0f240ff500 2026-03-10T06:19:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0f20004550 con 0x7f0f240ff500 2026-03-10T06:19:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f20003890 con 0x7f0f240ff500 2026-03-10T06:19:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0f2419d690 con 0x7f0f240ff500 2026-03-10T06:19:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.464+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0f2419db60 con 0x7f0f240ff500 2026-03-10T06:19:30.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.466+0000 7f0f16ffd700 1 -- 
192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0f200051b0 con 0x7f0f240ff500 2026-03-10T06:19:30.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.466+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0f24066e40 con 0x7f0f240ff500 2026-03-10T06:19:30.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.468+0000 7f0f16ffd700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 0x7f0f1006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:30.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.468+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0f2008af50 con 0x7f0f240ff500 2026-03-10T06:19:30.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.470+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0f20059520 con 0x7f0f240ff500 2026-03-10T06:19:30.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.471+0000 7f0f29dce700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 0x7f0f1006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:30.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.471+0000 7f0f29dce700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 0x7f0f1006eb30 secure :-1 s=READY 
pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0f18006010 tx=0x7f0f18005dc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:30.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.577+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f0f2419df30 con 0x7f0f240ff500 2026-03-10T06:19:30.580 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.578+0000 7f0f16ffd700 1 -- 192.168.123.104:0/1852436323 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v34) v1 ==== 74+0+130 (secure 0 0 0) 0x7f0f200140a0 con 0x7f0f240ff500 2026-03-10T06:19:30.580 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:30.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.580+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 msgr2=0x7f0f1006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:30.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.580+0000 7f0f2c032700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 0x7f0f1006eb30 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0f18006010 tx=0x7f0f18005dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.580+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 msgr2=0x7f0f24198500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:30.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 
0x7f0f24198500 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f0f2000c8f0 tx=0x7f0f2000cc00 comp rx=0 tx=0).stop 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 shutdown_connections 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0f1006c680 0x7f0f1006eb30 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f240fe3c0 0x7f0f24197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 --2- 192.168.123.104:0/1852436323 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f240ff500 0x7f0f24198500 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.581+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 >> 192.168.123.104:0/1852436323 conn(0x7f0f240f9ce0 msgr2=0x7f0f24108000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.582+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 shutdown_connections 2026-03-10T06:19:30.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:30.582+0000 7f0f2c032700 1 -- 192.168.123.104:0/1852436323 wait complete. 
2026-03-10T06:19:30.655 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":34,"num_osds":6,"num_up_osds":6,"osd_up_since":1773123569,"num_in_osds":6,"osd_in_since":1773123557,"num_remapped_pgs":0} 2026-03-10T06:19:30.655 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd dump --format=json 2026-03-10T06:19:30.818 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:31.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.072+0000 7fb77d66f700 1 -- 192.168.123.104:0/3869466319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 msgr2=0x7fb778101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.072+0000 7fb77d66f700 1 --2- 192.168.123.104:0/3869466319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 0x7fb778101770 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7fb760009b00 tx=0x7fb760009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 -- 192.168.123.104:0/3869466319 shutdown_connections 2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 --2- 192.168.123.104:0/3869466319 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb778068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 --2- 192.168.123.104:0/3869466319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 0x7fb778101770 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 -- 192.168.123.104:0/3869466319 >> 192.168.123.104:0/3869466319 conn(0x7fb7780754a0 msgr2=0x7fb7780758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 -- 192.168.123.104:0/3869466319 shutdown_connections 2026-03-10T06:19:31.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.073+0000 7fb77d66f700 1 -- 192.168.123.104:0/3869466319 wait complete. 2026-03-10T06:19:31.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 Processor -- start 2026-03-10T06:19:31.077 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 -- start start 2026-03-10T06:19:31.077 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb7781021e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.077 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 0x7fb778102720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb778106370 con 0x7fb7781013a0 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.074+0000 7fb77d66f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb778102c60 con 0x7fb778068490 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb7781021e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb7781021e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:34996/0 (socket says 192.168.123.104:34996) 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 -- 192.168.123.104:0/2961735742 learned_addr learned my addr 192.168.123.104:0/2961735742 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 -- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 msgr2=0x7fb778102720 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 0x7fb778102720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 -- 192.168.123.104:0/2961735742 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7600097e0 con 0x7fb778068490 2026-03-10T06:19:31.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.075+0000 7fb776ffd700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 
0x7fb7781021e0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb76000b5c0 tx=0x7fb760004950 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:31.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.077+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb76001d070 con 0x7fb778068490 2026-03-10T06:19:31.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.078+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb778102e60 con 0x7fb778068490 2026-03-10T06:19:31.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.078+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb778198240 con 0x7fb778068490 2026-03-10T06:19:31.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.078+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb760022470 con 0x7fb778068490 2026-03-10T06:19:31.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.079+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb76000f700 con 0x7fb778068490 2026-03-10T06:19:31.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.081+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb758005320 con 0x7fb778068490 2026-03-10T06:19:31.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.082+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb76000f860 con 0x7fb778068490 2026-03-10T06:19:31.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.082+0000 7fb76ffff700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 0x7fb76406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.082+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb76008ca30 con 0x7fb778068490 2026-03-10T06:19:31.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.082+0000 7fb7767fc700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 0x7fb76406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:31.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.082+0000 7fb7767fc700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 0x7fb76406eb80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fb7781039d0 tx=0x7fb76800b3f0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:31.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.084+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb7600575a0 con 0x7fb778068490 2026-03-10T06:19:31.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:31 vm04 ceph-mon[51058]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T06:19:31.177 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:31 vm04 ceph-mon[51058]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:31.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:31 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/1852436323' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:31.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.196+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fb7580059f0 con 0x7fb778068490 2026-03-10T06:19:31.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.197+0000 7fb76ffff700 1 -- 192.168.123.104:0/2961735742 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11260 (secure 0 0 0) 0x7fb76005abc0 con 0x7fb778068490 2026-03-10T06:19:31.199 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:31.199 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":34,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","created":"2026-03-10T06:16:43.937039+0000","modified":"2026-03-10T06:19:30.067285+0000","last_up_change":"2026-03-10T06:19:29.056279+0000","last_in_change":"2026-03-10T06:19:17.791524+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T06:19:00.284324+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"24cf9fc9-b995-47ea-a145-3fd48dc1ed14","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6803","nonce":353393816}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6805","nonce":353393816}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6809","nonce":353393816}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6807","nonce":353393816}]},"public_addr":"192.168.123.104:6803/353393816","cluster_addr":"192.168.123.104:6805/353393816","heartbeat_back_addr":"192.168.123.104:6809/353393816","heartbeat_front_addr":"192.168.123.104:6807/353393816","state":["exists","up"]},{"osd":1,"uuid":"38852b8c-2ea4-46c8-a734-cf521893e9b5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6811","nonce":3264858383}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812
","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6813","nonce":3264858383}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6817","nonce":3264858383}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6815","nonce":3264858383}]},"public_addr":"192.168.123.104:6811/3264858383","cluster_addr":"192.168.123.104:6813/3264858383","heartbeat_back_addr":"192.168.123.104:6817/3264858383","heartbeat_front_addr":"192.168.123.104:6815/3264858383","state":["exists","up"]},{"osd":2,"uuid":"7fc62f1e-5fa4-44db-82a0-4b766c28a491","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6819","nonce":4287451356}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6821","nonce":4287451356}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6825","nonce":4287451356}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6823","nonce":4287451356}]},"public_addr":"192.168.123.104:6819/4287451356","cluster_addr":"192.168.123.104:6821/4287451356","heartbeat_back_addr":"192.168.123.104:6825/4287451356","heartbeat_front_addr":"192.168.123.104:6823/4287451356","state":["exists","up"]},{"osd":3,"uuid":"343c7178-1f64-4726-9c13-d1d348b25384","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":1172578215},{"type"
:"v1","addr":"192.168.123.106:6801","nonce":1172578215}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6803","nonce":1172578215}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6807","nonce":1172578215}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6805","nonce":1172578215}]},"public_addr":"192.168.123.106:6801/1172578215","cluster_addr":"192.168.123.106:6803/1172578215","heartbeat_back_addr":"192.168.123.106:6807/1172578215","heartbeat_front_addr":"192.168.123.106:6805/1172578215","state":["exists","up"]},{"osd":4,"uuid":"0e68b450-e783-4ec6-99f7-0610ac3453d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6809","nonce":1799920647}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6811","nonce":1799920647}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6815","nonce":1799920647}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6813","nonce":1799920647}]},"public_addr":"192.168.123.106:6809/1799920647","cluster_addr":"192.168.123.106:6811/1799920647","heartbeat_back_addr":"192.168.123.106:6815/1799920647","heartbeat_front_addr":"192.168.123.106:6813/1799920647","state":["exists","up"]},{"osd":5,"uuid":"014751cc-c4ec-4d41-86ea-4b72c6f87e86","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6817","nonce":240520744}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6819","nonce":240520744}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6823","nonce":240520744}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6821","nonce":240520744}]},"public_addr":"192.168.123.106:6817/240520744","cluster_addr":"192.168.123.106:6819/240520744","heartbeat_back_addr":"192.168.123.106:6823/240520744","heartbeat_front_addr":"192.168.123.106:6821/240520744","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:36.888061+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:48.702876+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:58.655570+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:07.794122+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:17.462801+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T06:19:27.109339+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.104:0/4184397752":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/63423144":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/1120149152":"2026-03-11T06:17:13.124403+0000","192.168.123.104:0/2832999113":"2026-03-11T06:17:13.124403+0000","192.168.123.104:6801/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/4014113593":"2026-03-11T06:18:04.067248+0000","192.168.123.104:6800/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/3171159121":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/2564492007":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1133148001":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1930923909":"2026-03-11T06:17:13.124403+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 msgr2=0x7fb76406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 0x7fb76406eb80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fb7781039d0 tx=0x7fb76800b3f0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.202 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 msgr2=0x7fb7781021e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb7781021e0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb76000b5c0 tx=0x7fb760004950 comp rx=0 tx=0).stop 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 shutdown_connections 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb76406c6d0 0x7fb76406eb80 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb778068490 0x7fb7781021e0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 --2- 192.168.123.104:0/2961735742 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb7781013a0 0x7fb778102720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 >> 192.168.123.104:0/2961735742 conn(0x7fb7780754a0 msgr2=0x7fb7780fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 shutdown_connections 2026-03-10T06:19:31.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.200+0000 7fb77d66f700 1 -- 192.168.123.104:0/2961735742 wait complete. 2026-03-10T06:19:31.274 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-10T06:19:00.284324+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '21', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': 
{'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-10T06:19:31.274 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd pool get .mgr pg_num 2026-03-10T06:19:31.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:31 vm06 ceph-mon[58974]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T06:19:31.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:31 vm06 ceph-mon[58974]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:31.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:31 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/1852436323' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T06:19:31.430 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.691+0000 7fba55e45700 1 -- 192.168.123.104:0/1408593261 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 msgr2=0x7fba5010c960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.691+0000 7fba55e45700 1 --2- 192.168.123.104:0/1408593261 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba5010c960 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fba40009b00 tx=0x7fba40009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.692+0000 7fba55e45700 1 -- 
192.168.123.104:0/1408593261 shutdown_connections 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.692+0000 7fba55e45700 1 --2- 192.168.123.104:0/1408593261 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba5010c960 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.692+0000 7fba55e45700 1 --2- 192.168.123.104:0/1408593261 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba500ff6c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.692+0000 7fba55e45700 1 -- 192.168.123.104:0/1408593261 >> 192.168.123.104:0/1408593261 conn(0x7fba500faf00 msgr2=0x7fba500fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:31.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.693+0000 7fba55e45700 1 -- 192.168.123.104:0/1408593261 shutdown_connections 2026-03-10T06:19:31.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.693+0000 7fba55e45700 1 -- 192.168.123.104:0/1408593261 wait complete. 
2026-03-10T06:19:31.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.693+0000 7fba55e45700 1 Processor -- start 2026-03-10T06:19:31.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba55e45700 1 -- start start 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba55e45700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba501015d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba55e45700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba50101b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba55e45700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba50105760 con 0x7fba500ff2f0 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba55e45700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba50102050 con 0x7fba500ffc00 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba501015d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba50101b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba50101b10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:35006/0 (socket says 192.168.123.104:35006) 2026-03-10T06:19:31.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4effd700 1 -- 192.168.123.104:0/3767709546 learned_addr learned my addr 192.168.123.104:0/3767709546 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:31.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4f7fe700 1 -- 192.168.123.104:0/3767709546 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 msgr2=0x7fba50101b10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.694+0000 7fba4f7fe700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba50101b10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.695+0000 7fba4f7fe700 1 -- 192.168.123.104:0/3767709546 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fba44009710 con 0x7fba500ff2f0 2026-03-10T06:19:31.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.695+0000 7fba4f7fe700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba501015d0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7fba4400eb40 tx=0x7fba4400ee50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:31.697 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.695+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba4400cc40 con 0x7fba500ff2f0 2026-03-10T06:19:31.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.695+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fba400097e0 con 0x7fba500ff2f0 2026-03-10T06:19:31.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.695+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fba501a6a80 con 0x7fba500ff2f0 2026-03-10T06:19:31.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.696+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fba44004500 con 0x7fba500ff2f0 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.696+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba44010430 con 0x7fba500ff2f0 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.696+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fba44010670 con 0x7fba500ff2f0 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.697+0000 7fba4cff9700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 0x7fba3806ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.697+0000 
7fba4effd700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 0x7fba3806ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.697+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fba44014070 con 0x7fba500ff2f0 2026-03-10T06:19:31.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.698+0000 7fba4effd700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 0x7fba3806ecd0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fba50102dc0 tx=0x7fba40005ab0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:31.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.698+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fba3c005320 con 0x7fba500ff2f0 2026-03-10T06:19:31.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.700+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fba440575f0 con 0x7fba500ff2f0 2026-03-10T06:19:31.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.805+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7fba3c005f70 con 0x7fba500ff2f0 2026-03-10T06:19:31.807 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.805+0000 7fba4cff9700 1 -- 192.168.123.104:0/3767709546 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v34) v1 ==== 93+0+10 (secure 0 0 0) 0x7fba4405ac10 con 0x7fba500ff2f0 2026-03-10T06:19:31.809 INFO:teuthology.orchestra.run.vm04.stdout:pg_num: 1 2026-03-10T06:19:31.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.809+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 msgr2=0x7fba3806ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.809+0000 7fba55e45700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 0x7fba3806ecd0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fba50102dc0 tx=0x7fba40005ab0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 msgr2=0x7fba501015d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:31.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba501015d0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7fba4400eb40 tx=0x7fba4400ee50 comp rx=0 tx=0).stop 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 shutdown_connections 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 --2- 192.168.123.104:0/3767709546 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fba3806c820 0x7fba3806ecd0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba500ff2f0 0x7fba501015d0 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 --2- 192.168.123.104:0/3767709546 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fba500ffc00 0x7fba50101b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.810+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 >> 192.168.123.104:0/3767709546 conn(0x7fba500faf00 msgr2=0x7fba500fc470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.811+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 shutdown_connections 2026-03-10T06:19:31.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:31.811+0000 7fba55e45700 1 -- 192.168.123.104:0/3767709546 wait complete. 2026-03-10T06:19:31.859 INFO:tasks.cephadm:Setting up client nodes... 
2026-03-10T06:19:31.859 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T06:19:32.023 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:32.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.295+0000 7f0df170f700 1 -- 192.168.123.104:0/2381525636 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec103960 msgr2=0x7f0dec103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.295+0000 7f0df170f700 1 --2- 192.168.123.104:0/2381525636 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec103960 0x7f0dec103db0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f0de0009b00 tx=0x7f0de0009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:32.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 -- 192.168.123.104:0/2381525636 shutdown_connections 2026-03-10T06:19:32.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 --2- 192.168.123.104:0/2381525636 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec103960 0x7f0dec103db0 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.298 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 --2- 192.168.123.104:0/2381525636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec102760 0x7f0dec102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.298 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 -- 192.168.123.104:0/2381525636 >> 192.168.123.104:0/2381525636 conn(0x7f0dec0fdcf0 msgr2=0x7f0dec100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:32.298 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 -- 192.168.123.104:0/2381525636 shutdown_connections 2026-03-10T06:19:32.298 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.296+0000 7f0df170f700 1 -- 192.168.123.104:0/2381525636 wait complete. 2026-03-10T06:19:32.298 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 Processor -- start 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 -- start start 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec103960 0x7f0dec198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0dec198b80 con 0x7f0dec102760 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.297+0000 7f0df170f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0dec198cc0 con 0x7f0dec103960 2026-03-10T06:19:32.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:57188/0 (socket says 192.168.123.104:57188) 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 -- 192.168.123.104:0/2038945973 learned_addr learned my addr 192.168.123.104:0/2038945973 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0dea7fc700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec103960 0x7f0dec198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 -- 192.168.123.104:0/2038945973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec103960 msgr2=0x7f0dec198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec103960 0x7f0dec198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.298+0000 7f0deaffd700 1 -- 
192.168.123.104:0/2038945973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0de00097e0 con 0x7f0dec102760 2026-03-10T06:19:32.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0deaffd700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f0ddc00d900 tx=0x7f0ddc00dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:32.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ddc0041d0 con 0x7f0dec102760 2026-03-10T06:19:32.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ddc004330 con 0x7f0dec102760 2026-03-10T06:19:32.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ddc003d70 con 0x7f0dec102760 2026-03-10T06:19:32.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0dec19d770 con 0x7f0dec102760 2026-03-10T06:19:32.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.299+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0dec075360 con 0x7f0dec102760 2026-03-10T06:19:32.302 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.300+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0dec066e40 con 0x7f0dec102760 2026-03-10T06:19:32.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.301+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0ddc009730 con 0x7f0dec102760 2026-03-10T06:19:32.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.301+0000 7f0dd3fff700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 0x7f0dd406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:32.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.301+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0ddc08b4f0 con 0x7f0dec102760 2026-03-10T06:19:32.304 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.302+0000 7f0dea7fc700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 0x7f0dd406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:32.304 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.302+0000 7f0dea7fc700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 0x7f0dd406ec50 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0de000b5c0 tx=0x7f0de0005fb0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:32.305 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.303+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0ddc059ac0 con 0x7f0dec102760 2026-03-10T06:19:32.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:32 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2961735742' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:32.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:32 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3767709546' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T06:19:32.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:32 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2961735742' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:32.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:32 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/3767709546' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T06:19:32.453 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.451+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f0dec075af0 con 0x7f0dec102760 2026-03-10T06:19:32.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.455+0000 7f0dd3fff700 1 -- 192.168.123.104:0/2038945973 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7f0ddc059650 con 0x7f0dec102760 2026-03-10T06:19:32.457 INFO:teuthology.orchestra.run.vm04.stdout:[client.0] 2026-03-10T06:19:32.457 INFO:teuthology.orchestra.run.vm04.stdout: key = AQD0t69p+SkNGxAAiWzrvjLujjCsvbD/VReg5A== 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.461+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 msgr2=0x7f0dd406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.461+0000 7f0df170f700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 0x7f0dd406ec50 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0de000b5c0 tx=0x7f0de0005fb0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.461+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 
msgr2=0x7f0dec198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.461+0000 7f0df170f700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f0ddc00d900 tx=0x7f0ddc00dc10 comp rx=0 tx=0).stop 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 shutdown_connections 2026-03-10T06:19:32.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f0dd406c7a0 0x7f0dd406ec50 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dec102760 0x7f0dec198020 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 --2- 192.168.123.104:0/2038945973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0dec103960 0x7f0dec198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 >> 192.168.123.104:0/2038945973 conn(0x7f0dec0fdcf0 msgr2=0x7f0dec106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:32.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 shutdown_connections 2026-03-10T06:19:32.464 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:32.462+0000 7f0df170f700 1 -- 192.168.123.104:0/2038945973 wait complete. 2026-03-10T06:19:32.512 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:19:32.512 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-10T06:19:32.512 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-10T06:19:32.550 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T06:19:32.704 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm06/config 2026-03-10T06:19:32.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.994+0000 7fccf16ae700 1 -- 192.168.123.106:0/3031742934 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec103900 msgr2=0x7fccec105ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.994+0000 7fccf16ae700 1 --2- 192.168.123.106:0/3031742934 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec103900 0x7fccec105ce0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fcce0009b00 tx=0x7fcce0009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 -- 192.168.123.106:0/3031742934 shutdown_connections 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 --2- 192.168.123.106:0/3031742934 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec103900 0x7fccec105ce0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 --2- 192.168.123.106:0/3031742934 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec100fe0 0x7fccec1033c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 -- 192.168.123.106:0/3031742934 >> 192.168.123.106:0/3031742934 conn(0x7fccec0fa9e0 msgr2=0x7fccec0fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 -- 192.168.123.106:0/3031742934 shutdown_connections 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.995+0000 7fccf16ae700 1 -- 192.168.123.106:0/3031742934 wait complete. 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 Processor -- start 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 -- start start 2026-03-10T06:19:32.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec103900 0x7fccec196390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fcceaffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccec1969b0 con 0x7fccec103900 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fccf16ae700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccec196af0 con 0x7fccec100fe0 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fcceaffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:50116/0 (socket says 192.168.123.106:50116) 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.996+0000 7fcceaffd700 1 -- 192.168.123.106:0/4083509174 learned_addr learned my addr 192.168.123.106:0/4083509174 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccea7fc700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec103900 0x7fccec196390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fcceaffd700 1 -- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec103900 msgr2=0x7fccec196390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:32.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fcceaffd700 1 
--2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec103900 0x7fccec196390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fcceaffd700 1 -- 192.168.123.106:0/4083509174 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcce00097e0 con 0x7fccec100fe0 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fcceaffd700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fccdc00ba70 tx=0x7fccdc00be30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccdc00c780 con 0x7fccec100fe0 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fccdc00cdc0 con 0x7fccec100fe0 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccdc012550 con 0x7fccec100fe0 2026-03-10T06:19:32.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fccec0fee30 con 0x7fccec100fe0 2026-03-10T06:19:32.998 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.997+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fccec0ff380 con 0x7fccec100fe0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.999+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fccdc014440 con 0x7fccec100fe0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.999+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fccec190090 con 0x7fccec100fe0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.999+0000 7fccd3fff700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fccd406c630 0x7fccd406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:32.999+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fccdc08a7a0 con 0x7fccec100fe0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.000+0000 7fccea7fc700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fccd406c630 0x7fccd406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.000+0000 7fccea7fc700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] 
conn(0x7fccd406c630 0x7fccd406eae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fcce000b5c0 tx=0x7fcce0005fd0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:33.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.003+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fccdc058cf0 con 0x7fccec100fe0 2026-03-10T06:19:33.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.150+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fccec0fefc0 con 0x7fccec100fe0 2026-03-10T06:19:33.151 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:33 vm06 ceph-mon[58974]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:33.151 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:33 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2038945973' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:33.151 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:33 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/2038945973' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T06:19:33.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.157+0000 7fccd3fff700 1 -- 192.168.123.106:0/4083509174 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7fccdc019050 con 0x7fccec100fe0 2026-03-10T06:19:33.157 INFO:teuthology.orchestra.run.vm06.stdout:[client.1] 2026-03-10T06:19:33.157 INFO:teuthology.orchestra.run.vm06.stdout: key = AQD1t69pCywTCRAAEusAgi4hyzbikiqow26Wlw== 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fccd406c630 msgr2=0x7fccd406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fccd406c630 0x7fccd406eae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fcce000b5c0 tx=0x7fcce0005fd0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 msgr2=0x7fccec195e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 secure :-1 
s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fccdc00ba70 tx=0x7fccdc00be30 comp rx=0 tx=0).stop 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 shutdown_connections 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fccd406c630 0x7fccd406eae0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccec100fe0 0x7fccec195e50 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 --2- 192.168.123.106:0/4083509174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fccec103900 0x7fccec196390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.159+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 >> 192.168.123.106:0/4083509174 conn(0x7fccec0fa9e0 msgr2=0x7fccec0fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.160+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 shutdown_connections 2026-03-10T06:19:33.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:19:33.160+0000 7fccf16ae700 1 -- 192.168.123.106:0/4083509174 wait complete. 
2026-03-10T06:19:33.201 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:19:33.201 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-10T06:19:33.202 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-10T06:19:33.241 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-10T06:19:33.241 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-10T06:19:33.241 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mgr dump --format=json 2026-03-10T06:19:33.404 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:33.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:33 vm04 ceph-mon[51058]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:33.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:33 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2038945973' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:33.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:33 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2038945973' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.669+0000 7f629f129700 1 -- 192.168.123.104:0/663203344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6298068490 msgr2=0x7f6298068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.669+0000 7f629f129700 1 --2- 192.168.123.104:0/663203344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6298068490 0x7f6298068900 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f628c009b00 tx=0x7f628c009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 7f629f129700 1 -- 192.168.123.104:0/663203344 shutdown_connections 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 7f629f129700 1 --2- 192.168.123.104:0/663203344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6298068490 0x7f6298068900 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 7f629f129700 1 --2- 192.168.123.104:0/663203344 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62981013c0 0x7f6298101790 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 7f629f129700 1 -- 192.168.123.104:0/663203344 >> 192.168.123.104:0/663203344 conn(0x7f62980754a0 msgr2=0x7f62980758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 
7f629f129700 1 -- 192.168.123.104:0/663203344 shutdown_connections 2026-03-10T06:19:33.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.670+0000 7f629f129700 1 -- 192.168.123.104:0/663203344 wait complete. 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 Processor -- start 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 -- start start 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 0x7f6298198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f62981013c0 0x7f6298198880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6298198f60 con 0x7f62981013c0 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.671+0000 7f629f129700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f629819ccf0 con 0x7f6298068490 2026-03-10T06:19:33.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f6297fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f62981013c0 0x7f6298198880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 0x7f6298198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 0x7f6298198340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55480/0 (socket says 192.168.123.104:55480) 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 -- 192.168.123.104:0/1435245220 learned_addr learned my addr 192.168.123.104:0/1435245220 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 -- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f62981013c0 msgr2=0x7f6298198880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f62981013c0 0x7f6298198880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 -- 192.168.123.104:0/1435245220 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f628c0097e0 con 0x7f6298068490 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.672+0000 7f629cec5700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f6298068490 0x7f6298198340 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f628800d8d0 tx=0x7f628800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.673+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6288009940 con 0x7f6298068490 2026-03-10T06:19:33.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.673+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f629819cfd0 con 0x7f6298068490 2026-03-10T06:19:33.675 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.673+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f629819d520 con 0x7f6298068490 2026-03-10T06:19:33.675 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.673+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6288010460 con 0x7f6298068490 2026-03-10T06:19:33.675 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.673+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f628800f5d0 con 0x7f6298068490 2026-03-10T06:19:33.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.674+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f628800f7e0 con 0x7f6298068490 2026-03-10T06:19:33.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.674+0000 7f6295ffb700 1 --2- 192.168.123.104:0/1435245220 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 0x7f628006eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:33.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.675+0000 7f6297fff700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 0x7f628006eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:33.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.675+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f628808b490 con 0x7f6298068490 2026-03-10T06:19:33.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.675+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f629804ea50 con 0x7f6298068490 2026-03-10T06:19:33.677 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.675+0000 7f6297fff700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 0x7f628006eba0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f628c00b5c0 tx=0x7f628c005fd0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:33.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.678+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6288016080 con 0x7f6298068490 2026-03-10T06:19:33.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.816+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7f6298199740 con 0x7f6298068490 2026-03-10T06:19:33.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.818+0000 7f6295ffb700 1 -- 192.168.123.104:0/1435245220 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+172844 (secure 0 0 0) 0x7f6288020070 con 0x7f6298068490 2026-03-10T06:19:33.820 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.823+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 msgr2=0x7f628006eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.823+0000 7f629f129700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 0x7f628006eba0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f628c00b5c0 tx=0x7f628c005fd0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.823+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 msgr2=0x7f6298198340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.823+0000 7f629f129700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 0x7f6298198340 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f628800d8d0 tx=0x7f628800dc90 comp rx=0 tx=0).stop 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 shutdown_connections 
2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f628006c6f0 0x7f628006eba0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6298068490 0x7f6298198340 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 --2- 192.168.123.104:0/1435245220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f62981013c0 0x7f6298198880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:33.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 >> 192.168.123.104:0/1435245220 conn(0x7f62980754a0 msgr2=0x7f62980fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:33.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 shutdown_connections 2026-03-10T06:19:33.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:33.824+0000 7f629f129700 1 -- 192.168.123.104:0/1435245220 wait complete. 
2026-03-10T06:19:33.895 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":19,"active_gid":14241,"active_name":"vm04.exdvdb","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":2},{"type":"v1","addr":"192.168.123.104:6801","nonce":2}]},"active_addr":"192.168.123.104:6801/2","active_change":"2026-03-10T06:18:04.067341+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14266,"name":"vm06.wwotdr","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts 
to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format 
HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack traces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.104:8443/","prometheus":"http://192.168.123.104:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":1794352543}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":2105211783}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":3321367007}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.104:0","nonce":140224168}]}]} 2026-03-10T06:19:33.896 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 
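The large JSON blob above is the mgr module dump the test harness inspects before declaring "mgr available!": each module carries a `module_options` map (name, type, level, `default_value`, …) plus a top-level `always_on_modules` table keyed by release. As a minimal, hypothetical sketch (the payload below reproduces only a few fields from the dump above; the real output carries every module and option):

```python
import json

# Trimmed stand-in shaped like the mgr module dump logged above;
# only the zabbix module and the reef always-on list are reproduced.
mgr_dump = json.loads("""
{
  "modules": [
    {"name": "zabbix", "can_run": true, "error_string": "",
     "module_options": {
       "zabbix_port": {"name": "zabbix_port", "type": "int",
                       "level": "advanced", "default_value": "10051"}
     }}
  ],
  "always_on_modules": {
    "reef": ["balancer", "crash", "devicehealth", "orchestrator",
             "pg_autoscaler", "progress", "rbd_support", "status",
             "telemetry", "volumes"]
  }
}
""")

def option_default(dump, module, option):
    """Return the default_value string for a module option, or None."""
    for mod in dump["modules"]:
        if mod["name"] == module:
            opt = mod["module_options"].get(option)
            return opt["default_value"] if opt else None
    return None

print(option_default(mgr_dump, "zabbix", "zabbix_port"))      # 10051
print("telemetry" in mgr_dump["always_on_modules"]["reef"])   # True
```

Note that `default_value` is always a string in this dump, even for `int`/`bool` options, so callers must convert it themselves.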
2026-03-10T06:19:33.896 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-10T06:19:33.896 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd dump --format=json 2026-03-10T06:19:34.061 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:34.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.329+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/321686886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 msgr2=0x7fd1f81051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.329+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/321686886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81051e0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7fd1e8009b00 tx=0x7fd1e8009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:34.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/321686886 shutdown_connections 2026-03-10T06:19:34.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/321686886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81051e0 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/321686886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 -- 
192.168.123.104:0/321686886 >> 192.168.123.104:0/321686886 conn(0x7fd1f80754a0 msgr2=0x7fd1f80758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:34.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/321686886 shutdown_connections 2026-03-10T06:19:34.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.330+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/321686886 wait complete. 2026-03-10T06:19:34.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 Processor -- start 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 -- start start 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8198470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81989b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1f8199090 con 0x7fd1f8069000 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.331+0000 7fd1fd9c6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1f819ce20 con 0x7fd1f80686f0 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8198470 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8198470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55506/0 (socket says 192.168.123.104:55506) 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f67fc700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81989b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f6ffd700 1 -- 192.168.123.104:0/781512930 learned_addr learned my addr 192.168.123.104:0/781512930 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:34.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f67fc700 1 -- 192.168.123.104:0/781512930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 msgr2=0x7fd1f8198470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f67fc700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8198470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f67fc700 1 -- 192.168.123.104:0/781512930 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fd1e80097e0 con 0x7fd1f8069000 2026-03-10T06:19:34.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.332+0000 7fd1f67fc700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81989b0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7fd1e8009fd0 tx=0x7fd1e8004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:34.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.333+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd1e801d070 con 0x7fd1f8069000 2026-03-10T06:19:34.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.333+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd1e8004b90 con 0x7fd1f8069000 2026-03-10T06:19:34.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.333+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd1e800f670 con 0x7fd1f8069000 2026-03-10T06:19:34.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.333+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1f819d0a0 con 0x7fd1f8069000 2026-03-10T06:19:34.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.333+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1f819d590 con 0x7fd1f8069000 2026-03-10T06:19:34.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.334+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd1f804ea50 con 0x7fd1f8069000 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.337+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd1e8022470 con 0x7fd1f8069000 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.338+0000 7fd1fc9c4700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 0x7fd1e406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.338+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd1e808d3d0 con 0x7fd1f8069000 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.338+0000 7fd1f6ffd700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 0x7fd1e406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.338+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd1e808d7b0 con 0x7fd1f8069000 2026-03-10T06:19:34.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.339+0000 7fd1f6ffd700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 0x7fd1e406ec50 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fd1e0005950 tx=0x7fd1e000a400 comp rx=0 tx=0).ready 
entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:34 vm06 ceph-mon[58974]: from='client.? 192.168.123.106:0/4083509174' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:34 vm06 ceph-mon[58974]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:34 vm06 ceph-mon[58974]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T06:19:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:34 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/1435245220' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T06:19:34.392 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:34 vm04 ceph-mon[51058]: from='client.? 192.168.123.106:0/4083509174' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:34.392 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:34 vm04 ceph-mon[51058]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T06:19:34.392 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:34 vm04 ceph-mon[51058]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T06:19:34.392 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:34 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/1435245220' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T06:19:34.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.460+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fd1f8066e40 con 0x7fd1f8069000 2026-03-10T06:19:34.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.461+0000 7fd1fc9c4700 1 -- 192.168.123.104:0/781512930 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11260 (secure 0 0 0) 0x7fd1e805fe60 con 0x7fd1f8069000 2026-03-10T06:19:34.463 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:34.463 
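The mon audit entries above show the `auth get-or-create` mon_command for `client.1`, whose `caps` field is a flat list alternating daemon type and cap string. A small illustrative helper (`auth_get_or_create` is a name invented here, not a teuthology function) that builds that payload shape:

```python
import json

# Sketch of the mon_command payload seen in the mon audit log above;
# the cap strings come straight from those log lines.
def auth_get_or_create(entity, caps):
    """Build the command dict; caps dict -> flat [type, cap, type, cap, ...]."""
    flat = [tok for pair in caps.items() for tok in pair]
    return {"prefix": "auth get-or-create", "entity": entity, "caps": flat}

cmd = auth_get_or_create("client.1", {
    "mon": "allow *", "osd": "allow *", "mds": "allow *", "mgr": "allow *",
})
print(json.dumps(cmd))
```

The flat list form matches what the mon logs as `cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", ...]}]`.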
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":34,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","created":"2026-03-10T06:16:43.937039+0000","modified":"2026-03-10T06:19:30.067285+0000","last_up_change":"2026-03-10T06:19:29.056279+0000","last_in_change":"2026-03-10T06:19:17.791524+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T06:19:00.284324+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"24cf9fc9-b995-47ea-a145-3fd48dc1ed14","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6803","nonce":353393816}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6805","nonce":353393816}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6809","nonce":353393816}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6807","nonce":353393816}]},"public_addr":"192.168.123.104:6803/353393816","cluster_addr":"192.168.123.104:6805/353393816","heartbeat_back_addr":"192.168.123.104:6809/353393816","heartbeat_front_addr":"192.168.123.104:6807/353393816","state":["exists","up"]},{"osd":1,"uuid":"38852b8c-2ea4-46c8-a734-cf521893e9b5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6811","nonce":3264858383}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812
","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6813","nonce":3264858383}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6817","nonce":3264858383}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6815","nonce":3264858383}]},"public_addr":"192.168.123.104:6811/3264858383","cluster_addr":"192.168.123.104:6813/3264858383","heartbeat_back_addr":"192.168.123.104:6817/3264858383","heartbeat_front_addr":"192.168.123.104:6815/3264858383","state":["exists","up"]},{"osd":2,"uuid":"7fc62f1e-5fa4-44db-82a0-4b766c28a491","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6819","nonce":4287451356}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6821","nonce":4287451356}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6825","nonce":4287451356}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6823","nonce":4287451356}]},"public_addr":"192.168.123.104:6819/4287451356","cluster_addr":"192.168.123.104:6821/4287451356","heartbeat_back_addr":"192.168.123.104:6825/4287451356","heartbeat_front_addr":"192.168.123.104:6823/4287451356","state":["exists","up"]},{"osd":3,"uuid":"343c7178-1f64-4726-9c13-d1d348b25384","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":1172578215},{"type"
:"v1","addr":"192.168.123.106:6801","nonce":1172578215}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6803","nonce":1172578215}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6807","nonce":1172578215}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6805","nonce":1172578215}]},"public_addr":"192.168.123.106:6801/1172578215","cluster_addr":"192.168.123.106:6803/1172578215","heartbeat_back_addr":"192.168.123.106:6807/1172578215","heartbeat_front_addr":"192.168.123.106:6805/1172578215","state":["exists","up"]},{"osd":4,"uuid":"0e68b450-e783-4ec6-99f7-0610ac3453d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6809","nonce":1799920647}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6811","nonce":1799920647}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6815","nonce":1799920647}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6813","nonce":1799920647}]},"public_addr":"192.168.123.106:6809/1799920647","cluster_addr":"192.168.123.106:6811/1799920647","heartbeat_back_addr":"192.168.123.106:6815/1799920647","heartbeat_front_addr":"192.168.123.106:6813/1799920647","state":["exists","up"]},{"osd":5,"uuid":"014751cc-c4ec-4d41-86ea-4b72c6f87e86","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6817","nonce":240520744}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6819","nonce":240520744}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6823","nonce":240520744}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6821","nonce":240520744}]},"public_addr":"192.168.123.106:6817/240520744","cluster_addr":"192.168.123.106:6819/240520744","heartbeat_back_addr":"192.168.123.106:6823/240520744","heartbeat_front_addr":"192.168.123.106:6821/240520744","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:36.888061+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:48.702876+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:58.655570+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:07.794122+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:17.462801+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T06:19:27.109339+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.104:0/4184397752":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/63423144":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/1120149152":"2026-03-11T06:17:13.124403+0000","192.168.123.104:0/2832999113":"2026-03-11T06:17:13.124403+0000","192.168.123.104:6801/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/4014113593":"2026-03-11T06:18:04.067248+0000","192.168.123.104:6800/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/3171159121":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/2564492007":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1133148001":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1930923909":"2026-03-11T06:17:13.124403+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.464+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 msgr2=0x7fd1e406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.464+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 0x7fd1e406ec50 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fd1e0005950 tx=0x7fd1e000a400 comp rx=0 tx=0).stop 2026-03-10T06:19:34.466 
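The `osd dump` JSON above is what the "waiting for all up" step polls: each entry in `osds` reports `up` and `in` as 0/1 flags, and the harness proceeds once every OSD shows both. A minimal sketch of that check (the payload keeps only the fields the check needs; the real dump carries addresses, xinfo, pools, and more per OSD):

```python
import json

# Trimmed stand-in for the `ceph osd dump --format=json` payload above.
osd_dump = json.loads("""
{
  "epoch": 34,
  "max_osd": 6,
  "osds": [
    {"osd": 0, "up": 1, "in": 1},
    {"osd": 1, "up": 1, "in": 1},
    {"osd": 2, "up": 1, "in": 1},
    {"osd": 3, "up": 1, "in": 1},
    {"osd": 4, "up": 1, "in": 1},
    {"osd": 5, "up": 1, "in": 1}
  ]
}
""")

def all_up(dump):
    """True when every OSD in the dump reports both up and in."""
    return all(o["up"] == 1 and o["in"] == 1 for o in dump["osds"])

print(all_up(osd_dump))  # True
```

In the run above this predicate flips to true at 06:19:34.557, at which point the task logs "all up!" and re-runs `osd dump` once more.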
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.464+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 msgr2=0x7fd1f81989b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.464+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81989b0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7fd1e8009fd0 tx=0x7fd1e8004930 comp rx=0 tx=0).stop 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 shutdown_connections 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd1e406c7a0 0x7fd1e406ec50 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1f80686f0 0x7fd1f8198470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 --2- 192.168.123.104:0/781512930 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd1f8069000 0x7fd1f81989b0 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 >> 192.168.123.104:0/781512930 conn(0x7fd1f80754a0 msgr2=0x7fd1f81021b0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 shutdown_connections 2026-03-10T06:19:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.465+0000 7fd1fd9c6700 1 -- 192.168.123.104:0/781512930 wait complete. 2026-03-10T06:19:34.557 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-10T06:19:34.557 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd dump --format=json 2026-03-10T06:19:34.714 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.984+0000 7fb93dbac700 1 -- 192.168.123.104:0/2612487341 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938102780 msgr2=0x7fb938102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.984+0000 7fb93dbac700 1 --2- 192.168.123.104:0/2612487341 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938102780 0x7fb938102bf0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fb928009a60 tx=0x7fb928009d70 comp rx=0 tx=0).stop 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 -- 192.168.123.104:0/2612487341 shutdown_connections 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 --2- 192.168.123.104:0/2612487341 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938102780 0x7fb938102bf0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 --2- 
192.168.123.104:0/2612487341 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938108780 0x7fb938108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 -- 192.168.123.104:0/2612487341 >> 192.168.123.104:0/2612487341 conn(0x7fb9380fe280 msgr2=0x7fb938100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 -- 192.168.123.104:0/2612487341 shutdown_connections 2026-03-10T06:19:34.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.985+0000 7fb93dbac700 1 -- 192.168.123.104:0/2612487341 wait complete. 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 Processor -- start 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 -- start start 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938102780 0x7fb938072820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb93806dd60 con 0x7fb938102780 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.986+0000 7fb93dbac700 1 -- --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb93806ded0 con 0x7fb938108780 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55526/0 (socket says 192.168.123.104:55526) 2026-03-10T06:19:34.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 -- 192.168.123.104:0/4207965809 learned_addr learned my addr 192.168.123.104:0/4207965809 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb9377fe700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938102780 0x7fb938072820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 -- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938102780 msgr2=0x7fb938072820 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fb938102780 0x7fb938072820 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 -- 192.168.123.104:0/4207965809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9200097e0 con 0x7fb938108780 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb9377fe700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938102780 0x7fb938072820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:19:34.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.987+0000 7fb936ffd700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fb92800b5c0 tx=0x7fb92800f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:34.990 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.988+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb92801d070 con 0x7fb938108780 2026-03-10T06:19:34.990 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.988+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb92800fd20 con 0x7fb938108780 2026-03-10T06:19:34.990 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.988+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb928017760 con 0x7fb938108780 2026-03-10T06:19:34.990 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.988+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb928009710 con 0x7fb938108780 2026-03-10T06:19:34.990 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.989+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb93806e4b0 con 0x7fb938108780 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.990+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb928021410 con 0x7fb938108780 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.990+0000 7fb934ff9700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 0x7fb92406ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.990+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb92808c5a0 con 0x7fb938108780 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.991+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb918005320 con 0x7fb938108780 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.992+0000 7fb9377fe700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 0x7fb92406ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:34.994 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.992+0000 7fb9377fe700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 0x7fb92406ec00 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb9381038c0 tx=0x7fb920009500 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:34.995 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:34.994+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb928057190 con 0x7fb938108780 2026-03-10T06:19:35.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.102+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fb918005190 con 0x7fb938108780 2026-03-10T06:19:35.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.104+0000 7fb934ff9700 1 -- 192.168.123.104:0/4207965809 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11260 (secure 0 0 0) 0x7fb928026030 con 0x7fb938108780 2026-03-10T06:19:35.105 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:35.106 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":34,"fsid":"9c59102a-1c48-11f1-b618-035af535377d","created":"2026-03-10T06:16:43.937039+0000","modified":"2026-03-10T06:19:30.067285+0000","last_up_change":"2026-03-10T06:19:29.056279+0000","last_in_change":"2026-03-10T06:19:17.791524+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T06:19:00.284324+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"24cf9fc9-b995-47ea-a145-3fd48dc1ed14","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6803","nonce":353393816}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6805","nonce":353393816}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6809","nonce":353393816}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":353393816},{"type":"v1","addr":"192.168.123.104:6807","nonce":353393816}]},"public_addr":"192.168.123.104:6803/353393816","cluster_addr":"192.168.123.104:6805/353393816","heartbeat_back_addr":"192.168.123.104:6809/353393816","heartbeat_front_addr":"192.168.123.104:6807/353393816","state":["exists","up"]},{"osd":1,"uuid":"38852b8c-2ea4-46c8-a734-cf521893e9b5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":14,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6811","nonce":3264858383}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812
","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6813","nonce":3264858383}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6817","nonce":3264858383}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":3264858383},{"type":"v1","addr":"192.168.123.104:6815","nonce":3264858383}]},"public_addr":"192.168.123.104:6811/3264858383","cluster_addr":"192.168.123.104:6813/3264858383","heartbeat_back_addr":"192.168.123.104:6817/3264858383","heartbeat_front_addr":"192.168.123.104:6815/3264858383","state":["exists","up"]},{"osd":2,"uuid":"7fc62f1e-5fa4-44db-82a0-4b766c28a491","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":18,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6819","nonce":4287451356}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6821","nonce":4287451356}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6825","nonce":4287451356}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":4287451356},{"type":"v1","addr":"192.168.123.104:6823","nonce":4287451356}]},"public_addr":"192.168.123.104:6819/4287451356","cluster_addr":"192.168.123.104:6821/4287451356","heartbeat_back_addr":"192.168.123.104:6825/4287451356","heartbeat_front_addr":"192.168.123.104:6823/4287451356","state":["exists","up"]},{"osd":3,"uuid":"343c7178-1f64-4726-9c13-d1d348b25384","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":1172578215},{"type"
:"v1","addr":"192.168.123.106:6801","nonce":1172578215}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6803","nonce":1172578215}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6807","nonce":1172578215}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1172578215},{"type":"v1","addr":"192.168.123.106:6805","nonce":1172578215}]},"public_addr":"192.168.123.106:6801/1172578215","cluster_addr":"192.168.123.106:6803/1172578215","heartbeat_back_addr":"192.168.123.106:6807/1172578215","heartbeat_front_addr":"192.168.123.106:6805/1172578215","state":["exists","up"]},{"osd":4,"uuid":"0e68b450-e783-4ec6-99f7-0610ac3453d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6809","nonce":1799920647}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6811","nonce":1799920647}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6815","nonce":1799920647}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1799920647},{"type":"v1","addr":"192.168.123.106:6813","nonce":1799920647}]},"public_addr":"192.168.123.106:6809/1799920647","cluster_addr":"192.168.123.106:6811/1799920647","heartbeat_back_addr":"192.168.123.106:6815/1799920647","heartbeat_front_addr":"192.168.123.106:6813/1799920647","state":["exists","up"]},{"osd":5,"uuid":"014751cc-c4ec-4d41-86ea-4b72c6f87e86","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6817","nonce":240520744}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6819","nonce":240520744}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6823","nonce":240520744}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":240520744},{"type":"v1","addr":"192.168.123.106:6821","nonce":240520744}]},"public_addr":"192.168.123.106:6817/240520744","cluster_addr":"192.168.123.106:6819/240520744","heartbeat_back_addr":"192.168.123.106:6823/240520744","heartbeat_front_addr":"192.168.123.106:6821/240520744","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:36.888061+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:48.702876+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:18:58.655570+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:07.794122+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T06:19:17.462801+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T06:19:27.109339+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.104:0/4184397752":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/63423144":"2026-03-11T06:18:04.067248+0000","192.168.123.104:0/1120149152":"2026-03-11T06:17:13.124403+0000","192.168.123.104:0/2832999113":"2026-03-11T06:17:13.124403+0000","192.168.123.104:6801/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/4014113593":"2026-03-11T06:18:04.067248+0000","192.168.123.104:6800/2":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/3171159121":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/2564492007":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1133148001":"2026-03-11T06:16:59.244005+0000","192.168.123.104:0/1930923909":"2026-03-11T06:17:13.124403+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 msgr2=0x7fb92406ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 0x7fb92406ec00 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb9381038c0 tx=0x7fb920009500 comp rx=0 tx=0).stop 2026-03-10T06:19:35.108 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 msgr2=0x7fb93806d820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fb92800b5c0 tx=0x7fb92800f740 comp rx=0 tx=0).stop 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 shutdown_connections 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.106+0000 7fb93dbac700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb92406c750 0x7fb92406ec00 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.107+0000 7fb93dbac700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb938102780 0x7fb938072820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.107+0000 7fb93dbac700 1 --2- 192.168.123.104:0/4207965809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb938108780 0x7fb93806d820 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.107+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 >> 192.168.123.104:0/4207965809 conn(0x7fb9380fe280 msgr2=0x7fb9380ffa40 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.107+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 shutdown_connections 2026-03-10T06:19:35.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:35.107+0000 7fb93dbac700 1 -- 192.168.123.104:0/4207965809 wait complete. 2026-03-10T06:19:35.151 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.0 flush_pg_stats 2026-03-10T06:19:35.151 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.1 flush_pg_stats 2026-03-10T06:19:35.151 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.2 flush_pg_stats 2026-03-10T06:19:35.151 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.3 flush_pg_stats 2026-03-10T06:19:35.151 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.4 flush_pg_stats 2026-03-10T06:19:35.152 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph tell osd.5 flush_pg_stats 2026-03-10T06:19:35.259 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:35 vm04 ceph-mon[51058]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:35.259 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:35 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist 
ls", "format": "json"}]: dispatch 2026-03-10T06:19:35.259 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:35 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/781512930' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:35 vm06 ceph-mon[58974]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:35 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:19:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:35 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/781512930' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:35.571 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:35.582 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:35.653 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:35.923 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:35.926 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:35.930 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:36.166 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:36 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/4207965809' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:36.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:36 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/4207965809' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.409+0000 7fe2e7147700 1 -- 192.168.123.104:0/2496929027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d80ac3c0 msgr2=0x7fe2d80a4cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.409+0000 7fe2e7147700 1 --2- 192.168.123.104:0/2496929027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d80ac3c0 0x7fe2d80a4cd0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fe2dc009a60 tx=0x7fe2dc009d70 comp rx=0 tx=0).stop 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 -- 192.168.123.104:0/2496929027 shutdown_connections 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 --2- 192.168.123.104:0/2496929027 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d80a5210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 --2- 192.168.123.104:0/2496929027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d80ac3c0 0x7fe2d80a4cd0 secure :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fe2dc009a60 tx=0x7fe2dc009d70 comp rx=0 tx=0).stop 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 -- 192.168.123.104:0/2496929027 >> 192.168.123.104:0/2496929027 conn(0x7fe2d801a290 
msgr2=0x7fe2d801a690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 -- 192.168.123.104:0/2496929027 shutdown_connections 2026-03-10T06:19:36.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.410+0000 7fe2e7147700 1 -- 192.168.123.104:0/2496929027 wait complete. 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.408+0000 7f81588a7700 1 -- 192.168.123.104:0/180294878 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 msgr2=0x7f815410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.408+0000 7f81588a7700 1 --2- 192.168.123.104:0/180294878 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f815410edb0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f8144009b50 tx=0x7f8144009e60 comp rx=0 tx=0).stop 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 -- 192.168.123.104:0/180294878 shutdown_connections 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 --2- 192.168.123.104:0/180294878 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 --2- 192.168.123.104:0/180294878 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f815410edb0 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 -- 192.168.123.104:0/180294878 >> 192.168.123.104:0/180294878 
conn(0x7f815406c6c0 msgr2=0x7f815406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 -- 192.168.123.104:0/180294878 shutdown_connections 2026-03-10T06:19:36.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7fe2e7147700 1 Processor -- start 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.412+0000 7fe2e7147700 1 -- start start 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.412+0000 7fe2e7147700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d81485c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.412+0000 7fe2e7147700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 0x7fe2d8142640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.412+0000 7fe2e7147700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2d81490c0 con 0x7fe2d80ac790 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.412+0000 7fe2e7147700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2d8142b80 con 0x7fe2d8148b00 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.414+0000 7fe2e6145700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d81485c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.414+0000 7fe2e6145700 1 
--2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d81485c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44738/0 (socket says 192.168.123.104:44738) 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.414+0000 7fe2e6145700 1 -- 192.168.123.104:0/3554576286 learned_addr learned my addr 192.168.123.104:0/3554576286 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.414+0000 7fe2e5944700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 0x7fe2d8142640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.411+0000 7f81588a7700 1 -- 192.168.123.104:0/180294878 wait complete. 
2026-03-10T06:19:36.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.416+0000 7fe2e5944700 1 -- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 msgr2=0x7fe2d81485c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.416+0000 7fe2e5944700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d81485c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.416+0000 7fe2e5944700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2dc009710 con 0x7fe2d8148b00 2026-03-10T06:19:36.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.416+0000 7fe2e5944700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 0x7fe2d8142640 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fe2d0009fd0 tx=0x7fe2d000ec90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.419+0000 7f81588a7700 1 Processor -- start 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.417+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2d000cb80 con 0x7fe2d8148b00 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.417+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2d8142e60 con 0x7fe2d8148b00 
2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.417+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2d81433b0 con 0x7fe2d8148b00 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.417+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe2d000eed0 con 0x7fe2d8148b00 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.417+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2d00185f0 con 0x7fe2d8148b00 2026-03-10T06:19:36.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.419+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fe2c4000ff0 con 0x7fe2d8148b00 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe2d000cce0 con 0x7fe2d8148b00 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7f81588a7700 1 -- start start 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7f81588a7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154119600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7f81588a7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f8154114600 unknown :-1 s=NONE pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7f81588a7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8154114bd0 con 0x7f815410e9e0 2026-03-10T06:19:36.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.423+0000 7f81588a7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8154114d40 con 0x7f8154071b60 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f8152d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154119600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f815259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f8154114600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f8152d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154119600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55566/0 (socket says 192.168.123.104:55566) 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f8152d9d700 1 -- 192.168.123.104:0/1868511713 learned_addr learned my addr 192.168.123.104:0/1868511713 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f815259c700 1 -- 
192.168.123.104:0/1868511713 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 msgr2=0x7f8154119600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f815259c700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154119600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f815259c700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81440097e0 con 0x7f815410e9e0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.424+0000 7f815259c700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f8154114600 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f814800b810 tx=0x7f814800bb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f814800d610 con 0x7f815410e9e0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f814800dc50 con 0x7f815410e9e0 2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f814800c390 con 0x7f815410e9e0 
2026-03-10T06:19:36.429 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7f81588a7700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8154114fd0 con 0x7f815410e9e0 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7f81588a7700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81541b7b20 con 0x7f815410e9e0 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2d77fe700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 0x7fe2cc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe2d0014070 con 0x7fe2d8148b00 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2d77fe700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 0x7fe2cc074740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fe2cc074df0 con 0x7fe2cc072330 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== mon.1 v2:192.168.123.106:3300/0 6 ==== 
mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7fe2d008c0b0 con 0x7fe2d8148b00 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2e6946700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 0x7fe2cc074740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.425+0000 7fe2e6145700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 0x7fe2cc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7fe2e6946700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 0x7fe2cc074740 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.430 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.426+0000 7fe2e6145700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 0x7fe2cc06ec50 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fe2dc000c00 tx=0x7fe2dc003820 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.437 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.430+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== osd.0 v2:192.168.123.104:6802/353393816 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fe2cc074df0 con 0x7fe2cc072330 2026-03-10T06:19:36.439 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.429+0000 7f81588a7700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f8140000ff0 con 0x7f815410e9e0 2026-03-10T06:19:36.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f814800c4f0 con 0x7f815410e9e0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 0x7f813c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f814808bb70 con 0x7f815410e9e0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 0x7f813c074740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f813c074df0 con 0x7f813c072330 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_get_version_reply(handle=1 
version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7f814800ddc0 con 0x7f815410e9e0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f815359e700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 0x7f813c074740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.437+0000 7f8152d9d700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 0x7f813c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.438+0000 7f815359e700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 0x7f813c074740 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.438+0000 7f8152d9d700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 0x7f813c06ec50 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8144009b20 tx=0x7f814400b560 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.440+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== osd.2 v2:192.168.123.104:6818/4287451356 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f813c074df0 con 0x7f813c072330 2026-03-10T06:19:36.459 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.454+0000 7f81588a7700 1 -- 192.168.123.104:0/1868511713 --> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f8140002da0 con 0x7f813c072330 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.456+0000 7f813bfff700 1 -- 192.168.123.104:0/1868511713 <== osd.2 v2:192.168.123.104:6818/4287451356 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f8140002da0 con 0x7f813c072330 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 msgr2=0x7f813c074740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 0x7f813c074740 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 msgr2=0x7f813c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 0x7f813c06ec50 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8144009b20 tx=0x7f814400b560 comp rx=0 tx=0).stop 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 msgr2=0x7f8154114600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f8154114600 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f814800b810 tx=0x7f814800bb20 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.457+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 shutdown_connections 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6818/4287451356,v1:192.168.123.104:6819/4287451356] conn(0x7f813c072330 0x7f813c074740 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f813c06c7a0 0x7f813c06ec50 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8154071b60 0x7f8154119600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 --2- 192.168.123.104:0/1868511713 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f815410e9e0 0x7f8154114600 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 >> 192.168.123.104:0/1868511713 conn(0x7f815406c6c0 msgr2=0x7f815406f7f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 shutdown_connections 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.459+0000 7f8139ffb700 1 -- 192.168.123.104:0/1868511713 wait complete. 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 -- 192.168.123.104:0/2160284246 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd8810e9e0 msgr2=0x7efd8810edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 --2- 192.168.123.104:0/2160284246 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd8810e9e0 0x7efd8810edb0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7efd8000b210 tx=0x7efd8000b520 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 -- 192.168.123.104:0/2160284246 shutdown_connections 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 --2- 192.168.123.104:0/2160284246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd88071b60 0x7efd88071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 --2- 192.168.123.104:0/2160284246 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd8810e9e0 0x7efd8810edb0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.460+0000 7efd8d00d700 1 -- 192.168.123.104:0/2160284246 >> 192.168.123.104:0/2160284246 conn(0x7efd8806c6c0 msgr2=0x7efd8806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 -- 192.168.123.104:0/2160284246 shutdown_connections 2026-03-10T06:19:36.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 -- 192.168.123.104:0/2160284246 wait complete. 2026-03-10T06:19:36.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 Processor -- start 2026-03-10T06:19:36.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 -- start start 2026-03-10T06:19:36.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 0x7efd881a4f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd8810e9e0 0x7efd881a5470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd881a5b50 con 0x7efd8810e9e0 2026-03-10T06:19:36.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.461+0000 7efd8d00d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd881a98e0 con 0x7efd88071b60 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 0x7efd881a4f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 0x7efd881a4f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55574/0 (socket says 192.168.123.104:55574) 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 -- 192.168.123.104:0/998297408 learned_addr learned my addr 192.168.123.104:0/998297408 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd877fe700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd8810e9e0 0x7efd881a5470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 -- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd8810e9e0 msgr2=0x7efd881a5470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd8810e9e0 0x7efd881a5470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.466+0000 7efd87fff700 1 -- 
192.168.123.104:0/998297408 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd80009e30 con 0x7efd88071b60 2026-03-10T06:19:36.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.467+0000 7efd87fff700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 0x7efd881a4f30 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7efd800087e0 tx=0x7efd800088c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.468+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd8000e050 con 0x7efd88071b60 2026-03-10T06:19:36.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.468+0000 7efd8d00d700 1 -- 192.168.123.104:0/998297408 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd881a9a80 con 0x7efd88071b60 2026-03-10T06:19:36.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.468+0000 7efd8d00d700 1 -- 192.168.123.104:0/998297408 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd881a9fd0 con 0x7efd88071b60 2026-03-10T06:19:36.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.468+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efd80008ed0 con 0x7efd88071b60 2026-03-10T06:19:36.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.468+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 --> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fe2c4002da0 con 0x7fe2cc072330 2026-03-10T06:19:36.474 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.472+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd8001bad0 con 0x7efd88071b60 2026-03-10T06:19:36.474 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.472+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7efd80019040 con 0x7efd88071b60 2026-03-10T06:19:36.474 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.473+0000 7efd8d00d700 1 -- 192.168.123.104:0/998297408 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7efd74000ff0 con 0x7efd88071b60 2026-03-10T06:19:36.474 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.472+0000 7fe2d77fe700 1 -- 192.168.123.104:0/3554576286 <== osd.0 v2:192.168.123.104:6802/353393816 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fe2c4002da0 con 0x7fe2cc072330 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.473+0000 7efd857fa700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 0x7efd7006ead0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.473+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7efd8008d2e0 con 0x7efd88071b60 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.473+0000 7efd857fa700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 0x7efd70074450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.473+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 --> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7efd70074b00 con 0x7efd70072040 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.474+0000 7efd8c80c700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 0x7efd70074450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.474+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7efd800048b0 con 0x7efd88071b60 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.474+0000 7efd877fe700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 0x7efd7006ead0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.474+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 msgr2=0x7fe2cc074740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.474+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 0x7fe2cc074740 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.475+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 msgr2=0x7fe2cc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.475+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 0x7fe2cc06ec50 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fe2dc000c00 tx=0x7fe2dc003820 comp rx=0 tx=0).stop 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.475+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 msgr2=0x7fe2d8142640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.475+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 0x7fe2d8142640 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fe2d0009fd0 tx=0x7fe2d000ec90 comp rx=0 tx=0).stop 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7efd8c80c700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 0x7efd70074450 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7efd877fe700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 0x7efd7006ead0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7efd881a6550 
tx=0x7efd78009500 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 shutdown_connections 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6802/353393816,v1:192.168.123.104:6803/353393816] conn(0x7fe2cc072330 0x7fe2cc074740 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fe2cc06c7a0 0x7fe2cc06ec50 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2d80ac790 0x7fe2d81485c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 --2- 192.168.123.104:0/3554576286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe2d8148b00 0x7fe2d8142640 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.476+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 >> 192.168.123.104:0/3554576286 conn(0x7fe2d801a290 msgr2=0x7fe2d80b5ed0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.477+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 shutdown_connections 2026-03-10T06:19:36.485 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.477+0000 7fe2e7147700 1 -- 192.168.123.104:0/3554576286 wait complete. 2026-03-10T06:19:36.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.477+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== osd.3 v2:192.168.123.106:6800/1172578215 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7efd70074b00 con 0x7efd70072040 2026-03-10T06:19:36.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.519+0000 7efd8d00d700 1 -- 192.168.123.104:0/998297408 --> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7efd74002da0 con 0x7efd70072040 2026-03-10T06:19:36.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.522+0000 7efd857fa700 1 -- 192.168.123.104:0/998297408 <== osd.3 v2:192.168.123.106:6800/1172578215 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7efd74002da0 con 0x7efd70072040 2026-03-10T06:19:36.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 msgr2=0x7efd70074450 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 0x7efd70074450 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 msgr2=0x7efd7006ead0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.525 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 0x7efd7006ead0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7efd881a6550 tx=0x7efd78009500 comp rx=0 tx=0).stop 2026-03-10T06:19:36.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 msgr2=0x7efd881a4f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.523+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 0x7efd881a4f30 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7efd800087e0 tx=0x7efd800088c0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 shutdown_connections 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:6800/1172578215,v1:192.168.123.106:6801/1172578215] conn(0x7efd70072040 0x7efd70074450 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7efd7006c620 0x7efd7006ead0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd88071b60 
0x7efd881a4f30 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 --2- 192.168.123.104:0/998297408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd8810e9e0 0x7efd881a5470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.525+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 >> 192.168.123.104:0/998297408 conn(0x7efd8806c6c0 msgr2=0x7efd8806fb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.529+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 shutdown_connections 2026-03-10T06:19:36.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.529+0000 7efd6effd700 1 -- 192.168.123.104:0/998297408 wait complete. 2026-03-10T06:19:36.676 INFO:teuthology.orchestra.run.vm04.stdout:77309411337 2026-03-10T06:19:36.676 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.2 2026-03-10T06:19:36.708 INFO:teuthology.orchestra.run.vm04.stdout:103079215111 2026-03-10T06:19:36.708 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.3 2026-03-10T06:19:36.736 INFO:teuthology.orchestra.run.vm04.stdout:38654705677 2026-03-10T06:19:36.736 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.0 2026-03-10T06:19:36.772 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 -- 192.168.123.104:0/436567837 
>> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224406d7a0 msgr2=0x7f224406dc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.772 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 --2- 192.168.123.104:0/436567837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224406d7a0 0x7f224406dc10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f2238009a60 tx=0x7f2238009d70 comp rx=0 tx=0).stop 2026-03-10T06:19:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 -- 192.168.123.104:0/436567837 shutdown_connections 2026-03-10T06:19:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 --2- 192.168.123.104:0/436567837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224406d7a0 0x7f224406dc10 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 --2- 192.168.123.104:0/436567837 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f224410ed80 0x7f224406d260 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 -- 192.168.123.104:0/436567837 >> 192.168.123.104:0/436567837 conn(0x7f224406c830 msgr2=0x7f2244071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 -- 192.168.123.104:0/436567837 shutdown_connections 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.770+0000 7f224ad5b700 1 -- 192.168.123.104:0/436567837 wait complete. 
2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.777+0000 7f224ad5b700 1 Processor -- start 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.777+0000 7f224ad5b700 1 -- start start 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.778+0000 7f224ad5b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.778+0000 7f224ad5b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2244112450 0x7f22441128c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.778+0000 7f224ad5b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2244117ae0 con 0x7f2244112450 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.778+0000 7f224ad5b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2244112e00 con 0x7f224410ed80 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.781+0000 7f2248af7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.781+0000 7f2248af7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:55592/0 (socket says 192.168.123.104:55592) 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.781+0000 7f2248af7700 1 -- 192.168.123.104:0/4014844779 learned_addr learned my addr 192.168.123.104:0/4014844779 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.781+0000 7f2243fff700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2244112450 0x7f22441128c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.782+0000 7f2248af7700 1 -- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2244112450 msgr2=0x7f22441128c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.782+0000 7f2248af7700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2244112450 0x7f22441128c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.782+0000 7f2248af7700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2238009710 con 0x7f224410ed80 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.782+0000 7f2248af7700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f223400ea30 tx=0x7f223400ed40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.783+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f223400cb80 con 0x7f224410ed80 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.783+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22441130e0 con 0x7f224410ed80 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.783+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22441a5d80 con 0x7f224410ed80 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.783+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2234004d10 con 0x7f224410ed80 2026-03-10T06:19:36.786 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.783+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2234010430 con 0x7f224410ed80 2026-03-10T06:19:36.789 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.786+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f22340106a0 con 0x7f224410ed80 2026-03-10T06:19:36.790 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.787+0000 7f2241ffb700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 0x7f222c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.790 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.788+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f2230000ff0 con 0x7f224410ed80 2026-03-10T06:19:36.790 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.788+0000 7f2243fff700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 0x7f222c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.791 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.789+0000 7f2243fff700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 0x7f222c06eb80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2238009a30 tx=0x7f2238019040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.791 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.789+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f2234014070 con 0x7f224410ed80 2026-03-10T06:19:36.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.789+0000 7f2241ffb700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 0x7f222c074670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.791+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f222c074d20 con 0x7f222c072260 2026-03-10T06:19:36.794 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.791+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7f2234057460 con 0x7f224410ed80 2026-03-10T06:19:36.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.791+0000 7f22492f8700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 0x7f222c074670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.794+0000 7f22492f8700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 0x7f222c074670 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.794+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== osd.4 v2:192.168.123.106:6808/1799920647 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f222c074d20 con 0x7f222c072260 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.823+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 --> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f2230002ce0 con 0x7f222c072260 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.823+0000 7f2241ffb700 1 -- 192.168.123.104:0/4014844779 <== osd.4 v2:192.168.123.106:6808/1799920647 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f2230002ce0 con 0x7f222c072260 2026-03-10T06:19:36.826 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- 192.168.123.104:0/2300271340 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d8071e40 msgr2=0x7f66d80722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 --2- 192.168.123.104:0/2300271340 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d8071e40 0x7f66d80722b0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f66d000b3a0 tx=0x7f66d000b6b0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- 192.168.123.104:0/2300271340 shutdown_connections 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 --2- 192.168.123.104:0/2300271340 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d8071e40 0x7f66d80722b0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 --2- 192.168.123.104:0/2300271340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d810c8b0 0x7f66d810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- 192.168.123.104:0/2300271340 >> 192.168.123.104:0/2300271340 conn(0x7f66d806c6c0 msgr2=0x7f66d806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- 192.168.123.104:0/2300271340 shutdown_connections 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- 192.168.123.104:0/2300271340 wait complete. 
2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 Processor -- start 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- start start 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d810c8b0 0x7f66d807cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f66d8081b60 con 0x7f66d807d490 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.824+0000 7f66df637700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f66d8081cd0 con 0x7f66d810c8b0 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:44818/0 (socket says 192.168.123.104:44818) 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 -- 192.168.123.104:0/1042582225 learned_addr learned my addr 192.168.123.104:0/1042582225 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 -- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d810c8b0 msgr2=0x7f66d807cf50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d810c8b0 0x7f66d807cf50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 -- 192.168.123.104:0/1042582225 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f66d4009710 con 0x7f66d807d490 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66dcbd2700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f66d0003c30 tx=0x7f66d0003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f66d000e050 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66df637700 1 -- 
192.168.123.104:0/1042582225 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f66d000b050 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.825+0000 7f66df637700 1 -- 192.168.123.104:0/1042582225 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f66d80822b0 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.827+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f66d0007e90 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.827+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f66d001ba40 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.828+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f66d0019040 con 0x7f66d807d490 2026-03-10T06:19:36.830 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.828+0000 7f66ce7fc700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 0x7f66c4070f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 msgr2=0x7f222c074670 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 
>> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 0x7f222c074670 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 msgr2=0x7f222c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 0x7f222c06eb80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2238009a30 tx=0x7f2238019040 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 msgr2=0x7f2244117450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f223400ea30 tx=0x7f223400ed40 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 shutdown_connections 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f222c06c6d0 0x7f222c06eb80 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f224410ed80 0x7f2244117450 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.106:6808/1799920647,v1:192.168.123.106:6809/1799920647] conn(0x7f222c072260 0x7f222c074670 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 --2- 192.168.123.104:0/4014844779 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2244112450 0x7f22441128c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 >> 192.168.123.104:0/4014844779 conn(0x7f224406c830 msgr2=0x7f224410ca30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 shutdown_connections 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f224ad5b700 1 -- 192.168.123.104:0/4014844779 wait complete. 
2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.831+0000 7f66dd3d3700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 0x7f66c4070f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f66dd3d3700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 0x7f66c4070f40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f66d4009ee0 tx=0x7f66d4009450 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f66d008d2c0 con 0x7f66d807d490 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f66df637700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 0x7f66bc003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f66df637700 1 -- 192.168.123.104:0/1042582225 --> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f66bc006bf0 con 0x7f66bc001610 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.832+0000 7f66ddbd4700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 0x7f66bc003ac0 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.835 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.833+0000 7f66ddbd4700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 0x7f66bc003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.838 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.833+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== osd.5 v2:192.168.123.106:6816/240520744 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f66bc006bf0 con 0x7f66bc001610 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.857+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 --> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f66bc005cd0 con 0x7f66bc001610 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.857+0000 7f66ce7fc700 1 -- 192.168.123.104:0/1042582225 <== osd.5 v2:192.168.123.106:6816/240520744 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f66bc005cd0 con 0x7f66bc001610 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.857+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 msgr2=0x7f66bc003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.857+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 0x7f66bc003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.857+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 msgr2=0x7f66c4070f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 0x7f66c4070f40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f66d4009ee0 tx=0x7f66d4009450 comp rx=0 tx=0).stop 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 msgr2=0x7f66d807d900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f66d0003c30 tx=0x7f66d0003d10 comp rx=0 tx=0).stop 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 shutdown_connections 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f66c406ea90 0x7f66c4070f40 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f66d810c8b0 
0x7f66d807cf50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.106:6816/240520744,v1:192.168.123.106:6817/240520744] conn(0x7f66bc001610 0x7f66bc003ac0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 --2- 192.168.123.104:0/1042582225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f66d807d490 0x7f66d807d900 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 >> 192.168.123.104:0/1042582225 conn(0x7f66d806c6c0 msgr2=0x7f66d8070070 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 shutdown_connections 2026-03-10T06:19:36.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.858+0000 7f66c3fff700 1 -- 192.168.123.104:0/1042582225 wait complete. 
2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 -- 192.168.123.104:0/817167534 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c0810e9e0 msgr2=0x7f7c0810edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/817167534 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c0810e9e0 0x7f7c0810edb0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f7bf8009b00 tx=0x7f7bf8009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 -- 192.168.123.104:0/817167534 shutdown_connections 2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/817167534 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08071b60 0x7f7c08071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/817167534 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c0810e9e0 0x7f7c0810edb0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.911+0000 7f7c0d29e700 1 -- 192.168.123.104:0/817167534 >> 192.168.123.104:0/817167534 conn(0x7f7c0806c6c0 msgr2=0x7f7c0806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.918+0000 7f7c0d29e700 1 -- 192.168.123.104:0/817167534 shutdown_connections 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.924+0000 7f7c0d29e700 1 -- 192.168.123.104:0/817167534 wait 
complete. 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 Processor -- start 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 -- start start 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 0x7f7c08114680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c08114bc0 0x7f7c08115030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c08119ae0 con 0x7f7c08119680 2026-03-10T06:19:36.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c0d29e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c081b7b20 con 0x7f7c08114bc0 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c07fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 0x7f7c08114680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c07fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 0x7f7c08114680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I 
am v2:192.168.123.104:44820/0 (socket says 192.168.123.104:44820) 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.925+0000 7f7c07fff700 1 -- 192.168.123.104:0/2800705494 learned_addr learned my addr 192.168.123.104:0/2800705494 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c077fe700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c08114bc0 0x7f7c08115030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c07fff700 1 -- 192.168.123.104:0/2800705494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c08114bc0 msgr2=0x7f7c08115030 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c07fff700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c08114bc0 0x7f7c08115030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c07fff700 1 -- 192.168.123.104:0/2800705494 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7bf80097e0 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c07fff700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 0x7f7c08114680 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f7bf8009fd0 tx=0x7f7bf800fad0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7bf801c070 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7bf8005070 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7bf8017910 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c081b7cc0 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.926+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c081b8190 con 0x7f7c08119680 2026-03-10T06:19:36.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.928+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7bf8017a70 con 0x7f7c08119680 2026-03-10T06:19:36.931 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.929+0000 7f7c057fa700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 0x7f7bf006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.931 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.929+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7bf808d8a0 con 0x7f7c08119680 2026-03-10T06:19:36.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.931+0000 7f7c077fe700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 0x7f7bf006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.931+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 0x7f7bf4003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:36.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.931+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 --> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7bf4006bf0 con 0x7f7bf4001610 2026-03-10T06:19:36.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.931+0000 7f7c0ca9d700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 0x7f7bf4003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:36.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.931+0000 7f7c0ca9d700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 0x7f7bf4003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.936+0000 7f7c077fe700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 0x7f7bf006ec50 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f7c08072c80 tx=0x7f7bfc008040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:36.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.936+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== osd.1 v2:192.168.123.104:6810/3264858383 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f7bf4006bf0 con 0x7f7bf4001610 2026-03-10T06:19:36.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.960+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 --> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f7bf4005cd0 con 0x7f7bf4001610 2026-03-10T06:19:36.976 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.974+0000 7f7c057fa700 1 -- 192.168.123.104:0/2800705494 <== osd.1 v2:192.168.123.104:6810/3264858383 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f7bf4005cd0 con 0x7f7bf4001610 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 msgr2=0x7f7bf4003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 0x7f7bf4003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 msgr2=0x7f7bf006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 0x7f7bf006ec50 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f7c08072c80 tx=0x7f7bfc008040 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 msgr2=0x7f7c08114680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c08119680 0x7f7c08114680 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f7bf8009fd0 tx=0x7f7bf800fad0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 shutdown_connections 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f7bf006c7a0 0x7f7bf006ec50 secure :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f7c08072c80 tx=0x7f7bfc008040 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f7c08119680 0x7f7c08114680 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.104:6810/3264858383,v1:192.168.123.104:6811/3264858383] conn(0x7f7bf4001610 0x7f7bf4003ac0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 --2- 192.168.123.104:0/2800705494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c08114bc0 0x7f7c08115030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:36.977 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.975+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 >> 192.168.123.104:0/2800705494 conn(0x7f7c0806c6c0 msgr2=0x7f7c0806ce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:36.978 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.976+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 shutdown_connections 2026-03-10T06:19:36.978 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:36.976+0000 7f7c0d29e700 1 -- 192.168.123.104:0/2800705494 wait complete. 
2026-03-10T06:19:37.071 INFO:teuthology.orchestra.run.vm04.stdout:141733920772 2026-03-10T06:19:37.071 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.5 2026-03-10T06:19:37.075 INFO:teuthology.orchestra.run.vm04.stdout:124554051589 2026-03-10T06:19:37.075 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.4 2026-03-10T06:19:37.080 INFO:teuthology.orchestra.run.vm04.stdout:60129542155 2026-03-10T06:19:37.080 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.1 2026-03-10T06:19:37.286 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.347 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.559 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.605 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:37 vm04 ceph-mon[51058]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:37 vm06 ceph-mon[58974]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:37.680 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.696 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config 
/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.843 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:37.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:37.994+0000 7f2099220700 1 -- 192.168.123.104:0/2975887981 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2094071b60 msgr2=0x7f2094071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:37.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:37.994+0000 7f2099220700 1 --2- 192.168.123.104:0/2975887981 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2094071b60 0x7f2094071fd0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f2088009b00 tx=0x7f2088009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:38.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.002+0000 7f2099220700 1 -- 192.168.123.104:0/2975887981 shutdown_connections 2026-03-10T06:19:38.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.002+0000 7f2099220700 1 --2- 192.168.123.104:0/2975887981 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2094071b60 0x7f2094071fd0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.002+0000 7f2099220700 1 --2- 192.168.123.104:0/2975887981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f209410eab0 0x7f209410ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.002+0000 7f2099220700 1 -- 192.168.123.104:0/2975887981 >> 192.168.123.104:0/2975887981 conn(0x7f209406c6c0 msgr2=0x7f209406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.012 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.003+0000 7f2099220700 1 -- 192.168.123.104:0/2975887981 shutdown_connections 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 -- 192.168.123.104:0/2975887981 wait complete. 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 Processor -- start 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 -- start start 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2094071b60 0x7f2094119570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2094114ab0 con 0x7f209410eab0 2026-03-10T06:19:38.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.004+0000 7f2099220700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2094114c20 con 0x7f2094071b60 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.005+0000 7f2092d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2094071b60 0x7f2094119570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.013 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.005+0000 7f209259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.005+0000 7f209259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44846/0 (socket says 192.168.123.104:44846) 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.005+0000 7f209259c700 1 -- 192.168.123.104:0/3208210365 learned_addr learned my addr 192.168.123.104:0/3208210365 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f209259c700 1 -- 192.168.123.104:0/3208210365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2094071b60 msgr2=0x7f2094119570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f209259c700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2094071b60 0x7f2094119570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f209259c700 1 -- 192.168.123.104:0/3208210365 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20880097e0 con 0x7f209410eab0 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f209259c700 1 --2- 
192.168.123.104:0/3208210365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f20880048c0 tx=0x7f20880048f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f208801d070 con 0x7f209410eab0 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.006+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2088004b80 con 0x7f209410eab0 2026-03-10T06:19:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.007+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f208800f650 con 0x7f209410eab0 2026-03-10T06:19:38.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.007+0000 7f2099220700 1 -- 192.168.123.104:0/3208210365 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2094114e50 con 0x7f209410eab0 2026-03-10T06:19:38.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.007+0000 7f2099220700 1 -- 192.168.123.104:0/3208210365 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2094115310 con 0x7f209410eab0 2026-03-10T06:19:38.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.013+0000 7f2099220700 1 -- 192.168.123.104:0/3208210365 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f209404f2a0 con 0x7f209410eab0 2026-03-10T06:19:38.019 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.015+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f208800bc50 con 0x7f209410eab0 2026-03-10T06:19:38.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.017+0000 7f207bfff700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 0x7f207c06e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.019+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f208808d630 con 0x7f209410eab0 2026-03-10T06:19:38.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.019+0000 7f2092d9d700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 0x7f207c06e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.020+0000 7f2092d9d700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 0x7f207c06e9d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2084005950 tx=0x7f20840058e0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.020+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f208808da10 con 0x7f209410eab0 2026-03-10T06:19:38.133 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 -- 192.168.123.104:0/389700692 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 msgr2=0x7f3abc0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 --2- 192.168.123.104:0/389700692 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc0722b0 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f3ab400b600 tx=0x7f3ab400b910 comp rx=0 tx=0).stop 2026-03-10T06:19:38.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 -- 192.168.123.104:0/389700692 shutdown_connections 2026-03-10T06:19:38.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 --2- 192.168.123.104:0/389700692 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc0722b0 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 --2- 192.168.123.104:0/389700692 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 0x7f3abc10cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 -- 192.168.123.104:0/389700692 >> 192.168.123.104:0/389700692 conn(0x7f3abc06c6c0 msgr2=0x7f3abc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 -- 192.168.123.104:0/389700692 shutdown_connections 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.130+0000 7f3ac2383700 1 -- 192.168.123.104:0/389700692 wait complete. 
2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 Processor -- start 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 -- start start 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 0x7f3abc07e2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3abc07edb0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3ac2383700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3abc07ef20 con 0x7f3abc10c8b0 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:44854/0 (socket says 192.168.123.104:44854) 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 -- 192.168.123.104:0/247858744 learned_addr learned my addr 192.168.123.104:0/247858744 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abbfff700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 0x7f3abc07e2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 -- 192.168.123.104:0/247858744 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 msgr2=0x7f3abc07e2b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 0x7f3abc07e2b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 -- 192.168.123.104:0/247858744 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ab400b050 con 0x7f3abc07e7f0 2026-03-10T06:19:38.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.131+0000 7f3abb7fe700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f3ab4003c70 tx=0x7f3ab4003ca0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:38.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.132+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ab400e030 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.132+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3abc1b89c0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.132+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3abc07a6e0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.135+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3ab40048e0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.135+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ab401cdc0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.135+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3ab4012430 con 0x7f3abc07e7f0 2026-03-10T06:19:38.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.136+0000 7f3ab97fa700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 0x7f3aa406ece0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.139 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.136+0000 7f3abbfff700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 0x7f3aa406ece0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.139 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.136+0000 7f3abbfff700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 0x7f3aa406ece0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3aac00a8a0 tx=0x7f3aac008040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.139 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.136+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3ab402a030 con 0x7f3abc07e7f0 2026-03-10T06:19:38.139 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.136+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3aa8005320 con 0x7f3abc07e7f0 2026-03-10T06:19:38.143 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.139+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3ab4058820 con 0x7f3abc07e7f0 2026-03-10T06:19:38.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.327+0000 7f2099220700 1 -- 192.168.123.104:0/3208210365 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f2094115aa0 con 0x7f209410eab0 2026-03-10T06:19:38.341 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.332+0000 7f207bfff700 1 -- 192.168.123.104:0/3208210365 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f2088027020 con 0x7f209410eab0 2026-03-10T06:19:38.344 INFO:teuthology.orchestra.run.vm04.stdout:103079215110 2026-03-10T06:19:38.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.346+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 msgr2=0x7f207c06e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.346+0000 7f2079ffb700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 0x7f207c06e9d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2084005950 tx=0x7f20840058e0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.346+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 msgr2=0x7f2094114570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.346+0000 7f2079ffb700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f20880048c0 tx=0x7f20880048f0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 shutdown_connections 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 --2- 192.168.123.104:0/3208210365 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f207c06c520 0x7f207c06e9d0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2094071b60 0x7f2094119570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 --2- 192.168.123.104:0/3208210365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f209410eab0 0x7f2094114570 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 >> 192.168.123.104:0/3208210365 conn(0x7f209406c6c0 msgr2=0x7f2094070370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 shutdown_connections 2026-03-10T06:19:38.352 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.349+0000 7f2079ffb700 1 -- 192.168.123.104:0/3208210365 wait complete. 2026-03-10T06:19:38.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:38 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/3208210365' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T06:19:38.482 INFO:tasks.cephadm.ceph_manager.ceph:need seq 103079215111 got 103079215110 for osd.3 2026-03-10T06:19:38.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.536+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f3aa80059f0 con 0x7f3abc07e7f0 2026-03-10T06:19:38.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.537+0000 7f3ab97fa700 1 -- 192.168.123.104:0/247858744 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f3ab405be40 con 0x7f3abc07e7f0 2026-03-10T06:19:38.541 INFO:teuthology.orchestra.run.vm04.stdout:77309411337 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 msgr2=0x7f3aa406ece0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 0x7f3aa406ece0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3aac00a8a0 tx=0x7f3aac008040 comp rx=0 tx=0).stop 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 msgr2=0x7f3abc1b8420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 --2- 192.168.123.104:0/247858744 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f3ab4003c70 tx=0x7f3ab4003ca0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 shutdown_connections 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f3aa406c830 0x7f3aa406ece0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc10c8b0 0x7f3abc07e2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 --2- 192.168.123.104:0/247858744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc07e7f0 0x7f3abc1b8420 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.553+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 >> 192.168.123.104:0/247858744 conn(0x7f3abc06c6c0 msgr2=0x7f3abc070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.554+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 shutdown_connections 2026-03-10T06:19:38.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.554+0000 7f3ac2383700 1 -- 192.168.123.104:0/247858744 wait complete. 
2026-03-10T06:19:38.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.558+0000 7f2101720700 1 -- 192.168.123.104:0/3626232724 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc071b60 msgr2=0x7f20fc071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.558+0000 7f2101720700 1 --2- 192.168.123.104:0/3626232724 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc071b60 0x7f20fc071fd0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f20f400b3a0 tx=0x7f20f400b6b0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.567+0000 7f2101720700 1 -- 192.168.123.104:0/3626232724 shutdown_connections 2026-03-10T06:19:38.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.567+0000 7f2101720700 1 --2- 192.168.123.104:0/3626232724 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc071b60 0x7f20fc071fd0 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.567+0000 7f2101720700 1 --2- 192.168.123.104:0/3626232724 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc10e9e0 0x7f20fc10edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.567+0000 7f2101720700 1 -- 192.168.123.104:0/3626232724 >> 192.168.123.104:0/3626232724 conn(0x7f20fc06c6c0 msgr2=0x7f20fc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.569+0000 7f2101720700 1 -- 192.168.123.104:0/3626232724 shutdown_connections 2026-03-10T06:19:38.571 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.569+0000 7f2101720700 1 -- 192.168.123.104:0/3626232724 
wait complete. 2026-03-10T06:19:38.571 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.569+0000 7f2101720700 1 Processor -- start 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.569+0000 7f2101720700 1 -- start start 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f2101720700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc071b60 0x7f20fc1158c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f2101720700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f2101720700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20fc119a50 con 0x7f20fc10e9e0 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f2101720700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20fc116340 con 0x7f20fc071b60 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:44878/0 (socket says 192.168.123.104:44878) 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 -- 192.168.123.104:0/3661307442 learned_addr learned my addr 192.168.123.104:0/3661307442 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fbfff700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc071b60 0x7f20fc1158c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 -- 192.168.123.104:0/3661307442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc071b60 msgr2=0x7f20fc1158c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc071b60 0x7f20fc1158c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.570+0000 7f20fb7fe700 1 -- 192.168.123.104:0/3661307442 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20f400b050 con 0x7f20fc10e9e0 2026-03-10T06:19:38.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.571+0000 7f20fb7fe700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f20f4015040 tx=0x7f20f40077f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T06:19:38.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.571+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20f400e050 con 0x7f20fc10e9e0 2026-03-10T06:19:38.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.571+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f20f40047d0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.575 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.571+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20f401d9e0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.575 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.572+0000 7f2101720700 1 -- 192.168.123.104:0/3661307442 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f20fc1165c0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.575 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.573+0000 7f2101720700 1 -- 192.168.123.104:0/3661307442 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f20fc1b7900 con 0x7f20fc10e9e0 2026-03-10T06:19:38.575 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.573+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f20f4019040 con 0x7f20fc10e9e0 2026-03-10T06:19:38.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.574+0000 7f20f97fa700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 0x7f20e406e8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.579 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.574+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f20f4090170 con 0x7f20fc10e9e0 2026-03-10T06:19:38.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.574+0000 7f20fbfff700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 0x7f20e406e8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.574+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f20fc04f2a0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.574+0000 7f20fbfff700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 0x7f20e406e8e0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f20f0009e50 tx=0x7f20f0009450 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.578+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f20f4091050 con 0x7f20fc10e9e0 2026-03-10T06:19:38.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 -- 192.168.123.104:0/1174669340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4718071e40 msgr2=0x7f47180722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.611 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 --2- 192.168.123.104:0/1174669340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4718071e40 0x7f47180722b0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f470c009a60 tx=0x7f470c009d70 comp rx=0 tx=0).stop 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 -- 192.168.123.104:0/1174669340 shutdown_connections 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 --2- 192.168.123.104:0/1174669340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4718071e40 0x7f47180722b0 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 --2- 192.168.123.104:0/1174669340 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f471810c8b0 0x7f471810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 -- 192.168.123.104:0/1174669340 >> 192.168.123.104:0/1174669340 conn(0x7f471806c6c0 msgr2=0x7f471806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 -- 192.168.123.104:0/1174669340 shutdown_connections 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.609+0000 7f471ec07700 1 -- 192.168.123.104:0/1174669340 wait complete. 
2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 Processor -- start 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 -- start start 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47181328a0 0x7f4718132d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f47181332e0 con 0x7f471810c8b0 2026-03-10T06:19:38.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.610+0000 7f471ec07700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4718133450 con 0x7f47181328a0 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:44896/0 (socket says 192.168.123.104:44896) 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 -- 192.168.123.104:0/2476193754 learned_addr learned my addr 192.168.123.104:0/2476193754 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f4717fff700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47181328a0 0x7f4718132d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 -- 192.168.123.104:0/2476193754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47181328a0 msgr2=0x7f4718132d10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47181328a0 0x7f4718132d10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.611+0000 7f471c9a3700 1 -- 192.168.123.104:0/2476193754 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f470c009710 con 0x7f471810c8b0 2026-03-10T06:19:38.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.613+0000 7f471c9a3700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f4708009d00 tx=0x7f470800e3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:38.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.613+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f470800a4f0 con 0x7f471810c8b0 2026-03-10T06:19:38.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.613+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f47181336d0 con 0x7f471810c8b0 2026-03-10T06:19:38.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.613+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f471807f0c0 con 0x7f471810c8b0 2026-03-10T06:19:38.616 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.614+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4708010040 con 0x7f471810c8b0 2026-03-10T06:19:38.616 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.614+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47080136a0 con 0x7f471810c8b0 2026-03-10T06:19:38.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.615+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f47080138d0 con 0x7f471810c8b0 2026-03-10T06:19:38.618 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.616+0000 7f4715ffb700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 0x7f470006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.618 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.616+0000 7f4717fff700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 0x7f470006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.617+0000 7f4717fff700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 0x7f470006eb80 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f470c00b5c0 tx=0x7f470c011040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.618+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f470808c900 con 0x7f471810c8b0 2026-03-10T06:19:38.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.622+0000 7f46ff7fe700 1 -- 192.168.123.104:0/2476193754 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4704005320 con 0x7f471810c8b0 2026-03-10T06:19:38.632 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.627+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f470805ae50 con 0x7f471810c8b0 2026-03-10T06:19:38.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.643+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2021793563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef410e9e0 msgr2=0x7f5ef410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.647 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.643+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2021793563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef410e9e0 0x7f5ef410edb0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f5ee8009b00 tx=0x7f5ee8009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:38.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.648+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2021793563 shutdown_connections 2026-03-10T06:19:38.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.648+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2021793563 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef4071b60 0x7f5ef4071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.648+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2021793563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef410e9e0 0x7f5ef410edb0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.648+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2021793563 >> 192.168.123.104:0/2021793563 conn(0x7f5ef406c6c0 msgr2=0x7f5ef406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.649+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2021793563 shutdown_connections 2026-03-10T06:19:38.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.649+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2021793563 wait complete. 
2026-03-10T06:19:38.652 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.650+0000 7f5ef3fff700 1 Processor -- start 2026-03-10T06:19:38.652 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef3fff700 1 -- start start 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 0x7f5ef4115400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef3fff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ef41b1f30 con 0x7f5ef4071b60 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef3fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ef41b20a0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef27fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef27fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:55686/0 (socket says 192.168.123.104:55686) 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef27fc700 1 -- 192.168.123.104:0/2252339427 learned_addr learned my addr 192.168.123.104:0/2252339427 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.651+0000 7f5ef2ffd700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 0x7f5ef4115400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.653+0000 7f5ef27fc700 1 -- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 msgr2=0x7f5ef4115400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.653+0000 7f5ef27fc700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 0x7f5ef4115400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.653+0000 7f5ef27fc700 1 -- 192.168.123.104:0/2252339427 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5ee80097e0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.654+0000 7f5ef2ffd700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 0x7f5ef4115400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:19:38.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.655+0000 7f5ef27fc700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f5ee400eb10 tx=0x7f5ee400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.656+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee400cca0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.656+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ee400ce00 con 0x7f5ef410e9e0 2026-03-10T06:19:38.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.656+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee40189c0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.660 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.659+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ef41b2380 con 0x7f5ef410e9e0 2026-03-10T06:19:38.660 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.659+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ef41b28d0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.660+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5ee4018b20 con 
0x7f5ef410e9e0 2026-03-10T06:19:38.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.660+0000 7f5ef881e700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 0x7f5edc06ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.660+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5ee4014070 con 0x7f5ef410e9e0 2026-03-10T06:19:38.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.660+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5ee0005320 con 0x7f5ef410e9e0 2026-03-10T06:19:38.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.662+0000 7f5ef2ffd700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 0x7f5edc06ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.664 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.662+0000 7f5ef2ffd700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 0x7f5edc06ec00 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f5ee800b5c0 tx=0x7f5ee8005fd0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 -- 192.168.123.104:0/2250353913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8071e40 msgr2=0x7fd0f80722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2250353913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8071e40 0x7fd0f80722b0 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7fd0f0009230 tx=0x7fd0f0009260 comp rx=0 tx=0).stop 2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 -- 192.168.123.104:0/2250353913 shutdown_connections 2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2250353913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f8071e40 0x7fd0f80722b0 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2250353913 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f810c8f0 0x7fd0f810ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.667 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.665+0000 7fd0ff118700 1 -- 192.168.123.104:0/2250353913 >> 192.168.123.104:0/2250353913 conn(0x7fd0f806c6c0 msgr2=0x7fd0f806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.668 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.664+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5ee4056980 con 0x7f5ef410e9e0 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.666+0000 7fd0ff118700 1 -- 192.168.123.104:0/2250353913 shutdown_connections 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.666+0000 7fd0ff118700 1 -- 
192.168.123.104:0/2250353913 wait complete. 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.666+0000 7fd0ff118700 1 Processor -- start 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0ff118700 1 -- start start 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0ff118700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0ff118700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f807d400 0x7fd0f807d870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0ff118700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f8081a40 con 0x7fd0f810c8f0 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0ff118700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f8081b80 con 0x7fd0f807d400 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0fe116700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0fe116700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44924/0 (socket says 192.168.123.104:44924) 2026-03-10T06:19:38.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0fe116700 1 -- 192.168.123.104:0/2520201412 learned_addr learned my addr 192.168.123.104:0/2520201412 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:38.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.667+0000 7fd0fd915700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f807d400 0x7fd0f807d870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.668+0000 7fd0fe116700 1 -- 192.168.123.104:0/2520201412 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f807d400 msgr2=0x7fd0f807d870 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.668+0000 7fd0fe116700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f807d400 0x7fd0f807d870 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.668+0000 7fd0fe116700 1 -- 192.168.123.104:0/2520201412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0f0008ee0 con 0x7fd0f810c8f0 2026-03-10T06:19:38.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.668+0000 7fd0fe116700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4010fc0 tx=0x7fd0f400e3a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.669+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0f400bb90 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.669+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd0f8081e60 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.669+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd0f80823b0 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.673+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0f4011040 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.673+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0f4015710 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.673+0000 7fd0ed7fa700 1 -- 192.168.123.104:0/2520201412 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd0f804f2a0 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.674+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd0f4015970 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.674+0000 7fd0ef7fe700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 0x7fd0e406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.674+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd0f408f200 con 0x7fd0f810c8f0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.675+0000 7fd0fd915700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 0x7fd0e406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.675+0000 7fd0fd915700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 0x7fd0e406eb80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fd0f00062a0 tx=0x7fd0f00061f0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:38.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.678+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd0f4059d40 con 0x7fd0f810c8f0 2026-03-10T06:19:38.715 INFO:tasks.cephadm.ceph_manager.ceph:need seq 77309411337 got 77309411337 for osd.2 2026-03-10T06:19:38.715 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:38.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.807+0000 7fd0ed7fa700 1 -- 192.168.123.104:0/2520201412 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7fd0f804ea50 con 0x7fd0f810c8f0 2026-03-10T06:19:38.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.807+0000 7fd0ef7fe700 1 -- 192.168.123.104:0/2520201412 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fd0f405d360 con 0x7fd0f810c8f0 2026-03-10T06:19:38.809 INFO:teuthology.orchestra.run.vm04.stdout:60129542155 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 msgr2=0x7fd0e406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 0x7fd0e406eb80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fd0f00062a0 tx=0x7fd0f00061f0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 msgr2=0x7fd0f807cec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fd0f4010fc0 tx=0x7fd0f400e3a0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 shutdown_connections 
2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd0e406c6d0 0x7fd0e406eb80 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd0f810c8f0 0x7fd0f807cec0 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.818+0000 7fd0ff118700 1 --2- 192.168.123.104:0/2520201412 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0f807d400 0x7fd0f807d870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.819+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 >> 192.168.123.104:0/2520201412 conn(0x7fd0f806c6c0 msgr2=0x7fd0f8070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.819+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 shutdown_connections 2026-03-10T06:19:38.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.819+0000 7fd0ff118700 1 -- 192.168.123.104:0/2520201412 wait complete. 2026-03-10T06:19:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:38 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/3208210365' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T06:19:38.891 INFO:tasks.cephadm.ceph_manager.ceph:need seq 60129542155 got 60129542155 for osd.1 2026-03-10T06:19:38.891 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:38.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.911+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f20fc116eb0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.915+0000 7f20f97fa700 1 -- 192.168.123.104:0/3661307442 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f20f40566b0 con 0x7f20fc10e9e0 2026-03-10T06:19:38.917 INFO:teuthology.orchestra.run.vm04.stdout:124554051589 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.917+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 msgr2=0x7f20e406e8e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.917+0000 7f20e2ffd700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 0x7f20e406e8e0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f20f0009e50 tx=0x7f20f0009450 comp rx=0 tx=0).stop 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.917+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 msgr2=0x7f20fc115e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.920 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f20f4015040 tx=0x7f20f40077f0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 shutdown_connections 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f20e406c430 0x7f20e406e8e0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20fc071b60 0x7f20fc1158c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 --2- 192.168.123.104:0/3661307442 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20fc10e9e0 0x7f20fc115e00 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 >> 192.168.123.104:0/3661307442 conn(0x7f20fc06c6c0 msgr2=0x7f20fc06cfb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 shutdown_connections 2026-03-10T06:19:38.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.918+0000 7f20e2ffd700 1 -- 192.168.123.104:0/3661307442 
wait complete. 2026-03-10T06:19:38.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.924+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f5ee0005190 con 0x7f5ef410e9e0 2026-03-10T06:19:38.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.925+0000 7f5ef881e700 1 -- 192.168.123.104:0/2252339427 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f5ee4059fa0 con 0x7f5ef410e9e0 2026-03-10T06:19:38.935 INFO:teuthology.orchestra.run.vm04.stdout:141733920772 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 msgr2=0x7f5edc06ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 0x7f5edc06ec00 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f5ee800b5c0 tx=0x7f5ee8005fd0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 msgr2=0x7f5ef4115940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f5ee400eb10 tx=0x7f5ee400eed0 comp rx=0 tx=0).stop 
2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 shutdown_connections 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5edc06c750 0x7f5edc06ec00 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef4071b60 0x7f5ef4115400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 --2- 192.168.123.104:0/2252339427 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ef410e9e0 0x7f5ef4115940 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.937+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 >> 192.168.123.104:0/2252339427 conn(0x7f5ef406c6c0 msgr2=0x7f5ef406d030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.940 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.938+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 shutdown_connections 2026-03-10T06:19:38.940 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.938+0000 7f5ef3fff700 1 -- 192.168.123.104:0/2252339427 wait complete. 
2026-03-10T06:19:38.954 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.951+0000 7f46ff7fe700 1 -- 192.168.123.104:0/2476193754 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f4704005190 con 0x7f471810c8b0 2026-03-10T06:19:38.954 INFO:teuthology.orchestra.run.vm04.stdout:38654705677 2026-03-10T06:19:38.954 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.952+0000 7f4715ffb700 1 -- 192.168.123.104:0/2476193754 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f4708018030 con 0x7f471810c8b0 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 msgr2=0x7f470006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 0x7f470006eb80 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f470c00b5c0 tx=0x7f470c011040 comp rx=0 tx=0).stop 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 msgr2=0x7f47181378a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f4708009d00 tx=0x7f470800e3b0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.958 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 shutdown_connections 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f470006c6d0 0x7f470006eb80 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f471810c8b0 0x7f47181378a0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 --2- 192.168.123.104:0/2476193754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47181328a0 0x7f4718132d10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:38.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.956+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 >> 192.168.123.104:0/2476193754 conn(0x7f471806c6c0 msgr2=0x7f4718070030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:38.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.961+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 shutdown_connections 2026-03-10T06:19:38.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:38.962+0000 7f471ec07700 1 -- 192.168.123.104:0/2476193754 wait complete. 
2026-03-10T06:19:39.003 INFO:tasks.cephadm.ceph_manager.ceph:need seq 141733920772 got 141733920772 for osd.5 2026-03-10T06:19:39.003 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:39.037 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705677 for osd.0 2026-03-10T06:19:39.037 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:39.058 INFO:tasks.cephadm.ceph_manager.ceph:need seq 124554051589 got 124554051589 for osd.4 2026-03-10T06:19:39.058 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:39.483 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd last-stat-seq osd.3 2026-03-10T06:19:39.641 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/247858744' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2520201412' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3661307442' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/2252339427' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T06:19:39.670 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:39 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2476193754' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/247858744' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2520201412' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3661307442' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2252339427' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T06:19:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:39 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/2476193754' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 -- 192.168.123.104:0/2284215434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c1013a0 msgr2=0x7fca2c101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 --2- 192.168.123.104:0/2284215434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c1013a0 0x7fca2c101770 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7fca14009ab0 tx=0x7fca14009dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 -- 192.168.123.104:0/2284215434 shutdown_connections 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 --2- 192.168.123.104:0/2284215434 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c068490 0x7fca2c068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 --2- 192.168.123.104:0/2284215434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c1013a0 0x7fca2c101770 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.932+0000 7fca32587700 1 -- 192.168.123.104:0/2284215434 >> 192.168.123.104:0/2284215434 conn(0x7fca2c0754a0 msgr2=0x7fca2c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:39.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 -- 192.168.123.104:0/2284215434 shutdown_connections 2026-03-10T06:19:39.934 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 -- 192.168.123.104:0/2284215434 wait complete. 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 Processor -- start 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 -- start start 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 0x7fca2c198380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca2c198fa0 con 0x7fca2c068490 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.933+0000 7fca32587700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca2c19cd30 con 0x7fca2c1013a0 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55726/0 (socket says 192.168.123.104:55726) 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 -- 192.168.123.104:0/212057107 learned_addr learned my addr 192.168.123.104:0/212057107 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:39.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2bfff700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 0x7fca2c198380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 -- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 msgr2=0x7fca2c198380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 0x7fca2c198380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 -- 192.168.123.104:0/212057107 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca14009710 con 0x7fca2c1013a0 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2bfff700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 0x7fca2c198380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca2b7fe700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fca1c00ea30 tx=0x7fca1c00edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:39.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca1c00cc40 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca2c19d010 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca1c00cda0 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.934+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca1c010430 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.935+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca2c19d560 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.936+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 
89910+0+0 (secure 0 0 0) 0x7fca1c004830 con 0x7fca2c1013a0 2026-03-10T06:19:39.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.936+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca2c04ea50 con 0x7fca2c1013a0 2026-03-10T06:19:39.938 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.936+0000 7fca297fa700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 0x7fca1806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:39.938 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.936+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fca1c014070 con 0x7fca2c1013a0 2026-03-10T06:19:39.938 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.937+0000 7fca2bfff700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 0x7fca1806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:39.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.937+0000 7fca2bfff700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 0x7fca1806eb30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fca14005e90 tx=0x7fca14005d80 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:39.940 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:39.938+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+177933 (secure 0 0 0) 0x7fca1c056620 con 0x7fca2c1013a0 2026-03-10T06:19:40.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.047+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7fca2c066e40 con 0x7fca2c1013a0 2026-03-10T06:19:40.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.048+0000 7fca297fa700 1 -- 192.168.123.104:0/212057107 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fca1c059c40 con 0x7fca2c1013a0 2026-03-10T06:19:40.049 INFO:teuthology.orchestra.run.vm04.stdout:103079215111 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 msgr2=0x7fca1806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 0x7fca1806eb30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fca14005e90 tx=0x7fca14005d80 comp rx=0 tx=0).stop 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 msgr2=0x7fca2c1988c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fca1c00ea30 
tx=0x7fca1c00edf0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 shutdown_connections 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fca1806c680 0x7fca1806eb30 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca2c068490 0x7fca2c198380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 --2- 192.168.123.104:0/212057107 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fca2c1013a0 0x7fca2c1988c0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.050+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 >> 192.168.123.104:0/212057107 conn(0x7fca2c0754a0 msgr2=0x7fca2c0fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.051+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 shutdown_connections 2026-03-10T06:19:40.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.051+0000 7fca32587700 1 -- 192.168.123.104:0/212057107 wait complete. 
2026-03-10T06:19:40.118 INFO:tasks.cephadm.ceph_manager.ceph:need seq 103079215111 got 103079215111 for osd.3 2026-03-10T06:19:40.118 DEBUG:teuthology.parallel:result is None 2026-03-10T06:19:40.119 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-10T06:19:40.119 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph pg dump --format=json 2026-03-10T06:19:40.273 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 -- 192.168.123.104:0/2455550909 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c108670 msgr2=0x7fa67c108a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 --2- 192.168.123.104:0/2455550909 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c108670 0x7fa67c108a40 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7fa66c009b00 tx=0x7fa66c009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 -- 192.168.123.104:0/2455550909 shutdown_connections 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 --2- 192.168.123.104:0/2455550909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa67c102670 0x7fa67c102ae0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 --2- 192.168.123.104:0/2455550909 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c108670 0x7fa67c108a40 unknown :-1 s=CLOSED pgs=244 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.528+0000 7fa680d57700 1 -- 192.168.123.104:0/2455550909 >> 192.168.123.104:0/2455550909 conn(0x7fa67c0fe150 msgr2=0x7fa67c100560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.529+0000 7fa680d57700 1 -- 192.168.123.104:0/2455550909 shutdown_connections 2026-03-10T06:19:40.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.529+0000 7fa680d57700 1 -- 192.168.123.104:0/2455550909 wait complete. 2026-03-10T06:19:40.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.529+0000 7fa680d57700 1 Processor -- start 2026-03-10T06:19:40.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa680d57700 1 -- start start 2026-03-10T06:19:40.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa680d57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa67c102670 0x7fa67c198460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:40.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa680d57700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:40.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa680d57700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa67c198f20 con 0x7fa67c1989a0 2026-03-10T06:19:40.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa680d57700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa67c199090 con 0x7fa67c102670 2026-03-10T06:19:40.532 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44962/0 (socket says 192.168.123.104:44962) 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 -- 192.168.123.104:0/536959126 learned_addr learned my addr 192.168.123.104:0/536959126 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 -- 192.168.123.104:0/536959126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa67c102670 msgr2=0x7fa67c198460 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa67c102670 0x7fa67c198460 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.530+0000 7fa679d9b700 1 -- 192.168.123.104:0/536959126 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa66c0097e0 con 0x7fa67c1989a0 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa679d9b700 1 --2- 
192.168.123.104:0/536959126 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7fa66400c930 tx=0x7fa66400ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:40.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa664007ab0 con 0x7fa67c1989a0 2026-03-10T06:19:40.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa664007c10 con 0x7fa67c1989a0 2026-03-10T06:19:40.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6640186e0 con 0x7fa67c1989a0 2026-03-10T06:19:40.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa67c19d3b0 con 0x7fa67c1989a0 2026-03-10T06:19:40.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.531+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa67c19d8b0 con 0x7fa67c1989a0 2026-03-10T06:19:40.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.532+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa67c10abe0 con 0x7fa67c1989a0 2026-03-10T06:19:40.537 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.536+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa66401f030 con 0x7fa67c1989a0 2026-03-10T06:19:40.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.536+0000 7fa6737fe700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 0x7fa66806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:40.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.536+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa66408b940 con 0x7fa67c1989a0 2026-03-10T06:19:40.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.536+0000 7fa67a59c700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 0x7fa66806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:40.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.536+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa6640b7720 con 0x7fa67c1989a0 2026-03-10T06:19:40.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.537+0000 7fa67a59c700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 0x7fa66806eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fa66c009fd0 tx=0x7fa66c005dc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:40.647 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:40 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/212057107' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T06:19:40.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.643+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fa67c1997c0 con 0x7fa66806c680 2026-03-10T06:19:40.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.646+0000 7fa6737fe700 1 -- 192.168.123.104:0/536959126 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19093 (secure 0 0 0) 0x7fa67c1997c0 con 0x7fa66806c680 2026-03-10T06:19:40.648 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:40.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.648+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 msgr2=0x7fa66806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:40.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.648+0000 7fa680d57700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 0x7fa66806eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fa66c009fd0 tx=0x7fa66c005dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.648+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 msgr2=0x7fa67c19ce10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:40.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.648+0000 7fa680d57700 1 --2- 192.168.123.104:0/536959126 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7fa66400c930 tx=0x7fa66400ccf0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 shutdown_connections 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fa66806c680 0x7fa66806eb30 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa67c102670 0x7fa67c198460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 --2- 192.168.123.104:0/536959126 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa67c1989a0 0x7fa67c19ce10 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 >> 192.168.123.104:0/536959126 conn(0x7fa67c0fe150 msgr2=0x7fa67c0ffaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 shutdown_connections 2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:40.649+0000 7fa680d57700 1 -- 192.168.123.104:0/536959126 wait complete. 
2026-03-10T06:19:40.651 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-10T06:19:40.693 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":70,"stamp":"2026-03-10T06:19:40.088394+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163692,"kb_used_data":3132,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640852,"statfs":{"total":128823853056,"available":128656232448,"internally_reserved":0,"allocated":3207168,"data_stored":2053338,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0
,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.013355"},"pg_stats":[{"pgid":"1.0","version":"21'76","reported_seq":138,"reported_epoch":33,"state":"active+clean","last_fresh":"2026-03-10T06:19:29.068927+0000","last_change":"2026-03-10T06:19:17.805828+0000","last_active":"2026-03-10T06:19:29.068927+0000","last_peered":"2026-03-10T06:19:29.068927+0000","last_clean":"2026-03-10T06:19:29.068927+0000","last_became_active":"2026-03-10T06:19:17.805679+0000","last_became_peered":"2026-03-10T06:19:17.805679+0000","last_unstale":"2026-03-10T06:19:29.068927+0000","last_undegraded":"2026-03-10T06:19:29.068927+0000","last_fullsiz
ed":"2026-03-10T06:19:29.068927+0000","mapping_epoch":28,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":29,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T06:19:00.469886+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T06:19:00.469886+0000","last_clean_scrub_stamp":"2026-03-10T06:19:00.469886+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T16:39:57.732996+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_ob
ject_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":33,"seq":141733920773,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48199999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45200000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.439}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.24399999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.495}]}]},{"osd":4,"up_from":29,"seq":124554051590,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36399999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53000000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.441}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35199999999999998}]}]},{"osd":3,"up_from":24,"seq":103079215112,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48899999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39700000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46700000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.438}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]}]},{"osd":2,"up_from":18,"seq":77309411338,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44500000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42799999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58899999999999997}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.5}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54500000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40699999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72999999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60799999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.622}]}]},{"osd":1,"up_from":14,"seq":60129542156,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36799999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60199999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58799999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65800000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60999999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadat
a":0}]}} 2026-03-10T06:19:40.694 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph pg dump --format=json 2026-03-10T06:19:40.858 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:40 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/212057107' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.108+0000 7fb81a82d700 1 -- 192.168.123.104:0/871268454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb814068490 msgr2=0x7fb814068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.108+0000 7fb81a82d700 1 --2- 192.168.123.104:0/871268454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb814068490 0x7fb814068900 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc009b00 tx=0x7fb7fc009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 -- 192.168.123.104:0/871268454 shutdown_connections 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 --2- 192.168.123.104:0/871268454 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb814068490 0x7fb814068900 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 --2- 192.168.123.104:0/871268454 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8141013a0 0x7fb814101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 -- 192.168.123.104:0/871268454 >> 192.168.123.104:0/871268454 conn(0x7fb8140754a0 msgr2=0x7fb8140758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 -- 192.168.123.104:0/871268454 shutdown_connections 2026-03-10T06:19:41.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.109+0000 7fb81a82d700 1 -- 192.168.123.104:0/871268454 wait complete. 2026-03-10T06:19:41.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 Processor -- start 2026-03-10T06:19:41.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 -- start start 2026-03-10T06:19:41.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 0x7fb8141982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb814198f10 con 0x7fb8141013a0 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.110+0000 7fb81a82d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb81419cca0 con 0x7fb814068490 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 
7fb80bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb80bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:44988/0 (socket says 192.168.123.104:44988) 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb80bfff700 1 -- 192.168.123.104:0/268460968 learned_addr learned my addr 192.168.123.104:0/268460968 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb80bfff700 1 -- 192.168.123.104:0/268460968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 msgr2=0x7fb8141982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb813fff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 0x7fb8141982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb80bfff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 0x7fb8141982f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 
7fb80bfff700 1 -- 192.168.123.104:0/268460968 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7fc0097e0 con 0x7fb8141013a0 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb813fff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 0x7fb8141982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.111+0000 7fb80bfff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc00b5c0 tx=0x7fb7fc004950 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:41.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.112+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7fc01d070 con 0x7fb8141013a0 2026-03-10T06:19:41.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.112+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb7fc022470 con 0x7fb8141013a0 2026-03-10T06:19:41.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.112+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7fc00f700 con 0x7fb8141013a0 2026-03-10T06:19:41.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.112+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb81419cf20 con 
0x7fb8141013a0 2026-03-10T06:19:41.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.112+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb81419d410 con 0x7fb8141013a0 2026-03-10T06:19:41.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.113+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb81404ea50 con 0x7fb8141013a0 2026-03-10T06:19:41.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.114+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb7fc0225e0 con 0x7fb8141013a0 2026-03-10T06:19:41.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.117+0000 7fb811ffb700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 0x7fb7f406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.117+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb7fc08cc60 con 0x7fb8141013a0 2026-03-10T06:19:41.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.117+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb7fc057720 con 0x7fb8141013a0 2026-03-10T06:19:41.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.117+0000 7fb813fff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 0x7fb7f406eb80 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:41.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.118+0000 7fb813fff700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 0x7fb7f406eb80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb804005fd0 tx=0x7fb804005ee0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:41.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.225+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fb81419d6f0 con 0x7fb7f406c6d0 2026-03-10T06:19:41.228 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.226+0000 7fb811ffb700 1 -- 192.168.123.104:0/268460968 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19093 (secure 0 0 0) 0x7fb81419d6f0 con 0x7fb7f406c6d0 2026-03-10T06:19:41.228 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:41.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.228+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 msgr2=0x7fb7f406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.229+0000 7fb81a82d700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 0x7fb7f406eb80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb804005fd0 tx=0x7fb804005ee0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.229+0000 7fb81a82d700 1 -- 
192.168.123.104:0/268460968 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 msgr2=0x7fb814198830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.229+0000 7fb81a82d700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc00b5c0 tx=0x7fb7fc004950 comp rx=0 tx=0).stop 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.229+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 shutdown_connections 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.229+0000 7fb81a82d700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb7f406c6d0 0x7fb7f406eb80 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.230+0000 7fb81a82d700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814068490 0x7fb8141982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.230+0000 7fb81a82d700 1 --2- 192.168.123.104:0/268460968 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8141013a0 0x7fb814198830 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.230+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 >> 192.168.123.104:0/268460968 conn(0x7fb8140754a0 msgr2=0x7fb8140fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:41.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.230+0000 7fb81a82d700 
1 -- 192.168.123.104:0/268460968 shutdown_connections 2026-03-10T06:19:41.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.230+0000 7fb81a82d700 1 -- 192.168.123.104:0/268460968 wait complete. 2026-03-10T06:19:41.233 INFO:teuthology.orchestra.run.vm04.stderr:dumped all 2026-03-10T06:19:41.277 INFO:teuthology.orchestra.run.vm04.stdout:{"pg_ready":true,"pg_map":{"version":70,"stamp":"2026-03-10T06:19:40.088394+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163692,"kb_used_data":3132,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640852,"statfs":{"total":128823853056,"available":128656232448,"internally_reserved":0,"allocated":3207168,
"data_stored":2053338,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.013355"},"pg_stats":[{"pgid":"1.0","version":"21'76","reported_seq":138,"reported_epoch":33,"state":"active+clean","last_fresh":"2026-03-10T06:19:29.068927+0000","last_change":"2026-03-10T06:19:17.805828+0000","last_active":"2026-03-10T06:19:29.068927+0000","last_peered":"2026-03-10T06:19:29.068927+0000","last_clean":"2026-03-10T06:19:29.068927+0000","last_became_a
ctive":"2026-03-10T06:19:17.805679+0000","last_became_peered":"2026-03-10T06:19:17.805679+0000","last_unstale":"2026-03-10T06:19:29.068927+0000","last_undegraded":"2026-03-10T06:19:29.068927+0000","last_fullsized":"2026-03-10T06:19:29.068927+0000","mapping_epoch":28,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":29,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T06:19:00.469886+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T06:19:00.469886+0000","last_clean_scrub_stamp":"2026-03-10T06:19:00.469886+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T16:39:57.732996+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"
object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":33,"seq":141733920773,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"app
ly_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48199999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45200000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.439}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.24399999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.495}]}]},{"osd":4,"up_from":29,"seq":124554051590,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36399999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53000000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.441}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35199999999999998}]}]},{"osd":3,"up_from":24,"seq":103079215112,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48899999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39700000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46700000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.438}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]}]},{"osd":2,"up_from":18,"seq":77309411338,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":112583,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44500000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42799999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58899999999999997}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.5}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54500000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40699999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72999999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60799999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.622}]}]},{"osd":1,"up_from":14,"seq":60129542156,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":571863,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36799999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60199999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58799999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65800000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60999999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadat
a":0}]}} 2026-03-10T06:19:41.277 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-10T06:19:41.277 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-10T06:19:41.278 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-10T06:19:41.278 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph health --format=json 2026-03-10T06:19:41.446 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:41.491 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:41 vm04 ceph-mon[51058]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:41.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.691+0000 7fb3023fd700 1 -- 192.168.123.104:0/2470922402 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0fffa0 msgr2=0x7fb2fc10aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.691+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2470922402 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0fffa0 0x7fb2fc10aa10 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7fb2ec009b00 tx=0x7fb2ec009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:41.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.692+0000 7fb3023fd700 1 -- 192.168.123.104:0/2470922402 shutdown_connections 2026-03-10T06:19:41.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.692+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2470922402 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0fffa0 0x7fb2fc10aa10 secure :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7fb2ec009b00 tx=0x7fb2ec009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:41.694 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.692+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2470922402 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc0ffa60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.692+0000 7fb3023fd700 1 -- 192.168.123.104:0/2470922402 >> 192.168.123.104:0/2470922402 conn(0x7fb2fc074bf0 msgr2=0x7fb2fc074ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.693+0000 7fb3023fd700 1 -- 192.168.123.104:0/2470922402 shutdown_connections 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.693+0000 7fb3023fd700 1 -- 192.168.123.104:0/2470922402 wait complete. 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.693+0000 7fb3023fd700 1 Processor -- start 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.693+0000 7fb3023fd700 1 -- start start 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb3023fd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.695 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb3023fd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb2fc198bf0 0x7fb2fc19bfb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb3023fd700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2fc199200 con 0x7fb2fc0ff690 2026-03-10T06:19:41.696 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb3023fd700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2fc199340 con 0x7fb2fc198bf0 2026-03-10T06:19:41.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:41.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:45004/0 (socket says 192.168.123.104:45004) 2026-03-10T06:19:41.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 -- 192.168.123.104:0/2466392458 learned_addr learned my addr 192.168.123.104:0/2466392458 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:41.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 -- 192.168.123.104:0/2466392458 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb2fc198bf0 msgr2=0x7fb2fc19bfb0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb2fc198bf0 0x7fb2fc19bfb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 -- 192.168.123.104:0/2466392458 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2ec0097e0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2fbfff700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7fb2e4009e90 tx=0x7fb2e400bd50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e401b5b0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2e4017050 con 0x7fb2fc0ff690 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e401aca0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.694+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2fc19c5b0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.695+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2fc19cb00 con 0x7fb2fc0ff690 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.696+0000 7fb2f97fa700 1 -- 
192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb2e40243b0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.696+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2fc10a570 con 0x7fb2fc0ff690 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.696+0000 7fb2f97fa700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 0x7fb2e806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.697+0000 7fb2fb7fe700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 0x7fb2e806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.697+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb2e4092d10 con 0x7fb2fc0ff690 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.698+0000 7fb2fb7fe700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 0x7fb2e806ec00 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb2ec00b5c0 tx=0x7fb2ec005dc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:41.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.699+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb2e4055720 con 0x7fb2fc0ff690 2026-03-10T06:19:41.837 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.834+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7fb2fc19cde0 con 0x7fb2fc0ff690 2026-03-10T06:19:41.837 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.835+0000 7fb2f97fa700 1 -- 192.168.123.104:0/2466392458 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7fb2e405d900 con 0x7fb2fc0ff690 2026-03-10T06:19:41.838 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:41.838 INFO:teuthology.orchestra.run.vm04.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-10T06:19:41.840 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 msgr2=0x7fb2e806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 0x7fb2e806ec00 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb2ec00b5c0 tx=0x7fb2ec005dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 msgr2=0x7fb2fc1986b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 
1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7fb2e4009e90 tx=0x7fb2e400bd50 comp rx=0 tx=0).stop 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 shutdown_connections 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb2e806c750 0x7fb2e806ec00 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb2fc0ff690 0x7fb2fc1986b0 secure :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7fb2e4009e90 tx=0x7fb2e400bd50 comp rx=0 tx=0).stop 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 --2- 192.168.123.104:0/2466392458 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb2fc198bf0 0x7fb2fc19bfb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:41.841 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 >> 192.168.123.104:0/2466392458 conn(0x7fb2fc074bf0 msgr2=0x7fb2fc101030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:41.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 shutdown_connections 2026-03-10T06:19:41.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:41.839+0000 7fb3023fd700 1 -- 192.168.123.104:0/2466392458 wait complete. 
2026-03-10T06:19:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:41 vm06 ceph-mon[58974]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:41.912 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-10T06:19:41.912 INFO:tasks.cephadm:Setup complete, yielding 2026-03-10T06:19:41.912 INFO:teuthology.run_tasks:Running task print... 2026-03-10T06:19:41.914 INFO:teuthology.task.print:**** done end installing v18.2.0 cephadm ... 2026-03-10T06:19:41.914 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T06:19:41.916 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:41.916 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-10T06:19:42.083 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1766761158 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec101710 msgr2=0x7f4cec101b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1766761158 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec101710 0x7f4cec101b60 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f4ce0009b00 tx=0x7f4ce0009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1766761158 shutdown_connections 2026-03-10T06:19:42.367 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1766761158 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec101710 0x7f4cec101b60 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1766761158 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec100510 0x7f4cec100920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1766761158 >> 192.168.123.104:0/1766761158 conn(0x7f4cec0fba80 msgr2=0x7f4cec0fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1766761158 shutdown_connections 2026-03-10T06:19:42.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.365+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1766761158 wait complete. 
2026-03-10T06:19:42.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 Processor -- start 2026-03-10T06:19:42.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 -- start start 2026-03-10T06:19:42.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 0x7f4cec074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:42.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:42.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cec1999d0 con 0x7f4cec100510 2026-03-10T06:19:42.369 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cf2fc0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cec199b10 con 0x7f4cec101710 2026-03-10T06:19:42.369 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.366+0000 7f4cebfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:55786/0 (socket says 192.168.123.104:55786) 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 -- 192.168.123.104:0/1965166254 learned_addr learned my addr 192.168.123.104:0/1965166254 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cf0d5c700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 0x7f4cec074af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 -- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 msgr2=0x7f4cec074af0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 0x7f4cec074af0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 -- 192.168.123.104:0/1965166254 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ce00097e0 con 0x7f4cec101710 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cf0d5c700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 0x7f4cec074af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cebfff700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4ce0004990 tx=0x7f4ce00049c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce001d070 con 0x7f4cec101710 2026-03-10T06:19:42.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cec073680 con 0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.367+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cec073bd0 con 0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.368+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4ce000bc50 con 0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.368+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce000f700 con 0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.369+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4ce000f920 con 
0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.369+0000 7f4ce9ffb700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 0x7f4cd406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.369+0000 7f4cf0d5c700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 0x7f4cd406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.370+0000 7f4cf0d5c700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 0x7f4cd406eb30 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f4cec101570 tx=0x7f4cdc00a380 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.370+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4ce008cba0 con 0x7f4cec101710 2026-03-10T06:19:42.372 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.370+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cd8005320 con 0x7f4cec101710 2026-03-10T06:19:42.375 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.373+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4ce005b1a0 con 
0x7f4cec101710 2026-03-10T06:19:42.492 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.489+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f4cd8005190 con 0x7f4cec101710 2026-03-10T06:19:42.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.497+0000 7f4ce9ffb700 1 -- 192.168.123.104:0/1965166254 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7f4ce0027030 con 0x7f4cec101710 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.504+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 msgr2=0x7f4cd406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.504+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 0x7f4cd406eb30 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f4cec101570 tx=0x7f4cdc00a380 comp rx=0 tx=0).stop 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.504+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 msgr2=0x7f4cec073140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.504+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4ce0004990 tx=0x7f4ce00049c0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.506
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 shutdown_connections 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f4cd406c680 0x7f4cd406eb30 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cec100510 0x7f4cec074af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 --2- 192.168.123.104:0/1965166254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4cec101710 0x7f4cec073140 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:42.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 >> 192.168.123.104:0/1965166254 conn(0x7f4cec0fba80 msgr2=0x7f4cec104940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:42.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 shutdown_connections 2026-03-10T06:19:42.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:42.505+0000 7f4cf2fc0700 1 -- 192.168.123.104:0/1965166254 wait complete. 2026-03-10T06:19:42.594 INFO:teuthology.run_tasks:Running task print... 2026-03-10T06:19:42.596 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-10T06:19:42.596 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T06:19:42.598 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:42.598 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph orch status' 2026-03-10T06:19:42.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:42 vm04 ceph-mon[51058]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T06:19:42.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:42 vm04 ceph-mon[51058]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T06:19:42.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:42 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/2466392458' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T06:19:42.763 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:42.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:42 vm06 ceph-mon[58974]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T06:19:42.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:42 vm06 ceph-mon[58974]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T06:19:42.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:42 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/2466392458' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 -- 192.168.123.104:0/4248899103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798103960 msgr2=0x7fd798103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/4248899103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798103960 0x7fd798103db0 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7fd788009b50 tx=0x7fd788009e60 comp rx=0 tx=0).stop 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 -- 192.168.123.104:0/4248899103 shutdown_connections 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/4248899103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798103960 0x7fd798103db0 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/4248899103 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd798102760 0x7fd798102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:43.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.064+0000 7fd79ca5a700 1 -- 192.168.123.104:0/4248899103 >> 192.168.123.104:0/4248899103 conn(0x7fd7980fdcf0 msgr2=0x7fd798100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.065+0000 7fd79ca5a700 1 -- 192.168.123.104:0/4248899103 shutdown_connections 2026-03-10T06:19:43.067 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.065+0000 7fd79ca5a700 1 -- 192.168.123.104:0/4248899103 wait complete. 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.065+0000 7fd79ca5a700 1 Processor -- start 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.065+0000 7fd79ca5a700 1 -- start start 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79ca5a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79ca5a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd798103960 0x7fd798198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79ca5a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd798198b80 con 0x7fd798102760 2026-03-10T06:19:43.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79ca5a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd798198cc0 con 0x7fd798103960 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:45042/0 (socket says 192.168.123.104:45042) 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 -- 192.168.123.104:0/3022079468 learned_addr learned my addr 192.168.123.104:0/3022079468 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 -- 192.168.123.104:0/3022079468 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd798103960 msgr2=0x7fd798198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd798103960 0x7fd798198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 -- 192.168.123.104:0/3022079468 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7880097e0 con 0x7fd798102760 2026-03-10T06:19:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.066+0000 7fd79659c700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fd78000eac0 tx=0x7fd78000ee80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.067+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd780009960 con 
0x7fd798102760
2026-03-10T06:19:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.067+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd780004510 con 0x7fd798102760
2026-03-10T06:19:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.067+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd780010450 con 0x7fd798102760
2026-03-10T06:19:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.067+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd79819d770 con 0x7fd798102760
2026-03-10T06:19:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.067+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd79819dcc0 con 0x7fd798102760
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.068+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd780010650 con 0x7fd798102760
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.068+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd798066e40 con 0x7fd798102760
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.069+0000 7fd78f7fe700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 0x7fd78406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.069+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd780014070 con 0x7fd798102760
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.069+0000 7fd795d9b700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 0x7fd78406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:43.071 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.069+0000 7fd795d9b700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 0x7fd78406ec50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd78800b5c0 tx=0x7fd788005fb0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:43.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.071+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd78005e120 con 0x7fd798102760
2026-03-10T06:19:43.188 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.186+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd79819dfa0 con 0x7fd78406c7a0
2026-03-10T06:19:43.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.188+0000 7fd78f7fe700 1 -- 192.168.123.104:0/3022079468 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7fd79819dfa0 con 0x7fd78406c7a0
2026-03-10T06:19:43.189 INFO:teuthology.orchestra.run.vm04.stdout:Backend: cephadm
2026-03-10T06:19:43.190 INFO:teuthology.orchestra.run.vm04.stdout:Available: Yes
2026-03-10T06:19:43.190 INFO:teuthology.orchestra.run.vm04.stdout:Paused: No
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.190+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 msgr2=0x7fd78406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.190+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 0x7fd78406ec50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd78800b5c0 tx=0x7fd788005fb0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.190+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 msgr2=0x7fd798198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.190+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fd78000eac0 tx=0x7fd78000ee80 comp rx=0 tx=0).stop
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 shutdown_connections
2026-03-10T06:19:43.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fd78406c7a0 0x7fd78406ec50 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd798102760 0x7fd798198020 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 --2- 192.168.123.104:0/3022079468 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd798103960 0x7fd798198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 >> 192.168.123.104:0/3022079468 conn(0x7fd7980fdcf0 msgr2=0x7fd798106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:43.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 shutdown_connections
2026-03-10T06:19:43.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.191+0000 7fd79ca5a700 1 -- 192.168.123.104:0/3022079468 wait complete.
2026-03-10T06:19:43.242 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph orch ps'
2026-03-10T06:19:43.449 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:19:43.835 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T06:19:43.835 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: from='client.? ' entity='client.admin'
2026-03-10T06:19:43.835 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:43.836 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:43.836 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:43.836 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:43 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 -- 192.168.123.104:0/548769715 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 msgr2=0x7feef0105d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 --2- 192.168.123.104:0/548769715 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0105d50 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7feeec009b50 tx=0x7feeec009e60 comp rx=0 tx=0).stop
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 -- 192.168.123.104:0/548769715 shutdown_connections
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 --2- 192.168.123.104:0/548769715 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0105d50 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 --2- 192.168.123.104:0/548769715 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef0103430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.858+0000 7feef8762700 1 -- 192.168.123.104:0/548769715 >> 192.168.123.104:0/548769715 conn(0x7feef00fa9b0 msgr2=0x7feef00fce00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.859+0000 7feef8762700 1 -- 192.168.123.104:0/548769715 shutdown_connections
2026-03-10T06:19:43.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.859+0000 7feef8762700 1 -- 192.168.123.104:0/548769715 wait complete.
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.859+0000 7feef8762700 1 Processor -- start
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.859+0000 7feef8762700 1 -- start start
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef8762700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef01980c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef8762700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef8762700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feef0198c20 con 0x7feef0103970
2026-03-10T06:19:43.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef8762700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feef0198d60 con 0x7feef0101050
2026-03-10T06:19:43.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef5cfd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:43.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef5cfd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:39848/0 (socket says 192.168.123.104:39848)
2026-03-10T06:19:43.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef5cfd700 1 -- 192.168.123.104:0/1197965672 learned_addr learned my addr 192.168.123.104:0/1197965672 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:19:43.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.860+0000 7feef64fe700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef01980c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:43.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef64fe700 1 -- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 msgr2=0x7feef0198600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef64fe700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef64fe700 1 -- 192.168.123.104:0/1197965672 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feeec0097e0 con 0x7feef0101050
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef5cfd700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef64fe700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef01980c0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7feee000b6d0 tx=0x7feee000b9e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feee0011630 con 0x7feef0101050
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feee0011c70 con 0x7feef0101050
2026-03-10T06:19:43.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feef019d810 con 0x7feef0101050
2026-03-10T06:19:43.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feee000f2e0 con 0x7feef0101050
2026-03-10T06:19:43.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.861+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feef019dd60 con 0x7feef0101050
2026-03-10T06:19:43.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.862+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feef00fc590 con 0x7feef0101050
2026-03-10T06:19:43.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.864+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7feee0011790 con 0x7feef0101050
2026-03-10T06:19:43.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.864+0000 7feee77fe700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 0x7feedc06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:43.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.864+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7feee008b3b0 con 0x7feef0101050
2026-03-10T06:19:43.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.864+0000 7feef5cfd700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 0x7feedc06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:43.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.865+0000 7feef5cfd700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 0x7feedc06eb30 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7feeec005310 tx=0x7feeec00b580 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:43.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.866+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7feee0059980 con 0x7feef0101050
2026-03-10T06:19:43.976 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.975+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7feef0061190 con 0x7feedc06c680
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.981+0000 7feee77fe700 1 -- 192.168.123.104:0/1197965672 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2640 (secure 0 0 0) 0x7feef0061190 con 0x7feedc06c680
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (78s) 47s ago 2m 20.5M - 0.25.0 c8568f914cd2 3d98d9c97afc
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (2m) 47s ago 2m 7746k - 18.2.0 dc2bc1663786 019b79596e39
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (94s) 18s ago 94s 8014k - 18.2.0 dc2bc1663786 02ba67f7b99e
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (2m) 47s ago 2m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40
2026-03-10T06:19:43.983 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (93s) 18s ago 93s 7411k - 18.2.0 dc2bc1663786 a60199b09d41
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (77s) 47s ago 110s 79.6M - 9.4.7 954c08fa6188 888c399470c8
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:9283,8765,8443 running (2m) 47s ago 2m 485M - 18.2.0 dc2bc1663786 90f53ab8e17a
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (89s) 18s ago 89s 445M - 18.2.0 dc2bc1663786 db76c25cd8f7
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (2m) 47s ago 3m 43.9M 2048M 18.2.0 dc2bc1663786 089bb557f95b
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (87s) 18s ago 87s 41.4M 2048M 18.2.0 dc2bc1663786 826078cd5cc7
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (2m) 47s ago 2m 11.8M - 1.5.0 0da6a335fe13 f563a35e96ab
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (90s) 18s ago 90s 12.6M - 1.5.0 0da6a335fe13 3304cc389738
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (69s) 47s ago 69s 37.4M 4096M 18.2.0 dc2bc1663786 23249edb3d75
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (59s) 47s ago 58s 38.2M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (49s) 47s ago 48s 12.4M 4096M 18.2.0 dc2bc1663786 e5a533082c80
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (39s) 18s ago 39s 40.3M 4096M 18.2.0 dc2bc1663786 62400287eca0
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (29s) 18s ago 29s 38.3M 4096M 18.2.0 dc2bc1663786 dcd395dfe220
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (20s) 18s ago 20s 12.5M 4096M 18.2.0 dc2bc1663786 862da087fc06
2026-03-10T06:19:43.984 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (72s) 47s ago 105s 34.2M - 2.43.0 a07b618ecd1d 5d3ae08adc2a
2026-03-10T06:19:43.986 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 msgr2=0x7feedc06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.986 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 0x7feedc06eb30 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7feeec005310 tx=0x7feeec00b580 comp rx=0 tx=0).stop
2026-03-10T06:19:43.986 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 msgr2=0x7feef01980c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:43.986 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef01980c0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7feee000b6d0 tx=0x7feee000b9e0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 shutdown_connections
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7feedc06c680 0x7feedc06eb30 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feef0101050 0x7feef01980c0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 --2- 192.168.123.104:0/1197965672 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feef0103970 0x7feef0198600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 >> 192.168.123.104:0/1197965672 conn(0x7feef00fa9b0 msgr2=0x7feef01045d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 shutdown_connections
2026-03-10T06:19:43.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:43.985+0000 7feef8762700 1 -- 192.168.123.104:0/1197965672 wait complete.
2026-03-10T06:19:44.056 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph orch ls'
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: from='client.? ' entity='client.admin'
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:19:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:43 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:44.222 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.497+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/2228783367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78103960 msgr2=0x7fdf78103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.497+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/2228783367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78103960 0x7fdf78103db0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fdf74009b50 tx=0x7fdf74009e60 comp rx=0 tx=0).stop
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/2228783367 shutdown_connections
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/2228783367 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78103960 0x7fdf78103db0 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/2228783367 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdf78102760 0x7fdf78102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.500 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/2228783367 >> 192.168.123.104:0/2228783367 conn(0x7fdf780fdcf0 msgr2=0x7fdf78100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/2228783367 shutdown_connections
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.498+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/2228783367 wait complete.
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 Processor -- start
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 -- start start
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdf78103960 0x7fdf78194180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:44.501 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf78194750 con 0x7fdf78102760
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.499+0000 7fdf7f9a1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf78194890 con 0x7fdf78103960
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:39878/0 (socket says 192.168.123.104:39878)
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 -- 192.168.123.104:0/3369214508 learned_addr learned my addr 192.168.123.104:0/3369214508 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 -- 192.168.123.104:0/3369214508 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdf78103960 msgr2=0x7fdf78194180 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdf78103960 0x7fdf78194180 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 -- 192.168.123.104:0/3369214508 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdf740097e0 con 0x7fdf78102760
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.500+0000 7fdf7d73d700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7fdf68009fd0 tx=0x7fdf6800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.501+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdf6800cd10 con 0x7fdf78102760
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.501+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdf6800ce70 con 0x7fdf78102760
2026-03-10T06:19:44.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.501+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdf7806a890 con 0x7fdf78102760
2026-03-10T06:19:44.503 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.501+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdf7806ade0 con 0x7fdf78102760
2026-03-10T06:19:44.504 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.502+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdf6800cd10 con 0x7fdf78102760
2026-03-10T06:19:44.504 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.502+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdf78066e40 con 0x7fdf78102760
2026-03-10T06:19:44.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.505+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdf68003e60 con 0x7fdf78102760
2026-03-10T06:19:44.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.506+0000 7fdf6e7fc700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 0x7fdf6406eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:19:44.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.506+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fdf68014070 con 0x7fdf78102760
2026-03-10T06:19:44.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.506+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdf680566c0 con 0x7fdf78102760
2026-03-10T06:19:44.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.508+0000 7fdf7cf3c700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 0x7fdf6406eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:19:44.510 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.508+0000 7fdf7cf3c700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 0x7fdf6406eba0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fdf74005e50 tx=0x7fdf74005dc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:19:44.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.617+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fdf781082b0 con 0x7fdf6406c6f0
2026-03-10T06:19:44.622 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.620+0000 7fdf6e7fc700 1 -- 192.168.123.104:0/3369214508 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7fdf781082b0 con 0x7fdf6406c6f0
2026-03-10T06:19:44.622 INFO:teuthology.orchestra.run.vm04.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT
2026-03-10T06:19:44.622 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager ?:9093,9094 1/1 47s ago 2m count:1
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter 2/2 47s ago 2m *
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:crash 2/2 47s ago 2m *
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:grafana ?:3000 1/1 47s ago 2m count:1
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:mgr 2/2 47s ago 2m count:2
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:mon 2/2 47s ago 2m vm04:192.168.123.104=vm04;vm06:192.168.123.106=vm06;count:2
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter ?:9100 2/2 47s ago 2m *
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:osd 6 47s ago -
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stdout:prometheus ?:9095 1/1 47s ago 2m count:1
2026-03-10T06:19:44.623 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 msgr2=0x7fdf6406eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 0x7fdf6406eba0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fdf74005e50 tx=0x7fdf74005dc0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 msgr2=0x7fdf78193c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7fdf68009fd0 tx=0x7fdf6800c5b0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 shutdown_connections
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fdf6406c6f0 0x7fdf6406eba0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdf78102760 0x7fdf78193c40 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 --2- 192.168.123.104:0/3369214508 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdf78103960 0x7fdf78194180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 >> 192.168.123.104:0/3369214508 conn(0x7fdf780fdcf0 msgr2=0x7fdf78106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.622+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 shutdown_connections
2026-03-10T06:19:44.624 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:44.623+0000 7fdf7f9a1700 1 -- 192.168.123.104:0/3369214508 wait complete.
2026-03-10T06:19:44.695 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph orch host ls' 2026-03-10T06:19:44.857 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:44.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:44 vm06 ceph-mon[58974]: from='client.14480 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:44.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:44 vm06 ceph-mon[58974]: from='client.24273 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:44.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:44 vm06 ceph-mon[58974]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:44.887 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:44 vm04 ceph-mon[51058]: from='client.14480 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:44.887 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:44 vm04 ceph-mon[51058]: from='client.24273 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:44.887 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:44 vm04 ceph-mon[51058]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 -- 192.168.123.104:0/3617513142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f00691a0 msgr2=0x7f80f0105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.130 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 --2- 192.168.123.104:0/3617513142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f00691a0 0x7f80f0105520 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7f80e0009b50 tx=0x7f80e0009e60 comp rx=0 tx=0).stop 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 -- 192.168.123.104:0/3617513142 shutdown_connections 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 --2- 192.168.123.104:0/3617513142 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f0105a60 0x7f80f0107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 --2- 192.168.123.104:0/3617513142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f00691a0 0x7f80f0105520 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.128+0000 7f80f62ff700 1 -- 192.168.123.104:0/3617513142 >> 192.168.123.104:0/3617513142 conn(0x7f80f00faa70 msgr2=0x7f80f00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.129+0000 7f80f62ff700 1 -- 192.168.123.104:0/3617513142 shutdown_connections 2026-03-10T06:19:45.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.129+0000 7f80f62ff700 1 -- 192.168.123.104:0/3617513142 wait complete. 
2026-03-10T06:19:45.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.129+0000 7f80f62ff700 1 Processor -- start 2026-03-10T06:19:45.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.129+0000 7f80f62ff700 1 -- start start 2026-03-10T06:19:45.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80f62ff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f00691a0 0x7f80f0198030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80f62ff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80f62ff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80f0198b90 con 0x7f80f0105a60 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80f62ff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80f0198cd0 con 0x7f80f00691a0 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:39910/0 (socket says 192.168.123.104:39910) 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 -- 192.168.123.104:0/1346907070 learned_addr learned my addr 192.168.123.104:0/1346907070 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80effff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f00691a0 0x7f80f0198030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 -- 192.168.123.104:0/1346907070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f00691a0 msgr2=0x7f80f0198030 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f00691a0 0x7f80f0198030 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.130+0000 7f80ef7fe700 1 -- 192.168.123.104:0/1346907070 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80e00097e0 con 0x7f80f0105a60 2026-03-10T06:19:45.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80ef7fe700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7f80e400d900 tx=0x7f80e400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:45.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80e40049e0 con 0x7f80f0105a60 2026-03-10T06:19:45.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f80e4005500 con 0x7f80f0105a60 2026-03-10T06:19:45.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80f019d780 con 0x7f80f0105a60 2026-03-10T06:19:45.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80e4009de0 con 0x7f80f0105a60 2026-03-10T06:19:45.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.131+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80f019dcd0 con 0x7f80f0105a60 2026-03-10T06:19:45.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.133+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f80e4010460 con 0x7f80f0105a60 2026-03-10T06:19:45.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.133+0000 7f80ed7fa700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 0x7f80d806eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.135 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.133+0000 7f80effff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 0x7f80d806eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.134+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f80e408c420 con 0x7f80f0105a60 2026-03-10T06:19:45.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.134+0000 7f80effff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 0x7f80d806eba0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f80e0005340 tx=0x7f80e00058e0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:45.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.134+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80f01921c0 con 0x7f80f0105a60 2026-03-10T06:19:45.141 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.137+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f80e4017080 con 0x7f80f0105a60 2026-03-10T06:19:45.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.247+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f80f019d910 con 0x7f80d806c6f0 2026-03-10T06:19:45.251 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.249+0000 7f80ed7fa700 1 -- 192.168.123.104:0/1346907070 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f80f019d910 con 0x7f80d806c6f0 2026-03-10T06:19:45.251 INFO:teuthology.orchestra.run.vm04.stdout:HOST ADDR LABELS STATUS 2026-03-10T06:19:45.251 INFO:teuthology.orchestra.run.vm04.stdout:vm04 192.168.123.104 2026-03-10T06:19:45.251 INFO:teuthology.orchestra.run.vm04.stdout:vm06 192.168.123.106 2026-03-10T06:19:45.251 INFO:teuthology.orchestra.run.vm04.stdout:2 hosts in cluster 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.258+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 msgr2=0x7f80d806eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.259+0000 7f80f62ff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 0x7f80d806eba0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f80e0005340 tx=0x7f80e00058e0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.259+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 msgr2=0x7f80f0198570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.259+0000 7f80f62ff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7f80e400d900 tx=0x7f80e400dcc0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.259+0000 7f80f62ff700 1 -- 
192.168.123.104:0/1346907070 shutdown_connections 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.259+0000 7f80f62ff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f80d806c6f0 0x7f80d806eba0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.260+0000 7f80f62ff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80f00691a0 0x7f80f0198030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.260+0000 7f80f62ff700 1 --2- 192.168.123.104:0/1346907070 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f80f0105a60 0x7f80f0198570 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.260+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 >> 192.168.123.104:0/1346907070 conn(0x7f80f00faa70 msgr2=0x7f80f00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.260+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 shutdown_connections 2026-03-10T06:19:45.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.260+0000 7f80f62ff700 1 -- 192.168.123.104:0/1346907070 wait complete. 
2026-03-10T06:19:45.333 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph orch device ls' 2026-03-10T06:19:45.499 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.745+0000 7fb18020e700 1 -- 192.168.123.104:0/1648076904 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178103a00 msgr2=0x7fb178103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.745+0000 7fb18020e700 1 --2- 192.168.123.104:0/1648076904 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178103a00 0x7fb178103e70 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7fb174009b50 tx=0x7fb174009e60 comp rx=0 tx=0).stop 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.746+0000 7fb18020e700 1 -- 192.168.123.104:0/1648076904 shutdown_connections 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.746+0000 7fb18020e700 1 --2- 192.168.123.104:0/1648076904 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178103a00 0x7fb178103e70 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.746+0000 7fb18020e700 1 --2- 192.168.123.104:0/1648076904 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178102760 0x7fb178102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.748 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.747+0000 7fb18020e700 1 -- 
192.168.123.104:0/1648076904 >> 192.168.123.104:0/1648076904 conn(0x7fb1780fddb0 msgr2=0x7fb1781001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:45.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.747+0000 7fb18020e700 1 -- 192.168.123.104:0/1648076904 shutdown_connections 2026-03-10T06:19:45.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.747+0000 7fb18020e700 1 -- 192.168.123.104:0/1648076904 wait complete. 2026-03-10T06:19:45.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 Processor -- start 2026-03-10T06:19:45.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 -- start start 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 0x7fb178197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb178198b20 con 0x7fb178102760 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb18020e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb178198c60 con 0x7fb178103a00 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb17dfaa700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 0x7fb178197fc0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb17d7a9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb17d7a9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:46722/0 (socket says 192.168.123.104:46722) 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.748+0000 7fb17d7a9700 1 -- 192.168.123.104:0/3158171781 learned_addr learned my addr 192.168.123.104:0/3158171781 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb17d7a9700 1 -- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 msgr2=0x7fb178197fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb17d7a9700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 0x7fb178197fc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb17d7a9700 1 -- 192.168.123.104:0/3158171781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fb1740097e0 con 0x7fb178103a00 2026-03-10T06:19:45.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb17dfaa700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 0x7fb178197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T06:19:45.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb17d7a9700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb17400b5c0 tx=0x7fb174005740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:45.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb17401d070 con 0x7fb178103a00 2026-03-10T06:19:45.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb17819d6b0 con 0x7fb178103a00 2026-03-10T06:19:45.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb174004e80 con 0x7fb178103a00 2026-03-10T06:19:45.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb17400f780 con 0x7fb178103a00 2026-03-10T06:19:45.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.749+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 
--> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb17819dba0 con 0x7fb178103a00 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.750+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb178066e40 con 0x7fb178103a00 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.751+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb17400bc30 con 0x7fb178103a00 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.751+0000 7fb16affd700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 0x7fb16406eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.751+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb17408d0f0 con 0x7fb178103a00 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.752+0000 7fb17dfaa700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 0x7fb16406eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:45.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.752+0000 7fb17dfaa700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 0x7fb16406eba0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fb16c005950 tx=0x7fb16c00b410 comp rx=0 
tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:45.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.754+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb17405b640 con 0x7fb178103a00 2026-03-10T06:19:45.868 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.866+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fb178108120 con 0x7fb16406c6f0 2026-03-10T06:19:45.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.867+0000 7fb16affd700 1 -- 192.168.123.104:0/3158171781 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1188 (secure 0 0 0) 0x7fb178108120 con 0x7fb16406c6f0 2026-03-10T06:19:45.869 INFO:teuthology.orchestra.run.vm04.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-10T06:19:45.869 INFO:teuthology.orchestra.run.vm04.stdout:vm04 /dev/vdb hdd DWNBRSTVMM04001 20.0G Yes 47s ago 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm04 /dev/vdc hdd DWNBRSTVMM04002 20.0G No 47s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm04 /dev/vdd hdd DWNBRSTVMM04003 20.0G No 47s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm04 /dev/vde hdd DWNBRSTVMM04004 20.0G No 47s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm06 /dev/vdb hdd DWNBRSTVMM06001 20.0G Yes 19s ago 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm06 /dev/vdc hdd 
DWNBRSTVMM06002 20.0G No 19s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm06 /dev/vdd hdd DWNBRSTVMM06003 20.0G No 19s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.870 INFO:teuthology.orchestra.run.vm04.stdout:vm06 /dev/vde hdd DWNBRSTVMM06004 20.0G No 19s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T06:19:45.871 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 msgr2=0x7fb16406eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.871 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 0x7fb16406eba0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fb16c005950 tx=0x7fb16c00b410 comp rx=0 tx=0).stop 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 msgr2=0x7fb178198500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb17400b5c0 tx=0x7fb174005740 comp rx=0 tx=0).stop 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 shutdown_connections 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 
7fb18020e700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb16406c6f0 0x7fb16406eba0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb178102760 0x7fb178197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.870+0000 7fb18020e700 1 --2- 192.168.123.104:0/3158171781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb178103a00 0x7fb178198500 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:45.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.871+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 >> 192.168.123.104:0/3158171781 conn(0x7fb1780fddb0 msgr2=0x7fb178100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:45.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.871+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 shutdown_connections 2026-03-10T06:19:45.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:45.871+0000 7fb18020e700 1 -- 192.168.123.104:0/3158171781 wait complete. 2026-03-10T06:19:45.918 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T06:19:45.921 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:45.921 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-10T06:19:46.081 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.340+0000 7f60062ca700 1 -- 192.168.123.104:0/2311045169 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60001017d0 msgr2=0x7f6000101c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.340+0000 7f60062ca700 1 --2- 192.168.123.104:0/2311045169 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60001017d0 0x7f6000101c20 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7f5fe8009b00 tx=0x7f5fe8009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 -- 192.168.123.104:0/2311045169 shutdown_connections 2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 --2- 192.168.123.104:0/2311045169 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60001017d0 0x7f6000101c20 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 --2- 192.168.123.104:0/2311045169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60000fe7f0 0x7f60000fec00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:46.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 -- 192.168.123.104:0/2311045169 >> 192.168.123.104:0/2311045169 conn(0x7f60000fa140 msgr2=0x7f60000fc590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:46.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 -- 192.168.123.104:0/2311045169 shutdown_connections 2026-03-10T06:19:46.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 -- 192.168.123.104:0/2311045169 wait complete. 2026-03-10T06:19:46.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.341+0000 7f60062ca700 1 Processor -- start 2026-03-10T06:19:46.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f60062ca700 1 -- start start 2026-03-10T06:19:46.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f60062ca700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f60062ca700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60001017d0 0x7f60000ff780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f60062ca700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60000ffcc0 con 0x7f60000fe7f0 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f60062ca700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60000ffe00 con 0x7f60001017d0 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:39938/0 (socket says 192.168.123.104:39938) 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 -- 192.168.123.104:0/564152656 learned_addr learned my addr 192.168.123.104:0/564152656 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ff7fff700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60001017d0 0x7f60000ff780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 -- 192.168.123.104:0/564152656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60001017d0 msgr2=0x7f60000ff780 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60001017d0 0x7f60000ff780 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 -- 
192.168.123.104:0/564152656 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fe80097e0 con 0x7f60000fe7f0 2026-03-10T06:19:46.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.342+0000 7f5ffffff700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f5ff000da40 tx=0x7f5ff000de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:46.345 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.343+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ff00041d0 con 0x7f60000fe7f0 2026-03-10T06:19:46.345 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.343+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ff0009c70 con 0x7f60000fe7f0 2026-03-10T06:19:46.345 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.343+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ff0003e40 con 0x7f60000fe7f0 2026-03-10T06:19:46.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.343+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60001000e0 con 0x7f60000fe7f0 2026-03-10T06:19:46.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.343+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60001005b0 con 0x7f60000fe7f0 2026-03-10T06:19:46.349 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.345+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5ff0004330 con 0x7f60000fe7f0 2026-03-10T06:19:46.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.345+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6000066e40 con 0x7f60000fe7f0 2026-03-10T06:19:46.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.345+0000 7f5ffdffb700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 0x7f5fec06e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:46.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.345+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5ff008a980 con 0x7f60000fe7f0 2026-03-10T06:19:46.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.347+0000 7f5ff7fff700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 0x7f5fec06e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:46.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.348+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5ff0058f50 con 0x7f60000fe7f0 2026-03-10T06:19:46.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.348+0000 7f5ff7fff700 1 --2- 192.168.123.104:0/564152656 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 0x7f5fec06e980 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f5fe800b5c0 tx=0x7f5fe8005300 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:46 vm04 ceph-mon[51058]: from='client.14488 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:46 vm04 ceph-mon[51058]: from='client.14492 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:46.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:46.471+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f600019df10 con 0x7f5fec06c4d0 2026-03-10T06:19:46.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:46 vm06 ceph-mon[58974]: from='client.14488 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:46.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:46 vm06 ceph-mon[58974]: from='client.14492 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:47.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:47 vm04 ceph-mon[51058]: from='client.24283 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:47.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:47 vm04 ceph-mon[51058]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:47.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:47 vm04 ceph-mon[51058]: 
from='client.14500 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:47.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:47 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T06:19:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:47 vm06 ceph-mon[58974]: from='client.24283 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:47 vm06 ceph-mon[58974]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:47 vm06 ceph-mon[58974]: from='client.14500 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:19:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:47 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T06:19:48.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.445+0000 7f5ffdffb700 1 -- 192.168.123.104:0/564152656 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f600019df10 con 0x7f5fec06c4d0 2026-03-10T06:19:48.449 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.448+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 msgr2=0x7f5fec06e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.448+0000 7f60062ca700 1 --2- 
192.168.123.104:0/564152656 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 0x7f5fec06e980 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f5fe800b5c0 tx=0x7f5fe8005300 comp rx=0 tx=0).stop 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.448+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 msgr2=0x7f6000101130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.448+0000 7f60062ca700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f5ff000da40 tx=0x7f5ff000de00 comp rx=0 tx=0).stop 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 shutdown_connections 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5fec06c4d0 0x7f5fec06e980 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:48.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60000fe7f0 0x7f6000101130 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:48.451 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 --2- 192.168.123.104:0/564152656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f60001017d0 0x7f60000ff780 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:48.451 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 >> 192.168.123.104:0/564152656 conn(0x7f60000fa140 msgr2=0x7f6000104a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:48.451 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 shutdown_connections 2026-03-10T06:19:48.451 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:48.449+0000 7f60062ca700 1 -- 192.168.123.104:0/564152656 wait complete. 2026-03-10T06:19:48.516 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs dump' 2026-03-10T06:19:48.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T06:19:48.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:48 vm04 ceph-mon[51058]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T06:19:48.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:48 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T06:19:48.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:48 vm04 ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[51054]: 2026-03-10T06:19:48.406+0000 7f1a4eb1c700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:19:48.698 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:48 vm06 ceph-mon[58974]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T06:19:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:48 vm06 ceph-mon[58974]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T06:19:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:48 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: pgmap v75: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: 
from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: fsmap cephfs:0 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: Saving service mds.cephfs spec with placement count:4 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:49.396 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth 
get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:49.397 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:49.397 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: Deploying daemon mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:19:49.397 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.397 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:49 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:19:49.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.533+0000 7f1aff59e700 1 -- 192.168.123.104:0/3330505304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a5420 msgr2=0x7f1af80a5890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:49.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.533+0000 7f1aff59e700 1 --2- 192.168.123.104:0/3330505304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a5420 0x7f1af80a5890 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f1ae8009ab0 tx=0x7f1ae8009dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 -- 192.168.123.104:0/3330505304 shutdown_connections 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 --2- 192.168.123.104:0/3330505304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a5420 0x7f1af80a5890 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 --2- 192.168.123.104:0/3330505304 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a42e0 0x7f1af80a46f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 -- 192.168.123.104:0/3330505304 >> 192.168.123.104:0/3330505304 conn(0x7f1af809f7b0 msgr2=0x7f1af80a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 -- 192.168.123.104:0/3330505304 shutdown_connections 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.534+0000 7f1aff59e700 1 -- 192.168.123.104:0/3330505304 wait complete. 2026-03-10T06:19:49.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.535+0000 7f1aff59e700 1 Processor -- start 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.535+0000 7f1aff59e700 1 -- start start 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.535+0000 7f1aff59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 0x7f1af8142580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.535+0000 7f1afe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 0x7f1af8142580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.535+0000 7f1afe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f1af80a42e0 0x7f1af8142580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:39958/0 (socket says 192.168.123.104:39958) 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1aff59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 0x7f1af8142ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1aff59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1af81430e0 con 0x7f1af80a42e0 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1aff59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1af8143220 con 0x7f1af80a5420 2026-03-10T06:19:49.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1afe59c700 1 -- 192.168.123.104:0/25662079 learned_addr learned my addr 192.168.123.104:0/25662079 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1afdd9b700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 0x7f1af8142ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1afe59c700 1 -- 192.168.123.104:0/25662079 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 msgr2=0x7f1af8142ac0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 
7f1afe59c700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 0x7f1af8142ac0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1afe59c700 1 -- 192.168.123.104:0/25662079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ae8009710 con 0x7f1af80a42e0 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.536+0000 7f1afdd9b700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 0x7f1af8142ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.537+0000 7f1afe59c700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 0x7f1af8142580 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7f1af000ea30 tx=0x7f1af000edf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:49.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.537+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1af000cc40 con 0x7f1af80a42e0 2026-03-10T06:19:49.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.537+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1af000cda0 con 0x7f1af80a42e0 2026-03-10T06:19:49.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.537+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 
327+0+0 (secure 0 0 0) 0x7f1af0010430 con 0x7f1af80a42e0 2026-03-10T06:19:49.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.537+0000 7f1aff59e700 1 -- 192.168.123.104:0/25662079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1af8147cd0 con 0x7f1af80a42e0 2026-03-10T06:19:49.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.538+0000 7f1aff59e700 1 -- 192.168.123.104:0/25662079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1af81481a0 con 0x7f1af80a42e0 2026-03-10T06:19:49.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.539+0000 7f1aff59e700 1 -- 192.168.123.104:0/25662079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1af8004f40 con 0x7f1af80a42e0 2026-03-10T06:19:49.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.543+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1af0004750 con 0x7f1af80a42e0 2026-03-10T06:19:49.549 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.546+0000 7f1af77fe700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 0x7f1aec06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:49.549 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.546+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f1af0014070 con 0x7f1af80a42e0 2026-03-10T06:19:49.552 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.550+0000 7f1afdd9b700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 0x7f1aec06ec50 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:49.552 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.550+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1af005a050 con 0x7f1af80a42e0 2026-03-10T06:19:49.552 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.551+0000 7f1afdd9b700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 0x7f1aec06ec50 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f1ae800b5c0 tx=0x7f1ae801a040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:49.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.723+0000 7f1aff59e700 1 -- 192.168.123.104:0/25662079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1af8148b70 con 0x7f1af80a42e0 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:e2 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:19:49.727 
INFO:teuthology.orchestra.run.vm04.stdout:epoch 2 2026-03-10T06:19:49.727 INFO:teuthology.orchestra.run.vm04.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:48.407997+0000 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:in 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:up {} 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:19:49.728 
INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:inline_data disabled 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 0 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:49.728 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.725+0000 7f1af77fe700 1 -- 192.168.123.104:0/25662079 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f1af0016330 con 0x7f1af80a42e0 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 msgr2=0x7f1aec06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 0x7f1aec06ec50 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f1ae800b5c0 tx=0x7f1ae801a040 comp rx=0 tx=0).stop 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 msgr2=0x7f1af8142580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 
0x7f1af8142580 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7f1af000ea30 tx=0x7f1af000edf0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 shutdown_connections 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f1aec06c7a0 0x7f1aec06ec50 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1af80a42e0 0x7f1af8142580 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.729+0000 7f1af57fa700 1 --2- 192.168.123.104:0/25662079 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1af80a5420 0x7f1af8142ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:49.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.730+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 >> 192.168.123.104:0/25662079 conn(0x7f1af809f7b0 msgr2=0x7f1af80a8650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:49.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.730+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 shutdown_connections 2026-03-10T06:19:49.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:49.730+0000 7f1af57fa700 1 -- 192.168.123.104:0/25662079 wait complete. 
2026-03-10T06:19:49.736 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 2 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: pgmap v75: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: fsmap cephfs:0 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: Saving service mds.cephfs 
spec with placement count:4 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:49.868 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: Deploying daemon mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:49.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:49 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:19:49.933 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T06:19:49.936 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:49.936 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs set cephfs max_mds 1' 2026-03-10T06:19:50.171 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:50.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/3796356140 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 msgr2=0x7f2ce4071dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:50.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/3796356140 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4071dd0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f2cdc00b600 tx=0x7f2cdc00b910 comp rx=0 tx=0).stop 2026-03-10T06:19:50.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/3796356140 shutdown_connections 2026-03-10T06:19:50.479 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/3796356140 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4071dd0 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:50.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/3796356140 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 0x7f2ce41084f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:50.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.476+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/3796356140 >> 192.168.123.104:0/3796356140 conn(0x7f2ce406d3e0 msgr2=0x7f2ce406f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:50.481 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.477+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/3796356140 shutdown_connections 2026-03-10T06:19:50.482 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.477+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/3796356140 wait complete. 
2026-03-10T06:19:50.482 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 Processor -- start 2026-03-10T06:19:50.482 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 -- start start 2026-03-10T06:19:50.482 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:50.483 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 0x7f2ce4132ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:50.483 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ce41330e0 con 0x7f2ce4071960 2026-03-10T06:19:50.483 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce8b2e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ce4133220 con 0x7f2ce41080e0 2026-03-10T06:19:50.483 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:50.483 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:39988/0 (socket says 192.168.123.104:39988) 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.478+0000 7f2ce259c700 1 -- 192.168.123.104:0/139536413 learned_addr learned my addr 192.168.123.104:0/139536413 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce1d9b700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 0x7f2ce4132ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce259c700 1 -- 192.168.123.104:0/139536413 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 msgr2=0x7f2ce4132ac0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce259c700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 0x7f2ce4132ac0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce259c700 1 -- 192.168.123.104:0/139536413 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2cdc00b050 con 0x7f2ce4071960 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce259c700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f2cd400ba70 tx=0x7f2cd400bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2cd400c700 con 0x7f2ce4071960 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2cd400cd40 con 0x7f2ce4071960 2026-03-10T06:19:50.484 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2cd4012570 con 0x7f2ce4071960 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ce407e740 con 0x7f2ce4071960 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.479+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ce407ec90 con 0x7f2ce4071960 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.481+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f2cd40126d0 con 0x7f2ce4071960 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.481+0000 7f2cd37fe700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 0x7f2ccc06ec10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:50.485 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.481+0000 7f2ce1d9b700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 0x7f2ccc06ec10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.481+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f2cd408b860 con 0x7f2ce4071960 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.481+0000 7f2ce1d9b700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 0x7f2ccc06ec10 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f2cdc007ad0 tx=0x7f2cdc009f90 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:50.485 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.484+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2ce4066e40 con 0x7f2ce4071960 2026-03-10T06:19:50.488 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.487+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2cd404e720 con 0x7f2ce4071960 2026-03-10T06:19:50.625 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/25662079' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: daemon mds.cephfs.vm04.hdxbzv assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 
filesystem is online with fewer MDS than max_mds) 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: Cluster is now healthy 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:boot 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:creating} 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:19:50.626 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:50 vm04 ceph-mon[51058]: daemon mds.cephfs.vm04.hdxbzv is now active in filesystem cephfs as rank 0 2026-03-10T06:19:50.627 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:50.623+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7f2ce407f3f0 con 0x7f2ce4071960 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/25662079' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: daemon mds.cephfs.vm04.hdxbzv assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 
filesystem is online with fewer MDS than max_mds) 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: Cluster is now healthy 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:boot 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:creating} 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:19:50.756 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:50 vm06 ceph-mon[58974]: daemon mds.cephfs.vm04.hdxbzv is now active in filesystem cephfs as rank 0 2026-03-10T06:19:51.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.343+0000 7f2cd37fe700 1 -- 192.168.123.104:0/139536413 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v4) v1 ==== 105+0+0 (secure 0 0 0) 0x7f2cd4019020 con 0x7f2ce4071960 2026-03-10T06:19:51.349 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 msgr2=0x7f2ccc06ec10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 0x7f2ccc06ec10 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f2cdc007ad0 tx=0x7f2cdc009f90 comp rx=0 tx=0).stop 2026-03-10T06:19:51.350 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 msgr2=0x7f2ce4132580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f2cd400ba70 tx=0x7f2cd400bd80 comp rx=0 tx=0).stop 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 shutdown_connections 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f2ccc06c760 0x7f2ccc06ec10 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ce4071960 0x7f2ce4132580 unknown :-1 s=CLOSED pgs=266 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 --2- 192.168.123.104:0/139536413 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ce41080e0 0x7f2ce4132ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.346+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 >> 192.168.123.104:0/139536413 conn(0x7f2ce406d3e0 msgr2=0x7f2ce4074f70 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.347+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 shutdown_connections 2026-03-10T06:19:51.350 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:51.347+0000 7f2ce8b2e700 1 -- 192.168.123.104:0/139536413 wait complete. 2026-03-10T06:19:51.424 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T06:19:51.426 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:51.426 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs set cephfs allow_standby_replay true' 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: pgmap v79: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: Deploying daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/139536413' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:active 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/139536413' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: mds.? [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] up:boot 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:19:51.615 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:51 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby 2026-03-10T06:19:51.688 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: pgmap v79: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: Deploying daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/139536413' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:active 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/139536413' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] up:boot 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:19:51.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:51 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.297+0000 7f468ffff700 1 -- 192.168.123.104:0/4269482667 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 msgr2=0x7f46900ff5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.297+0000 7f468ffff700 1 --2- 192.168.123.104:0/4269482667 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f46900ff5b0 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f4680009b00 tx=0x7f4680009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 -- 192.168.123.104:0/4269482667 shutdown_connections 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 --2- 192.168.123.104:0/4269482667 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f46901008b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 --2- 192.168.123.104:0/4269482667 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f46900ff5b0 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:52.300 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 -- 192.168.123.104:0/4269482667 >> 192.168.123.104:0/4269482667 conn(0x7f46900fa7b0 msgr2=0x7f46900fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:52.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 -- 192.168.123.104:0/4269482667 shutdown_connections 2026-03-10T06:19:52.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.299+0000 7f468ffff700 1 -- 192.168.123.104:0/4269482667 wait complete. 2026-03-10T06:19:52.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 Processor -- start 2026-03-10T06:19:52.301 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 -- start start 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f4690195b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f4690196080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f46901966a0 con 0x7f46900ff1a0 
2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468ffff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f46901967e0 con 0x7f4690100440 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f4687fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f4690196080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.300+0000 7f468effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f4690195b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f4687fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f4690196080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:46814/0 (socket says 192.168.123.104:46814) 2026-03-10T06:19:52.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f4687fff700 1 -- 192.168.123.104:0/1869523952 learned_addr learned my addr 192.168.123.104:0/1869523952 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f468effd700 1 -- 192.168.123.104:0/1869523952 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 msgr2=0x7f4690196080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f468effd700 1 --2- 
192.168.123.104:0/1869523952 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f4690196080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f468effd700 1 -- 192.168.123.104:0/1869523952 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f46800097e0 con 0x7f46900ff1a0 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.301+0000 7f468effd700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f4690195b40 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f4680000c00 tx=0x7f4680004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.302+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f468001d070 con 0x7f46900ff1a0 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.302+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4690105740 con 0x7f46900ff1a0 2026-03-10T06:19:52.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.302+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4690105c30 con 0x7f46900ff1a0 2026-03-10T06:19:52.304 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.303+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4680022470 con 0x7f46900ff1a0 2026-03-10T06:19:52.305 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.303+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f468000f650 con 0x7f46900ff1a0 2026-03-10T06:19:52.305 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.304+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4670005320 con 0x7f46900ff1a0 2026-03-10T06:19:52.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.305+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f468000f870 con 0x7f46900ff1a0 2026-03-10T06:19:52.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.305+0000 7f468cff9700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f467c06c4d0 0x7f467c06e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:52.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.305+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f468008de80 con 0x7f46900ff1a0 2026-03-10T06:19:52.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.307+0000 7f4687fff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f467c06c4d0 0x7f467c06e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:52.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.307+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f468005c1c0 con 0x7f46900ff1a0 2026-03-10T06:19:52.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.307+0000 7f4687fff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f467c06c4d0 0x7f467c06e980 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f4678005950 tx=0x7f467800b500 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:52.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:52.444+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"} v 0) v1 -- 0x7f4670005f70 con 0x7f46900ff1a0 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: Deploying daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:52.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' 
entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:52.693 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:52 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: Deploying daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T06:19:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:52 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:53.286 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.284+0000 7f468cff9700 1 -- 192.168.123.104:0/1869523952 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]=0 v6) v1 ==== 121+0+0 (secure 0 0 0) 0x7f468005bd50 con 0x7f46900ff1a0 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f467c06c4d0 msgr2=0x7f467c06e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f467c06c4d0 0x7f467c06e980 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f4678005950 tx=0x7f467800b500 comp rx=0 tx=0).stop 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 msgr2=0x7f4690195b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f4690195b40 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f4680000c00 tx=0x7f4680004990 comp rx=0 tx=0).stop 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 shutdown_connections 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] 
conn(0x7f467c06c4d0 0x7f467c06e980 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f46900ff1a0 0x7f4690195b40 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 --2- 192.168.123.104:0/1869523952 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4690100440 0x7f4690196080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 >> 192.168.123.104:0/1869523952 conn(0x7f46900fa7b0 msgr2=0x7f4690103670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 shutdown_connections 2026-03-10T06:19:53.290 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.286+0000 7f468ffff700 1 -- 192.168.123.104:0/1869523952 wait complete. 2026-03-10T06:19:53.336 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T06:19:53.339 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:19:53.339 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs set cephfs inline_data true --yes-i-really-really-mean-it' 2026-03-10T06:19:53.510 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:53.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: pgmap v80: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: Deploying daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='client.? 
192.168.123.104:0/1869523952' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:boot 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/1869523952' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:boot 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:53.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:53 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:53.781 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: pgmap v80: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: Deploying daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/1869523952' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:boot 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/1869523952' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:boot 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:53.782 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:53 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:53.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.815+0000 7f99522da700 1 -- 192.168.123.104:0/1565804736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c101490 msgr2=0x7f994c103910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:53.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.815+0000 7f99522da700 1 --2- 192.168.123.104:0/1565804736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c101490 0x7f994c103910 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f993c009b00 tx=0x7f993c009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:53.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.818+0000 7f99522da700 1 -- 192.168.123.104:0/1565804736 shutdown_connections 2026-03-10T06:19:53.820 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.818+0000 7f99522da700 1 --2- 192.168.123.104:0/1565804736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c101490 0x7f994c103910 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.818+0000 7f99522da700 1 --2- 192.168.123.104:0/1565804736 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f994c0feb30 0x7f994c100f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.818+0000 7f99522da700 1 -- 192.168.123.104:0/1565804736 >> 192.168.123.104:0/1565804736 conn(0x7f994c0fa740 msgr2=0x7f994c0fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:53.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.819+0000 7f99522da700 1 -- 192.168.123.104:0/1565804736 shutdown_connections 2026-03-10T06:19:53.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.819+0000 7f99522da700 1 -- 192.168.123.104:0/1565804736 wait complete. 
2026-03-10T06:19:53.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 Processor -- start 2026-03-10T06:19:53.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 -- start start 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f994c101490 0x7f994c072560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f994c072b80 con 0x7f994c0feb30 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99522da700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f994c1c3a90 con 0x7f994c101490 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99512d8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99512d8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:36678/0 (socket says 192.168.123.104:36678) 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.820+0000 7f99512d8700 1 -- 192.168.123.104:0/425069863 learned_addr learned my addr 192.168.123.104:0/425069863 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99512d8700 1 -- 192.168.123.104:0/425069863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f994c101490 msgr2=0x7f994c072560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99512d8700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f994c101490 0x7f994c072560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99512d8700 1 -- 192.168.123.104:0/425069863 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f993c0097e0 con 0x7f994c0feb30 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99512d8700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f994800b700 tx=0x7f994800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9948010820 con 0x7f994c0feb30 2026-03-10T06:19:53.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99427fc700 1 -- 
192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9948010e60 con 0x7f994c0feb30 2026-03-10T06:19:53.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9948017570 con 0x7f994c0feb30 2026-03-10T06:19:53.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f994c1c3c90 con 0x7f994c0feb30 2026-03-10T06:19:53.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.821+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f994c1c4160 con 0x7f994c0feb30 2026-03-10T06:19:53.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.823+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9948010980 con 0x7f994c0feb30 2026-03-10T06:19:53.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.823+0000 7f99427fc700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 0x7f993806e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:53.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.823+0000 7f9950ad7700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 0x7f993806e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:53.831 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.825+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f994c06bf10 con 0x7f994c0feb30 2026-03-10T06:19:53.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.827+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f994808b0d0 con 0x7f994c0feb30 2026-03-10T06:19:53.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.828+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9948055830 con 0x7f994c0feb30 2026-03-10T06:19:53.832 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.829+0000 7f9950ad7700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 0x7f993806e930 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f993c009ad0 tx=0x7f993c000bc0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:53.985 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:53.982+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true} v 0) v1 -- 0x7f994c04ea50 con 0x7f994c0feb30 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.450+0000 7f99427fc700 1 -- 192.168.123.104:0/425069863 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]=0 inline data enabled v8) 
v1 ==== 168+0+0 (secure 0 0 0) 0x7f9948058e50 con 0x7f994c0feb30 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.453+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 msgr2=0x7f993806e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.453+0000 7f99522da700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 0x7f993806e930 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f993c009ad0 tx=0x7f993c000bc0 comp rx=0 tx=0).stop 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.453+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 msgr2=0x7f994c072020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.453+0000 7f99522da700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f994800b700 tx=0x7f994800bac0 comp rx=0 tx=0).stop 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.454+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 shutdown_connections 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.454+0000 7f99522da700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f993806c480 0x7f993806e930 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.454+0000 7f99522da700 1 --2- 192.168.123.104:0/425069863 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f994c0feb30 0x7f994c072020 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.455+0000 7f99522da700 1 --2- 192.168.123.104:0/425069863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f994c101490 0x7f994c072560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.455+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 >> 192.168.123.104:0/425069863 conn(0x7f994c0fa740 msgr2=0x7f994c0fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.455+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 shutdown_connections 2026-03-10T06:19:54.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:54.455+0000 7f99522da700 1 -- 192.168.123.104:0/425069863 wait complete. 2026-03-10T06:19:54.466 INFO:teuthology.orchestra.run.vm04.stderr:inline data enabled 2026-03-10T06:19:54.490 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:54 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/425069863' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-10T06:19:54.490 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:54 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:54.490 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:54 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:54.509 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T06:19:54.512 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local
2026-03-10T06:19:54.512 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs dump'
2026-03-10T06:19:54.722 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:19:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:54 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/425069863' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch
2026-03-10T06:19:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:54 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:54 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:19:55.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.079+0000 7fcba479e700 1 -- 192.168.123.104:0/780432332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c072360 msgr2=0x7fcb9c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:55.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.079+0000 7fcba479e700 1 --2- 192.168.123.104:0/780432332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c072360 0x7fcb9c0770e0 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7fcb9400cd40 tx=0x7fcb9400a320 comp rx=0 tx=0).stop
2026-03-10T06:19:55.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.080+0000 7fcba479e700 1 -- 192.168.123.104:0/780432332 shutdown_connections 2026-03-10T06:19:55.082
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.080+0000 7fcba479e700 1 --2- 192.168.123.104:0/780432332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c072360 0x7fcb9c0770e0 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.080+0000 7fcba479e700 1 --2- 192.168.123.104:0/780432332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 0x7fcb9c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.080+0000 7fcba479e700 1 -- 192.168.123.104:0/780432332 >> 192.168.123.104:0/780432332 conn(0x7fcb9c06d1a0 msgr2=0x7fcb9c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:55.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 -- 192.168.123.104:0/780432332 shutdown_connections 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 -- 192.168.123.104:0/780432332 wait complete. 
2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 Processor -- start 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 -- start start 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 0x7fcb9c131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.081+0000 7fcba479e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba479e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb9c131d90 con 0x7fcb9c131890 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba479e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb9c131ed0 con 0x7fcb9c071980 2026-03-10T06:19:55.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:36696/0 (socket says 192.168.123.104:36696) 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 -- 192.168.123.104:0/2970064130 learned_addr learned my addr 192.168.123.104:0/2970064130 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba253a700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 0x7fcb9c131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 -- 192.168.123.104:0/2970064130 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 msgr2=0x7fcb9c131350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 0x7fcb9c131350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 -- 192.168.123.104:0/2970064130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb9400c9f0 con 0x7fcb9c131890 2026-03-10T06:19:55.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.082+0000 7fcba1d39700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7fcb9400bb40 tx=0x7fcb9400bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:19:55.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.084+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb9400dea0 con 0x7fcb9c131890 2026-03-10T06:19:55.088 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.084+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb9c07fa00 con 0x7fcb9c131890 2026-03-10T06:19:55.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.084+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb9c07ff00 con 0x7fcb9c131890 2026-03-10T06:19:55.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.086+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcb9c12b500 con 0x7fcb9c131890 2026-03-10T06:19:55.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.090+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcb94009d70 con 0x7fcb9c131890 2026-03-10T06:19:55.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.090+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb9401f920 con 0x7fcb9c131890 2026-03-10T06:19:55.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.090+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcb94004030 con 0x7fcb9c131890 2026-03-10T06:19:55.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.091+0000 
7fcb937fe700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 0x7fcb8806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.092+0000 7fcba253a700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 0x7fcb8806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:55.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.092+0000 7fcba253a700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 0x7fcb8806ed20 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fcb980099e0 tx=0x7fcb98008040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:55.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.094+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fcb9408dbe0 con 0x7fcb9c131890 2026-03-10T06:19:55.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.095+0000 7fcb937fe700 1 -- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcb940912a0 con 0x7fcb9c131890 2026-03-10T06:19:55.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.243+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fcb9c02d0b0 con 0x7fcb9c131890 2026-03-10T06:19:55.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.244+0000 7fcb937fe700 1 
-- 192.168.123.104:0/2970064130 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 8 v8) v1 ==== 75+0+1813 (secure 0 0 0) 0x7fcb9c02d0b0 con 0x7fcb9c131890
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:e8
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1)
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:epoch 8
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:54.442893+0000
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:root 0
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {}
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:in 0
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508}
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:failed
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:damaged
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:stopped
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3]
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:balancer
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:19:55.247 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons:
2026-03-10T06:19:55.248 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:19:55.248 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:19:55.248 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:19:55.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.247+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 msgr2=0x7fcb8806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:55.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.247+0000 7fcba479e700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 0x7fcb8806ed20 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fcb980099e0 tx=0x7fcb98008040 comp rx=0 tx=0).stop
2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.247+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 msgr2=0x7fcb9c07f4c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.247+0000 7fcba479e700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7fcb9400bb40 tx=0x7fcb9400bc20 comp rx=0 tx=0).stop 2026-03-10T06:19:55.250
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 shutdown_connections 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fcb8806c870 0x7fcb8806ed20 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb9c071980 0x7fcb9c131350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 --2- 192.168.123.104:0/2970064130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcb9c131890 0x7fcb9c07f4c0 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 >> 192.168.123.104:0/2970064130 conn(0x7fcb9c06d1a0 msgr2=0x7fcb9c076420 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 shutdown_connections 2026-03-10T06:19:55.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.248+0000 7fcba479e700 1 -- 192.168.123.104:0/2970064130 wait complete. 
2026-03-10T06:19:55.251 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 8
2026-03-10T06:19:55.353 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"'
2026-03-10T06:19:55.534 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:19:55.572 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: pgmap v81: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 170 B/s rd, 170 B/s wr, 0 op/s
2026-03-10T06:19:55.572 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED)
2026-03-10T06:19:55.572 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/425069863' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished
2026-03-10T06:19:55.572 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:active
2026-03-10T06:19:55.572 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby
2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='client.?
192.168.123.104:0/2970064130' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.573 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:55 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: pgmap v81: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 170 B/s rd, 170 B/s wr, 0 op/s 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/425069863' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] up:active 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/2970064130' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:55.795 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:55 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:19:55.841 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.839+0000 7f74368a0700 1 -- 192.168.123.104:0/2978642108 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 msgr2=0x7f74300721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:55.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.839+0000 7f74368a0700 1 --2- 192.168.123.104:0/2978642108 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 0x7f74300721c0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7f7418009b00 tx=0x7f7418009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:55.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.840+0000 7f74368a0700 1 -- 192.168.123.104:0/2978642108 shutdown_connections 2026-03-10T06:19:55.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.840+0000 7f74368a0700 1 --2- 192.168.123.104:0/2978642108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f74301081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.840+0000 7f74368a0700 1 --2- 192.168.123.104:0/2978642108 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 0x7f74300721c0 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.842 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.840+0000 7f74368a0700 1 -- 192.168.123.104:0/2978642108 >> 192.168.123.104:0/2978642108 conn(0x7f743006d3e0 msgr2=0x7f743006f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.841+0000 7f74368a0700 1 -- 192.168.123.104:0/2978642108 shutdown_connections 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 -- 192.168.123.104:0/2978642108 wait complete. 
2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 Processor -- start 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 -- start start 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 0x7f7430116a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74301175b0 con 0x7f7430071db0 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.842+0000 7f74368a0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74301a16f0 con 0x7f7430107d50 2026-03-10T06:19:55.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:55.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:43658/0 (socket says 192.168.123.104:43658) 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 -- 192.168.123.104:0/3498997775 learned_addr learned my addr 192.168.123.104:0/3498997775 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 -- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 msgr2=0x7f7430116a00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 0x7f7430116a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 -- 192.168.123.104:0/3498997775 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74180097e0 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.843+0000 7f742f7fe700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f742000b700 tx=0x7f742000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.844+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7420010840 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.844+0000 7f74368a0700 1 -- 
192.168.123.104:0/3498997775 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74301a19d0 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.844+0000 7f74368a0700 1 -- 192.168.123.104:0/3498997775 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74301a1fa0 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.845+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7420010e80 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.845+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f742000d590 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.846+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f742000d770 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.847+0000 7f742d7fa700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 0x7f741c06ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.847+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f742008c3a0 con 0x7f7430107d50 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.847+0000 7f742ffff700 1 --2- 192.168.123.104:0/3498997775 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 0x7f741c06ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:55.849 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.847+0000 7f742ffff700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 0x7f741c06ec00 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f7418009ad0 tx=0x7f7418009f90 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:55.852 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.848+0000 7f74368a0700 1 -- 192.168.123.104:0/3498997775 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7410005320 con 0x7f7430107d50 2026-03-10T06:19:55.853 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.852+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7420056b20 con 0x7f7430107d50 2026-03-10T06:19:56.001 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.998+0000 7f74368a0700 1 -- 192.168.123.104:0/3498997775 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f7410005f70 con 0x7f7430107d50 2026-03-10T06:19:56.003 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:55.998+0000 7f742d7fa700 1 -- 192.168.123.104:0/3498997775 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 9 v9) v1 ==== 93+0+4759 (secure 0 0 0) 0x7f742005a140 con 0x7f7430107d50 2026-03-10T06:19:56.004 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 
1 -- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 msgr2=0x7f741c06ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.004 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 0x7f741c06ec00 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f7418009ad0 tx=0x7f7418009f90 comp rx=0 tx=0).stop 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 -- 192.168.123.104:0/3498997775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 msgr2=0x7f7430116f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f742000b700 tx=0x7f742000bac0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 -- 192.168.123.104:0/3498997775 shutdown_connections 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f741c06c750 0x7f741c06ec00 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7430071db0 0x7f7430116a00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.005 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 --2- 192.168.123.104:0/3498997775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7430107d50 0x7f7430116f40 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 -- 192.168.123.104:0/3498997775 >> 192.168.123.104:0/3498997775 conn(0x7f743006d3e0 msgr2=0x7f743010af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 -- 192.168.123.104:0/3498997775 shutdown_connections 2026-03-10T06:19:56.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.003+0000 7f7426ffd700 1 -- 192.168.123.104:0/3498997775 wait complete. 2026-03-10T06:19:56.006 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 9 2026-03-10T06:19:56.015 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:19:56.073 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'while ! ceph --format=json mds versions | jq -e ". 
| add == 4"; do sleep 1; done' 2026-03-10T06:19:56.289 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.625+0000 7f4156ccc700 1 -- 192.168.123.104:0/416225189 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 msgr2=0x7f41501028e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.625+0000 7f4156ccc700 1 --2- 192.168.123.104:0/416225189 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f41501028e0 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f414c009ab0 tx=0x7f414c009dc0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 -- 192.168.123.104:0/416225189 shutdown_connections 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 --2- 192.168.123.104:0/416225189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 0x7f4150103b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 --2- 192.168.123.104:0/416225189 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f41501028e0 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 -- 192.168.123.104:0/416225189 >> 192.168.123.104:0/416225189 conn(0x7f41500fda60 msgr2=0x7f41500ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 -- 
192.168.123.104:0/416225189 shutdown_connections 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 -- 192.168.123.104:0/416225189 wait complete. 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.626+0000 7f4156ccc700 1 Processor -- start 2026-03-10T06:19:56.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4156ccc700 1 -- start start 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4156ccc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4156ccc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 0x7f41501982e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4156ccc700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4150198900 con 0x7f41501024d0 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4156ccc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f415019d310 con 0x7f41501036d0 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:36730/0 (socket says 192.168.123.104:36730) 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 -- 192.168.123.104:0/991190602 learned_addr learned my addr 192.168.123.104:0/991190602 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f41554c9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 0x7f41501982e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 -- 192.168.123.104:0/991190602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 msgr2=0x7f41501982e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 0x7f41501982e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.627+0000 7f4155cca700 1 -- 192.168.123.104:0/991190602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f41400097e0 con 0x7f41501024d0 2026-03-10T06:19:56.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4155cca700 1 --2- 192.168.123.104:0/991190602 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f414c000c00 tx=0x7f414c00f670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:56.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f414c01d070 con 0x7f41501024d0 2026-03-10T06:19:56.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f414c00fc10 con 0x7f41501024d0 2026-03-10T06:19:56.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f414c017690 con 0x7f41501024d0 2026-03-10T06:19:56.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4156ccc700 1 -- 192.168.123.104:0/991190602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f414c009710 con 0x7f41501024d0 2026-03-10T06:19:56.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.628+0000 7f4156ccc700 1 -- 192.168.123.104:0/991190602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4150075420 con 0x7f41501024d0 2026-03-10T06:19:56.632 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.630+0000 7f4156ccc700 1 -- 192.168.123.104:0/991190602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f415004ea50 con 0x7f41501024d0 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: 
mds.? [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] up:standby-replay 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3498997775' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.637 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:56 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.637 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.632+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f414c021a80 con 0x7f41501024d0 2026-03-10T06:19:56.637 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.632+0000 7f4146ffd700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 0x7f413c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:56.638 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.632+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f414c08c8e0 con 0x7f41501024d0 2026-03-10T06:19:56.638 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.636+0000 7f4146ffd700 1 -- 
192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f414c057190 con 0x7f41501024d0 2026-03-10T06:19:56.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.637+0000 7f41554c9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 0x7f413c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:56.639 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.637+0000 7f41554c9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 0x7f413c06ec50 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4140005fd0 tx=0x7f4140009500 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:56.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.809+0000 7f4156ccc700 1 -- 192.168.123.104:0/991190602 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f4150075e40 con 0x7f41501024d0 2026-03-10T06:19:56.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.809+0000 7f4146ffd700 1 -- 192.168.123.104:0/991190602 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v9) v1 ==== 78+0+83 (secure 0 0 0) 0x7f414c026090 con 0x7f41501024d0 2026-03-10T06:19:56.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.812+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 msgr2=0x7f413c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.812+0000 
7f4144ff9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 0x7f413c06ec50 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4140005fd0 tx=0x7f4140009500 comp rx=0 tx=0).stop 2026-03-10T06:19:56.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.813+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 msgr2=0x7f4150197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:56.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.813+0000 7f4144ff9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f414c000c00 tx=0x7f414c00f670 comp rx=0 tx=0).stop 2026-03-10T06:19:56.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.814+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 shutdown_connections 2026-03-10T06:19:56.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.814+0000 7f4144ff9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f413c06c7a0 0x7f413c06ec50 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.814+0000 7f4144ff9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f41501024d0 0x7f4150197da0 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:56.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.814+0000 7f4144ff9700 1 --2- 192.168.123.104:0/991190602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41501036d0 0x7f41501982e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:19:56.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.814+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 >> 192.168.123.104:0/991190602 conn(0x7f41500fda60 msgr2=0x7f4150106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:56.817 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.816+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 shutdown_connections 2026-03-10T06:19:56.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:56.816+0000 7f4144ff9700 1 -- 192.168.123.104:0/991190602 wait complete. 2026-03-10T06:19:56.830 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] up:standby-replay 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3498997775' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:56 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:56.897 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 
2026-03-10T06:19:56.900 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 2026-03-10T06:19:57.081 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 -- 192.168.123.104:0/4167416622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f553410c8b0 msgr2=0x7f553410cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 --2- 192.168.123.104:0/4167416622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f553410c8b0 0x7f553410cc80 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f5524009b00 tx=0x7f5524009e10 comp rx=0 tx=0).stop 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 -- 192.168.123.104:0/4167416622 shutdown_connections 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 --2- 192.168.123.104:0/4167416622 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5534071e40 0x7f55340722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 --2- 192.168.123.104:0/4167416622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f553410c8b0 0x7f553410cc80 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 -- 192.168.123.104:0/4167416622 >> 192.168.123.104:0/4167416622 conn(0x7f553406c6c0 
msgr2=0x7f553406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:57.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 -- 192.168.123.104:0/4167416622 shutdown_connections 2026-03-10T06:19:57.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.404+0000 7f553973c700 1 -- 192.168.123.104:0/4167416622 wait complete. 2026-03-10T06:19:57.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 Processor -- start 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 -- start start 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5534071e40 0x7f553407cef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5534081a70 con 0x7f5534071e40 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f553973c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5534081be0 con 0x7f553407d430 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f5532ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5534071e40 0x7f553407cef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f55327fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f55327fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:43680/0 (socket says 192.168.123.104:43680) 2026-03-10T06:19:57.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.405+0000 7f55327fc700 1 -- 192.168.123.104:0/255816013 learned_addr learned my addr 192.168.123.104:0/255816013 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f55327fc700 1 -- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5534071e40 msgr2=0x7f553407cef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f55327fc700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5534071e40 0x7f553407cef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f55327fc700 1 -- 192.168.123.104:0/255816013 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55240097e0 con 0x7f553407d430 2026-03-10T06:19:57.409 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f55327fc700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f552c00ba20 tx=0x7f552c00bd30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f552c015070 con 0x7f553407d430 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5534081ec0 con 0x7f553407d430 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.406+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5534082410 con 0x7f553407d430 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.407+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f552c01b900 con 0x7f553407d430 2026-03-10T06:19:57.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.407+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f552c012bd0 con 0x7f553407d430 2026-03-10T06:19:57.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.408+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f552c019070 con 0x7f553407d430 
2026-03-10T06:19:57.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.409+0000 7f551bfff700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 0x7f551c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:57.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.409+0000 7f5532ffd700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 0x7f551c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:57.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.410+0000 7f5532ffd700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 0x7f551c06eb80 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5524009ad0 tx=0x7f5524017040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:57.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.410+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f552c0907e0 con 0x7f553407d430 2026-03-10T06:19:57.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.410+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5520005320 con 0x7f553407d430 2026-03-10T06:19:57.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.413+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f552c05af60 con 0x7f553407d430 
2026-03-10T06:19:57.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.562+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f5520005f70 con 0x7f553407d430 2026-03-10T06:19:57.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.562+0000 7f551bfff700 1 -- 192.168.123.104:0/255816013 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 10 v10) v1 ==== 94+0+4760 (secure 0 0 0) 0x7f552c05e580 con 0x7f553407d430 2026-03-10T06:19:57.565 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:19:57.565 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":10,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":6}],"filesystems":[{"mdsmap":{"epoch":9,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:19:55.449951+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14508},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14508":{"gid":14508,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":3,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6827/2274683007","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2274683007},{"type":"v1","addr":"192.168.123.104:6827","nonce":2274683007}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24299":{"gid":24299,"name":"cephfs.vm06.wzhqon","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.106:6825/3071631026","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":3071631026},{"type":"v1","addr":"192.168.123.106:6825","nonce":3071631026}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 msgr2=0x7f551c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 0x7f551c06eb80 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5524009ad0 tx=0x7f5524017040 comp rx=0 tx=0).stop 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 msgr2=0x7f553407d8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f552c00ba20 tx=0x7f552c00bd30 comp rx=0 tx=0).stop 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 shutdown_connections 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f551c06c6d0 0x7f551c06eb80 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5534071e40 0x7f553407cef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.568 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 --2- 192.168.123.104:0/255816013 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f553407d430 0x7f553407d8a0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 >> 192.168.123.104:0/255816013 conn(0x7f553406c6c0 msgr2=0x7f5534070070 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 shutdown_connections 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:57.565+0000 7f553973c700 1 -- 192.168.123.104:0/255816013 wait complete. 2026-03-10T06:19:57.568 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 10 2026-03-10T06:19:57.613 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 50} 2026-03-10T06:19:57.613 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-10T06:19:57.624 INFO:tasks.ceph_fuse:Running ceph_fuse task... 
2026-03-10T06:19:57.624 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-10T06:19:57.624 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-10T06:19:57.624 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T06:19:57.624 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T06:19:57.624 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-10T06:19:57.624 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T06:19:57.624 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T06:19:57.624 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:57.624 DEBUG:teuthology.orchestra.run.vm04:> ip netns list 2026-03-10T06:19:57.640 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:57.640 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link delete ceph-brx 2026-03-10T06:19:57.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.8 KiB/s wr, 7 op/s 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/991190602' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: mds.? 
[v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:standby 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:57.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:57 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:57.711 INFO:teuthology.orchestra.run.vm04.stderr:Cannot find device "ceph-brx" 2026-03-10T06:19:57.712 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T06:19:57.712 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:57.712 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-10T06:19:57.726 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:57.726 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link delete ceph-brx 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.8 KiB/s wr, 7 op/s 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: from='client.? 
192.168.123.104:0/991190602' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:standby 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:19:57.788 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:57 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:19:57.795 INFO:teuthology.orchestra.run.vm06.stderr:Cannot find device "ceph-brx" 2026-03-10T06:19:57.797 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T06:19:57.797 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 
2026-03-10T06:19:57.797 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T06:19:57.797 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs ls 2026-03-10T06:19:57.962 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:19:58.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 -- 192.168.123.104:0/3260330765 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 msgr2=0x7fb614108b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3260330765 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb614108b30 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc009b50 tx=0x7fb5fc009e60 comp rx=0 tx=0).stop 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 -- 192.168.123.104:0/3260330765 shutdown_connections 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3260330765 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb614102bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3260330765 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb614108b30 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.241+0000 7fb619b1f700 1 -- 192.168.123.104:0/3260330765 >> 
192.168.123.104:0/3260330765 conn(0x7fb6140fe280 msgr2=0x7fb614100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 -- 192.168.123.104:0/3260330765 shutdown_connections 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 -- 192.168.123.104:0/3260330765 wait complete. 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 Processor -- start 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 -- start start 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb61419cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:58.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb61419d520 con 0x7fb614108760 2026-03-10T06:19:58.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.243+0000 7fb619b1f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb61419d690 con 0x7fb614102760 2026-03-10T06:19:58.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:58.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:36760/0 (socket says 192.168.123.104:36760) 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 -- 192.168.123.104:0/3996396004 learned_addr learned my addr 192.168.123.104:0/3996396004 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 -- 192.168.123.104:0/3996396004 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 msgr2=0x7fb61419cf50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb61419cf50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 -- 192.168.123.104:0/3996396004 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5fc0097e0 con 0x7fb614108760 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb612ffd700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7fb60400dc40 tx=0x7fb60400be10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb6040099a0 con 0x7fb614108760 2026-03-10T06:19:58.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb6140788b0 con 0x7fb614108760 2026-03-10T06:19:58.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb614078e00 con 0x7fb614108760 2026-03-10T06:19:58.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb604010460 con 0x7fb614108760 2026-03-10T06:19:58.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.244+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb60400f660 con 0x7fb614108760 2026-03-10T06:19:58.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.246+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb60400f800 con 0x7fb614108760 2026-03-10T06:19:58.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.246+0000 7fb610ff9700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 0x7fb60006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:19:58.248 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.246+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb60408b5b0 con 0x7fb614108760 2026-03-10T06:19:58.251 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.247+0000 7fb6137fe700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 0x7fb60006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:19:58.251 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.247+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5f4005320 con 0x7fb614108760 2026-03-10T06:19:58.251 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.248+0000 7fb6137fe700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 0x7fb60006eb30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc00b5c0 tx=0x7fb5fc005fb0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:19:58.251 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.249+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb6040597c0 con 0x7fb614108760 2026-03-10T06:19:58.384 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.381+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fb5f4006200 con 0x7fb614108760 2026-03-10T06:19:58.384 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.382+0000 7fb610ff9700 1 -- 192.168.123.104:0/3996396004 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7fb604020020 con 0x7fb614108760 2026-03-10T06:19:58.384 INFO:teuthology.orchestra.run.vm04.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T06:19:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.384+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 msgr2=0x7fb60006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.384+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 0x7fb60006eb30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc00b5c0 tx=0x7fb5fc005fb0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 msgr2=0x7fb6140782b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7fb60400dc40 tx=0x7fb60400be10 comp rx=0 tx=0).stop 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 shutdown_connections 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 --2- 
192.168.123.104:0/3996396004 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fb60006c680 0x7fb60006eb30 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb61419cf50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 --2- 192.168.123.104:0/3996396004 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614108760 0x7fb6140782b0 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 >> 192.168.123.104:0/3996396004 conn(0x7fb6140fe280 msgr2=0x7fb6140ffac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 shutdown_connections 2026-03-10T06:19:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58.385+0000 7fb619b1f700 1 -- 192.168.123.104:0/3996396004 wait complete. 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:Mounting Ceph FS. 
Following are details of mount; remember "None" represents Python type None - 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm04.local 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T06:19:58.431 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-10T06:19:58.431 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:58.431 DEBUG:teuthology.orchestra.run.vm04:> ip addr 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout: inet6 ::1/128 scope host 2026-03-10T06:19:58.446 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: link/ether 52:55:00:00:00:04 brd ff:ff:ff:ff:ff:ff 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: altname enp0s3 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: altname ens3 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: inet 192.168.123.104/24 brd 
192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft 3147sec preferred_lft 3147sec 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: inet6 fe80::5055:ff:fe00:4/64 scope link noprefixroute 2026-03-10T06:19:58.447 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:19:58.447 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link add name ceph-brx type bridge 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> sudo ip addr flush dev ceph-brx 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set ceph-brx up 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T06:19:58.447 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T06:19:58.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:19:58.531 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:58 vm04 ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:58.531 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:58 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/255816013' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:19:58.531 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:58 vm04 ceph-mon[51058]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:standby 2026-03-10T06:19:58.531 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:58 vm04 ceph-mon[51058]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:58.531 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:58 vm04 ceph-mon[51058]: from='client.? 192.168.123.104:0/3996396004' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T06:19:58.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T06:19:58.598 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:58.598 DEBUG:teuthology.orchestra.run.vm04:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T06:19:58.671 INFO:teuthology.orchestra.run.vm04.stdout:1 2026-03-10T06:19:58.673 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:58.673 DEBUG:teuthology.orchestra.run.vm04:> ip r 2026-03-10T06:19:58.728 INFO:teuthology.orchestra.run.vm04.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.104 metric 100 2026-03-10T06:19:58.728 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.104 metric 100 2026-03-10T06:19:58.728 INFO:teuthology.orchestra.run.vm04.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -t nat -A 
POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T06:19:58.729 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T06:19:58.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:19:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:58 vm06 ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:19:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:58 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/255816013' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:19:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:58 vm06 ceph-mon[58974]: mds.? [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:standby 2026-03-10T06:19:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:58 vm06 ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:19:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:58 vm06 ceph-mon[58974]: from='client.? 192.168.123.104:0/3996396004' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T06:19:58.872 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:58 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:19:58.876 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:58.876 DEBUG:teuthology.orchestra.run.vm04:> ip netns list 2026-03-10T06:19:58.934 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:58.934 DEBUG:teuthology.orchestra.run.vm04:> ip netns list-id 2026-03-10T06:19:58.989 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:19:58.989 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T06:19:58.989 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T06:19:58.989 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-10T06:19:58.989 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T06:19:59.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:19:59.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:19:59.098 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-10T06:19:59.098 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T06:19:59.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:19:59.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:19:59.253 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:19:59.253 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T06:19:59.253 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set brx.0 up 2026-03-10T06:19:59.253 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T06:19:59.253 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T06:19:59.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:19:59.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T06:19:59.364 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-10T06:19:59.364 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T06:19:59.364 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-10T06:19:59.423 INFO:teuthology.orchestra.run.vm04.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-10T06:19:59.423 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T06:19:59.423 DEBUG:teuthology.orchestra.run.vm04:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-10T06:19:59.482 DEBUG:teuthology.orchestra.run.vm04:> sudo modprobe fuse 2026-03-10T06:19:59.549 DEBUG:teuthology.orchestra.run.vm04:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T06:19:59.607 INFO:teuthology.orchestra.run.vm04.stdout:/proc 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/dev 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/security 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/dev/shm 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/dev/pts 
2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/cgroup 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/pstore 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/bpf 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/config 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/ 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/selinux 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/dev/hugepages 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/dev/mqueue 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/debug 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/tracing 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/fuse/connections 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/1000 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/0542a38865c50d4ceb733209e3756dea09521c62400a1254fae47ad6023aba59/merged 2026-03-10T06:19:59.608 
INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/25d63dcf4f0fbac14a3b51709b924ff35c0e6e74d58aa316fd18c778e9966141/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/0 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/dc3981d9fc0d52c2d8ac0ff52927847dab7f94d4263c7f52d0f45da7129c8601/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/7a01c2dc3e4b4779c1a8fcf92900e8dcd3c437224cefa365e308a50ffc262806/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/d997c7d127542a219f6f1f0300943d263a21081f698edd830322e2dd6b6ebadd/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/97bcf0fffa8ce713013f32ee9870fec6147e00c9519819c5b1c0d1f80a855214/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/b96edbdb86d911c1c9b8c1cceaf9c463edd48e7c5cba5a66c247e69e4bf87134/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/6eb8603edfbf891d7719713b45811a594c4616c72c573c5dc7ce28ba800879d8/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/2a2d7a59582ea4fce4a4bf54dcc6210c76e373fe52161e197074d406e7027fe2/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/41448ba2396771df9e3aab1b51045315491819246529cac6df0cb93fc040be8e/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/1a426a1a8aed60dddf0bea4c457294bae1ddb61d0238a765d468362ff2d0beee/merged 2026-03-10T06:19:59.608 
INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/20555ec1adb8b3e6528b1efbe4f2cf6335568dd075e6db217623dfd95ab548c3/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/8e296f284b47ecb38c396dd615fecc371ed33c9a70fc735b089e5eb9d1d6c7af/merged 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T06:19:59.608 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T06:19:59.609 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:59.609 DEBUG:teuthology.orchestra.run.vm04:> ls /sys/fs/fuse/connections 2026-03-10T06:19:59.665 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T06:19:59.665 DEBUG:teuthology.orchestra.run.vm04:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-10T06:19:59.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:19:59 vm04.local ceph-mon[51058]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 952 B/s rd, 1.4 KiB/s wr, 5 op/s 2026-03-10T06:19:59.679 DEBUG:teuthology.orchestra.run.vm04:> sudo modprobe fuse 2026-03-10T06:19:59.708 DEBUG:teuthology.orchestra.run.vm04:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T06:19:59.754 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm04.stderr:ceph-fuse[92037]: starting ceph client 2026-03-10T06:19:59.754 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm04.stderr:2026-03-10T06:19:59.752+0000 7f8354b64480 -1 init, newargv = 0x55aaff9ef4a0 newargc=15 2026-03-10T06:19:59.762 
INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm04.stderr:ceph-fuse[92037]: starting fuse 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/proc 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/dev 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/security 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/dev/shm 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/dev/pts 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/cgroup 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/pstore 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/bpf 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/config 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/ 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/selinux 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/dev/hugepages 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/dev/mqueue 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/debug 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/tracing 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/fuse/connections 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T06:19:59.772 
INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/1000 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/0542a38865c50d4ceb733209e3756dea09521c62400a1254fae47ad6023aba59/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/25d63dcf4f0fbac14a3b51709b924ff35c0e6e74d58aa316fd18c778e9966141/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/0 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/dc3981d9fc0d52c2d8ac0ff52927847dab7f94d4263c7f52d0f45da7129c8601/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/7a01c2dc3e4b4779c1a8fcf92900e8dcd3c437224cefa365e308a50ffc262806/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/d997c7d127542a219f6f1f0300943d263a21081f698edd830322e2dd6b6ebadd/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/97bcf0fffa8ce713013f32ee9870fec6147e00c9519819c5b1c0d1f80a855214/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/b96edbdb86d911c1c9b8c1cceaf9c463edd48e7c5cba5a66c247e69e4bf87134/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/6eb8603edfbf891d7719713b45811a594c4616c72c573c5dc7ce28ba800879d8/merged 2026-03-10T06:19:59.772 
INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/2a2d7a59582ea4fce4a4bf54dcc6210c76e373fe52161e197074d406e7027fe2/merged 2026-03-10T06:19:59.772 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/41448ba2396771df9e3aab1b51045315491819246529cac6df0cb93fc040be8e/merged 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/1a426a1a8aed60dddf0bea4c457294bae1ddb61d0238a765d468362ff2d0beee/merged 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/20555ec1adb8b3e6528b1efbe4f2cf6335568dd075e6db217623dfd95ab548c3/merged 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/8e296f284b47ecb38c396dd615fecc371ed33c9a70fc735b089e5eb9d1d6c7af/merged 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run.vm04.stdout:/home/ubuntu/cephtest/mnt.0 2026-03-10T06:19:59.773 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:59.773 DEBUG:teuthology.orchestra.run.vm04:> ls /sys/fs/fuse/connections 2026-03-10T06:19:59.829 INFO:teuthology.orchestra.run.vm04.stdout:55 2026-03-10T06:19:59.829 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [55] 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> sudo stdin-killer -- python3 -c ' 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> import glob 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> import re 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> import os 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> import subprocess 2026-03-10T06:19:59.829 
DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> def _find_admin_socket(client_name): 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> files = glob.glob(asok_path) 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> # Given a non-glob path, it better be there 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> if "*" not in asok_path: 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> assert(len(files) == 1) 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> return files[0] 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T06:19:59.829 DEBUG:teuthology.orchestra.run.vm04:> for f in files: 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> contents = proc_f.read() 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> if mountpoint in contents: 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> return f 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> print(_find_admin_socket("client.0")) 2026-03-10T06:19:59.830 DEBUG:teuthology.orchestra.run.vm04:> ' 2026-03-10T06:19:59.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:19:59 vm06 ceph-mon[58974]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 952 B/s rd, 1.4 KiB/s wr, 5 op/s 2026-03-10T06:19:59.933 INFO:teuthology.orchestra.run.vm04.stdout:/var/run/ceph/ceph-client.0.92037.asok 2026-03-10T06:19:59.935 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:19:59 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T06:19:59.941 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.92037.asok 2026-03-10T06:19:59.941 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:19:59.941 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.92037.asok status 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "metadata": { 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "entity_id": "0", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "hostname": "vm04.local", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "pid": "92037", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "root": "/" 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "dentry_count": 0, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "dentry_pinned_count": 0, 2026-03-10T06:20:00.057 
INFO:teuthology.orchestra.run.vm04.stdout: "id": 24325, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "inst": { 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "name": { 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "type": "client", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "num": 24325 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "addr": { 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "type": "v1", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "addr": "192.168.123.104:0", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "nonce": 950847804 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "addr": { 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "type": "v1", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "addr": "192.168.123.104:0", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "nonce": 950847804 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "inst_str": "client.24325 192.168.123.104:0/950847804", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "addr_str": "192.168.123.104:0/950847804", 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "inode_count": 1, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "mds_epoch": 9, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "osd_epoch": 38, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "osd_epoch_barrier": 0, 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "blocklisted": false, 
2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout: "fs_name": "cephfs" 2026-03-10T06:20:00.057 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:20:00.064 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T06:20:00.064 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs ls 2026-03-10T06:20:00.228 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:20:00.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:00 vm04.local ceph-mon[51058]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:20:00.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:00 vm04.local ceph-mon[51058]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:20:00.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:00 vm04.local ceph-mon[51058]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.513+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/3954890067 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf14068490 msgr2=0x7fbf14068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.513+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/3954890067 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf14068490 0x7fbf14068900 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7fbf10009b50 tx=0x7fbf10009e60 comp rx=0 tx=0).stop 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.514+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/3954890067 shutdown_connections 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.514+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/3954890067 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf14068490 0x7fbf14068900 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.514+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/3954890067 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf141013a0 0x7fbf14101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.514+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/3954890067 >> 192.168.123.104:0/3954890067 conn(0x7fbf140754a0 msgr2=0x7fbf140758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.515+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/3954890067 shutdown_connections 2026-03-10T06:20:00.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.515+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/3954890067 
wait complete. 2026-03-10T06:20:00.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.515+0000 7fbf1c1e2700 1 Processor -- start 2026-03-10T06:20:00.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.515+0000 7fbf1c1e2700 1 -- start start 2026-03-10T06:20:00.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1c1e2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 0x7fbf14198350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1c1e2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1c1e2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf14198f70 con 0x7fbf141013a0 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1c1e2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf1419cd00 con 0x7fbf14068490 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1977d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1977d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:36782/0 (socket says 192.168.123.104:36782) 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1977d700 1 -- 192.168.123.104:0/117371979 learned_addr learned my addr 192.168.123.104:0/117371979 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf1977d700 1 -- 192.168.123.104:0/117371979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 msgr2=0x7fbf14198350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.516+0000 7fbf19f7e700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 0x7fbf14198350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf1977d700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 0x7fbf14198350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf1977d700 1 -- 192.168.123.104:0/117371979 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf100097e0 con 0x7fbf141013a0 2026-03-10T06:20:00.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf1977d700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7fbf10004bd0 tx=0x7fbf10005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:20:00.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf1001d070 con 0x7fbf141013a0 2026-03-10T06:20:00.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf1000bc00 con 0x7fbf141013a0 2026-03-10T06:20:00.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf1000f700 con 0x7fbf141013a0 2026-03-10T06:20:00.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf1419cf80 con 0x7fbf141013a0 2026-03-10T06:20:00.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf1419d390 con 0x7fbf141013a0 2026-03-10T06:20:00.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.517+0000 7fbf19f7e700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 0x7fbf14198350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:20:00.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.518+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf1404ea50 con 0x7fbf141013a0 2026-03-10T06:20:00.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.522+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbf10022a50 con 0x7fbf141013a0 2026-03-10T06:20:00.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.523+0000 7fbf0affd700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 0x7fbf0006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:00.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.523+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fbf1008d9e0 con 0x7fbf141013a0 2026-03-10T06:20:00.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.523+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbf100b97c0 con 0x7fbf141013a0 2026-03-10T06:20:00.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.523+0000 7fbf19f7e700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 0x7fbf0006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:00.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.524+0000 7fbf19f7e700 1 --2- 192.168.123.104:0/117371979 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 0x7fbf0006eb30 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fbf04009e20 tx=0x7fbf04009450 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:00.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.660+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fbf14066e40 con 0x7fbf141013a0 2026-03-10T06:20:00.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.661+0000 7fbf0affd700 1 -- 192.168.123.104:0/117371979 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7fbf10027070 con 0x7fbf141013a0 2026-03-10T06:20:00.663 INFO:teuthology.orchestra.run.vm04.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T06:20:00.665 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.663+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 msgr2=0x7fbf0006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:00.665 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.663+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 0x7fbf0006eb30 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fbf04009e20 tx=0x7fbf04009450 comp rx=0 tx=0).stop 2026-03-10T06:20:00.665 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.663+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 msgr2=0x7fbf14198890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:00.665 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7fbf10004bd0 tx=0x7fbf10005dc0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 shutdown_connections 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fbf0006c680 0x7fbf0006eb30 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf14068490 0x7fbf14198350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 --2- 192.168.123.104:0/117371979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf141013a0 0x7fbf14198890 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.664+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 >> 192.168.123.104:0/117371979 conn(0x7fbf140754a0 msgr2=0x7fbf140fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.665+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 shutdown_connections 2026-03-10T06:20:00.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:00.665+0000 7fbf1c1e2700 1 -- 192.168.123.104:0/117371979 wait 
complete. 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm06.local 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T06:20:00.736 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-10T06:20:00.736 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:00.736 DEBUG:teuthology.orchestra.run.vm06:> ip addr 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: inet6 ::1/128 scope host 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: link/ether 52:55:00:00:00:06 brd ff:ff:ff:ff:ff:ff 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: 
altname enp0s3 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: altname ens3 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: inet 192.168.123.106/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft 3119sec preferred_lft 3119sec 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: inet6 fe80::5055:ff:fe00:6/64 scope link noprefixroute 2026-03-10T06:20:00.752 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-10T06:20:00.752 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add name ceph-brx type bridge 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr flush dev ceph-brx 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set ceph-brx up 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T06:20:00.752 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-10T06:20:00.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:00 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:20:00.839 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:00 vm06 ceph-mon[58974]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:20:00.839 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:00 vm06 ceph-mon[58974]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:20:00.839 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:00 vm06 ceph-mon[58974]: fs cephfs 
has deprecated feature inline_data enabled. 2026-03-10T06:20:00.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:00 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T06:20:00.915 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:00.915 DEBUG:teuthology.orchestra.run.vm06:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T06:20:00.988 INFO:teuthology.orchestra.run.vm06.stdout:1 2026-03-10T06:20:00.989 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:00.989 DEBUG:teuthology.orchestra.run.vm06:> ip r 2026-03-10T06:20:01.047 INFO:teuthology.orchestra.run.vm06.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.106 metric 100 2026-03-10T06:20:01.047 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.106 metric 100 2026-03-10T06:20:01.047 INFO:teuthology.orchestra.run.vm06.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T06:20:01.047 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-10T06:20:01.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:20:01.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: command exited with status 0: exiting 
normally with same code! 2026-03-10T06:20:01.184 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:01.184 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-10T06:20:01.240 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:01.240 DEBUG:teuthology.orchestra.run.vm06:> ip netns list-id 2026-03-10T06:20:01.296 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:20:01.296 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-10T06:20:01.296 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T06:20:01.296 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-10T06:20:01.296 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-10T06:20:01.374 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:20:01.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:20:01.399 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-10T06:20:01.400 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-10T06:20:01.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:20:01.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:20:01.553 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T06:20:01.553 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-10T06:20:01.553 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set brx.0 up 2026-03-10T06:20:01.553 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T06:20:01.553 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-10T06:20:01.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T06:20:01.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:01 vm06.local ceph-mon[58974]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.3 KiB/s rd, 1.3 KiB/s wr, 6 op/s 2026-03-10T06:20:01.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:01 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/117371979' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T06:20:01.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T06:20:01.663 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-10T06:20:01.663 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T06:20:01.663 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:01.719 INFO:teuthology.orchestra.run.vm06.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-10T06:20:01.719 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T06:20:01.719 DEBUG:teuthology.orchestra.run.vm06:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:01.777 DEBUG:teuthology.orchestra.run.vm06:> sudo modprobe fuse 2026-03-10T06:20:01.843 DEBUG:teuthology.orchestra.run.vm06:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/proc 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/dev 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/security 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/dev/shm 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/dev/pts 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/cgroup 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/pstore 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/bpf 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/config 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/ 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/selinux 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:20:01.902 
INFO:teuthology.orchestra.run.vm06.stdout:/dev/hugepages 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/dev/mqueue 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/debug 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/tracing 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/fuse/connections 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/1000 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/0 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/fa1223ddc67e60d4d4666c464a973b4eadb72b522d5f1130aaeebb29e1cbd95e/merged 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/35157ea1799b55dc0db2dcc2a4a7d8afd4ab092584a4874509daf710f9ca8ce9/merged 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/99403544497e46fcefeb155cc92af50d604f690965a85b78688ba96874a46212/merged 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/54fd56c248b73f9cb674cd00ce4f1513b24115aa46eb73f15943f537f73ca3d5/merged 2026-03-10T06:20:01.902 
INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/37a431851c128e4a2da7b826a027432da425c2177b1e42b8fa4be2324f9e2ef7/merged 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/d0141c152d2808debe639c1fa897e27fcf6162a61a0f3cba483a18b44a7cf5b0/merged 2026-03-10T06:20:01.902 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/444945980d69e1c4edf81b74d78acf3b517be9f734da9469e9240427b943d4c4/merged 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/9d3bc87006bd9bed42ace29109bcf9efb8ab124894713f99cf97adcc2c11e9e8/merged 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/565279a20e230d12e2ff8b2f8fc8f331fb465cab573312fef65a1fe3791a7a69/merged 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/98b7c88e9945b5388d62cea6d65e45f7af5bf9c3e30a5d202187608afd7c55f6/merged 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T06:20:01.903 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:01.903 DEBUG:teuthology.orchestra.run.vm06:> ls /sys/fs/fuse/connections 2026-03-10T06:20:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:01 vm04.local ceph-mon[51058]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.3 KiB/s rd, 1.3 KiB/s wr, 6 op/s 2026-03-10T06:20:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:01 vm04.local ceph-mon[51058]: from='client.? 
192.168.123.104:0/117371979' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T06:20:01.959 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T06:20:01.959 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-10T06:20:02.001 DEBUG:teuthology.orchestra.run.vm06:> sudo modprobe fuse 2026-03-10T06:20:02.027 DEBUG:teuthology.orchestra.run.vm06:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T06:20:02.074 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm06.stderr:2026-03-10T06:20:02.073+0000 7f1da01ce480 -1 init, newargv = 0x55e56365b970 newargc=15 2026-03-10T06:20:02.074 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm06.stderr:ceph-fuse[81227]: starting ceph client 2026-03-10T06:20:02.083 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm06.stderr:ceph-fuse[81227]: starting fuse 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/proc 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/dev 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/security 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/dev/shm 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/dev/pts 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/cgroup 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/pstore 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/bpf 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/config 2026-03-10T06:20:02.095 
INFO:teuthology.orchestra.run.vm06.stdout:/ 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/selinux 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/dev/hugepages 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/dev/mqueue 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/debug 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/tracing 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/fuse/connections 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/1000 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/0 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/fa1223ddc67e60d4d4666c464a973b4eadb72b522d5f1130aaeebb29e1cbd95e/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/35157ea1799b55dc0db2dcc2a4a7d8afd4ab092584a4874509daf710f9ca8ce9/merged 2026-03-10T06:20:02.095 
INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/99403544497e46fcefeb155cc92af50d604f690965a85b78688ba96874a46212/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/54fd56c248b73f9cb674cd00ce4f1513b24115aa46eb73f15943f537f73ca3d5/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/37a431851c128e4a2da7b826a027432da425c2177b1e42b8fa4be2324f9e2ef7/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/d0141c152d2808debe639c1fa897e27fcf6162a61a0f3cba483a18b44a7cf5b0/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/444945980d69e1c4edf81b74d78acf3b517be9f734da9469e9240427b943d4c4/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/9d3bc87006bd9bed42ace29109bcf9efb8ab124894713f99cf97adcc2c11e9e8/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/565279a20e230d12e2ff8b2f8fc8f331fb465cab573312fef65a1fe3791a7a69/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/98b7c88e9945b5388d62cea6d65e45f7af5bf9c3e30a5d202187608afd7c55f6/merged 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run.vm06.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.095 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.096 DEBUG:teuthology.orchestra.run.vm06:> ls /sys/fs/fuse/connections 2026-03-10T06:20:02.154 
INFO:teuthology.orchestra.run.vm06.stdout:90 2026-03-10T06:20:02.155 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> sudo stdin-killer -- python3 -c ' 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> import glob 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> import re 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> import os 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> import subprocess 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> def _find_admin_socket(client_name): 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> files = glob.glob(asok_path) 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> # Given a non-glob path, it better be there 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> if "*" not in asok_path: 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> assert(len(files) == 1) 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> return files[0] 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> for f in files: 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> contents = proc_f.read() 
2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> if mountpoint in contents: 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> return f 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> print(_find_admin_socket("client.1")) 2026-03-10T06:20:02.155 DEBUG:teuthology.orchestra.run.vm06:> ' 2026-03-10T06:20:02.255 INFO:teuthology.orchestra.run.vm06.stdout:/var/run/ceph/ceph-client.1.81227.asok 2026-03-10T06:20:02.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-10T06:20:02 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T06:20:02.264 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.81227.asok 2026-03-10T06:20:02.264 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.264 DEBUG:teuthology.orchestra.run.vm06:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.81227.asok status 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "metadata": { 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "entity_id": "1", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "hostname": "vm06.local", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "pid": "81227", 2026-03-10T06:20:02.377 
INFO:teuthology.orchestra.run.vm06.stdout: "root": "/" 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "dentry_count": 0, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "dentry_pinned_count": 0, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "id": 24331, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "inst": { 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "name": { 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "type": "client", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "num": 24331 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "addr": { 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "type": "v1", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "addr": "192.168.144.1:0", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "nonce": 2593838473 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "addr": { 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "type": "v1", 2026-03-10T06:20:02.377 INFO:teuthology.orchestra.run.vm06.stdout: "addr": "192.168.144.1:0", 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "nonce": 2593838473 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "inst_str": "client.24331 192.168.144.1:0/2593838473", 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "addr_str": "192.168.144.1:0/2593838473", 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "inode_count": 1, 2026-03-10T06:20:02.378 
INFO:teuthology.orchestra.run.vm06.stdout: "mds_epoch": 9, 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "osd_epoch": 38, 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "osd_epoch_barrier": 0, 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "blocklisted": false, 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout: "fs_name": "cephfs" 2026-03-10T06:20:02.378 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-10T06:20:02.385 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.385 DEBUG:teuthology.orchestra.run.vm04:> stat --file-system '--printf=%T 2026-03-10T06:20:02.385 DEBUG:teuthology.orchestra.run.vm04:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.402 INFO:teuthology.orchestra.run.vm04.stdout:fuseblk 2026-03-10T06:20:02.402 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.402 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.402 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.469 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.469 DEBUG:teuthology.orchestra.run.vm06:> stat --file-system '--printf=%T 2026-03-10T06:20:02.469 DEBUG:teuthology.orchestra.run.vm06:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.486 INFO:teuthology.orchestra.run.vm06.stdout:fuseblk 2026-03-10T06:20:02.486 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.486 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:20:02.486 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.562 INFO:teuthology.run_tasks:Running task print... 2026-03-10T06:20:02.564 INFO:teuthology.task.print:**** done client 2026-03-10T06:20:02.564 INFO:teuthology.run_tasks:Running task parallel... 
2026-03-10T06:20:02.567 INFO:teuthology.task.parallel:starting parallel... 2026-03-10T06:20:02.567 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T06:20:02.568 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T06:20:02.568 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:20:02.568 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T06:20:02.568 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T06:20:02.568 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-10T06:20:02.570 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T06:20:02.570 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-10T06:20:02.570 INFO:tasks.workunit:timeout=3h 2026-03-10T06:20:02.570 INFO:tasks.workunit:cleanup=True 2026-03-10T06:20:02.570 DEBUG:teuthology.orchestra.run.vm04:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Device: 37h/55d Inode: 1 Links: 2 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 06:19:50.332471843 +0000 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 06:20:02.467197059 +0000 2026-03-10T06:20:02.591 INFO:teuthology.orchestra.run.vm04.stdout: Birth: - 2026-03-10T06:20:02.591 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-10T06:20:02.591 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-10T06:20:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:02 vm06.local ceph-mon[58974]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T06:20:02.667 DEBUG:teuthology.orchestra.run.vm06:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.685 INFO:teuthology.orchestra.run.vm06.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T06:20:02.686 
INFO:teuthology.orchestra.run.vm06.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-10 06:20:02.660847338 +0000 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-10 06:20:02.660847338 +0000 2026-03-10T06:20:02.686 INFO:teuthology.orchestra.run.vm06.stdout: Birth: - 2026-03-10T06:20:02.686 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-10T06:20:02.686 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-10T06:20:02.739 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:20:02.755 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T06:20:02.756 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T06:20:02.770 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:02 vm04.local ceph-mon[51058]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T06:20:02.797 INFO:tasks.workunit.client.0.vm04.stderr:Cloning 
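The per-client scratch-dir and checkout steps just logged follow a fixed pattern: `sudo install -d` for a `client.<id>` directory inside the mount, then `rm -rf` plus a fresh `git clone` pinned at the suite ref. A sketch that only assembles those commands (an illustrative helper, not teuthology's workunit task itself) looks like:

```python
# Assemble (but do not run) the per-client workunit setup commands seen
# above: a scratch dir inside the mount, then a fresh clone of the
# workunit repo checked out at the tested ref. Helper name and defaults
# are illustrative; repo/testdir defaults match this run's log.
def workunit_setup_cmds(client_id, ref,
                        repo="https://github.com/kshtsk/ceph.git",
                        testdir="/home/ubuntu/cephtest"):
    mnt = f"{testdir}/mnt.{client_id}"
    clone = f"{testdir}/clone.client.{client_id}"
    return [
        # cd mnt.<id> && sudo install -d -m 0755 --owner=ubuntu -- client.<id>
        ["sudo", "install", "-d", "-m", "0755", "--owner=ubuntu",
         "--", f"{mnt}/client.{client_id}"],
        # rm -rf clone.client.<id> && git clone ... && git checkout <ref>
        ["rm", "-rf", clone],
        ["git", "clone", repo, clone],
        ["git", "-C", clone, "checkout", ref],
    ]
```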
into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-10T06:20:02.815 INFO:tasks.workunit.client.1.vm06.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 2026-03-10T06:20:03.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.012+0000 7f6218d47700 1 -- 192.168.123.104:0/413982906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 msgr2=0x7f6214103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.012+0000 7f6218d47700 1 --2- 192.168.123.104:0/413982906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214103db0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f61fc009b00 tx=0x7f61fc009e10 comp rx=0 tx=0).stop 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 -- 192.168.123.104:0/413982906 shutdown_connections 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 --2- 192.168.123.104:0/413982906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214103db0 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 --2- 192.168.123.104:0/413982906 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 0x7f6214102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 -- 192.168.123.104:0/413982906 >> 192.168.123.104:0/413982906 conn(0x7f62140fdcf0 msgr2=0x7f6214100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 -- 192.168.123.104:0/413982906 
shutdown_connections 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 -- 192.168.123.104:0/413982906 wait complete. 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 Processor -- start 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.013+0000 7f6218d47700 1 -- start start 2026-03-10T06:20:03.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f6218d47700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 0x7f6214198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f6218d47700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f6218d47700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6214198b80 con 0x7f6214103960 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f6218d47700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6214198cc0 con 0x7f6214102760 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f620bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f620bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f6214103960 0x7f6214198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:36802/0 (socket says 192.168.123.104:36802) 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f620bfff700 1 -- 192.168.123.104:0/3205960403 learned_addr learned my addr 192.168.123.104:0/3205960403 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.014+0000 7f621259c700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 0x7f6214198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620bfff700 1 -- 192.168.123.104:0/3205960403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 msgr2=0x7f6214198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620bfff700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 0x7f6214198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620bfff700 1 -- 192.168.123.104:0/3205960403 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61fc0097e0 con 0x7f6214103960 2026-03-10T06:20:03.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620bfff700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214198560 
secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f61fc00b5c0 tx=0x7f61fc004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:03.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61fc01d070 con 0x7f6214103960 2026-03-10T06:20:03.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f61fc022470 con 0x7f6214103960 2026-03-10T06:20:03.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61fc00f650 con 0x7f6214103960 2026-03-10T06:20:03.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f621419d710 con 0x7f6214103960 2026-03-10T06:20:03.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.015+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f621419dc00 con 0x7f6214103960 2026-03-10T06:20:03.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.017+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6214066e40 con 0x7f6214103960 2026-03-10T06:20:03.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.017+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 4 ==== 
mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f61fc0225e0 con 0x7f6214103960 2026-03-10T06:20:03.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.020+0000 7f620b7fe700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 0x7f61f406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.020+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f61fc08cfa0 con 0x7f6214103960 2026-03-10T06:20:03.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.020+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f61fc0915c0 con 0x7f6214103960 2026-03-10T06:20:03.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.022+0000 7f621259c700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 0x7f61f406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.022+0000 7f621259c700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 0x7f61f406eb80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f62141037c0 tx=0x7f620400b410 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:03.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.128+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f621419dee0 con 0x7f6214103960 2026-03-10T06:20:03.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.129+0000 7f620b7fe700 1 -- 192.168.123.104:0/3205960403 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v15) v1 ==== 155+0+0 (secure 0 0 0) 0x7f61fc05b230 con 0x7f6214103960 2026-03-10T06:20:03.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.131+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 msgr2=0x7f61f406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.131+0000 7f6218d47700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 0x7f61f406eb80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f62141037c0 tx=0x7f620400b410 comp rx=0 tx=0).stop 2026-03-10T06:20:03.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 msgr2=0x7f6214198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214198560 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f61fc00b5c0 tx=0x7f61fc004990 comp rx=0 tx=0).stop 2026-03-10T06:20:03.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 shutdown_connections 2026-03-10T06:20:03.133 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f61f406c6d0 0x7f61f406eb80 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6214102760 0x7f6214198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 --2- 192.168.123.104:0/3205960403 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6214103960 0x7f6214198560 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 >> 192.168.123.104:0/3205960403 conn(0x7f62140fdcf0 msgr2=0x7f6214106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:03.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 shutdown_connections 2026-03-10T06:20:03.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.132+0000 7f6218d47700 1 -- 192.168.123.104:0/3205960403 wait complete. 
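Both `ceph config set` invocations in this task go through the same `cephadm shell` wrapper pattern: `sudo cephadm --image <img> shell -c <conf> -k <keyring> --fsid <fsid> -- bash -c '<cmd>'`. A sketch that only builds that argv (an illustrative helper, not teuthology's cephadm task; the `-e sha1=...` environment flag from the log is omitted for brevity) is:

```python
# Build (but do not run) the `cephadm shell` argv pattern used for the
# `ceph config set` calls above. Helper name and defaults are
# illustrative; the default paths match this run's log.
def cephadm_shell_cmd(image, fsid, inner_cmd,
                      cephadm="/home/ubuntu/cephtest/cephadm",
                      conf="/etc/ceph/ceph.conf",
                      keyring="/etc/ceph/ceph.client.admin.keyring"):
    return ["sudo", cephadm, "--image", image, "shell",
            "-c", conf, "-k", keyring, "--fsid", fsid,
            "--", "bash", "-c", inner_cmd]
```

For the second invocation in the log this would yield the argv for `ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force` against fsid `9c59102a-1c48-11f1-b618-035af535377d` with image `quay.io/ceph/ceph:v18.2.0`.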
2026-03-10T06:20:03.181 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T06:20:03.375 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:20:03.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.650+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/270521078 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 msgr2=0x7f5e18101b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.650+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/270521078 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18101b90 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f5e08009b00 tx=0x7f5e08009e10 comp rx=0 tx=0).stop 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/270521078 shutdown_connections 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/270521078 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18101b90 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/270521078 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18100950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.654 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/270521078 >> 192.168.123.104:0/270521078 conn(0x7f5e180fbaf0 msgr2=0x7f5e180fdf20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/270521078 shutdown_connections 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.652+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/270521078 wait complete. 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 Processor -- start 2026-03-10T06:20:03.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 -- start start 2026-03-10T06:20:03.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:51034/0 (socket says 192.168.123.104:51034) 2026-03-10T06:20:03.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e18198b60 con 0x7f5e18100540 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1cc4b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e18198ca0 con 0x7f5e18101740 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.653+0000 7f5e1659c700 1 -- 192.168.123.104:0/2224625025 learned_addr learned my addr 192.168.123.104:0/2224625025 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.654+0000 7f5e15d9b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.654+0000 7f5e1659c700 1 -- 192.168.123.104:0/2224625025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 msgr2=0x7f5e18198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.654+0000 7f5e1659c700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18198540 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.654+0000 7f5e1659c700 1 -- 
192.168.123.104:0/2224625025 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5e00009710 con 0x7f5e18100540 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.654+0000 7f5e1659c700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f5e0000eeb0 tx=0x7f5e0000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.655+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e0000cd70 con 0x7f5e18100540 2026-03-10T06:20:03.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.655+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5e080097e0 con 0x7f5e18100540 2026-03-10T06:20:03.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.655+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5e1819db10 con 0x7f5e18100540 2026-03-10T06:20:03.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.655+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5e00010910 con 0x7f5e18100540 2026-03-10T06:20:03.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.655+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e00018a10 con 0x7f5e18100540 2026-03-10T06:20:03.658 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.656+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5e00018cb0 con 0x7f5e18100540 2026-03-10T06:20:03.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.657+0000 7f5e0f7fe700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 0x7f5e0406ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:03.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.657+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5e00014070 con 0x7f5e18100540 2026-03-10T06:20:03.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.657+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5df8005320 con 0x7f5e18100540 2026-03-10T06:20:03.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.660+0000 7f5e15d9b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 0x7f5e0406ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:03.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.661+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5e0005b270 con 0x7f5e18100540 2026-03-10T06:20:03.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.662+0000 7f5e15d9b700 1 --2- 192.168.123.104:0/2224625025 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 0x7f5e0406ec00 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5e08005230 tx=0x7f5e0801a040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:03.770 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.768+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f5df8005190 con 0x7f5e18100540 2026-03-10T06:20:03.770 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.768+0000 7f5e0f7fe700 1 -- 192.168.123.104:0/2224625025 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v15) v1 ==== 163+0+0 (secure 0 0 0) 0x7f5e0005ae00 con 0x7f5e18100540 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.773+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 msgr2=0x7f5e0406ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 0x7f5e0406ec00 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5e08005230 tx=0x7f5e0801a040 comp rx=0 tx=0).stop 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 msgr2=0x7f5e18198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 
1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f5e0000eeb0 tx=0x7f5e0000c5b0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 shutdown_connections 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f5e0406c750 0x7f5e0406ec00 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e18100540 0x7f5e18198000 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 --2- 192.168.123.104:0/2224625025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e18101740 0x7f5e18198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 >> 192.168.123.104:0/2224625025 conn(0x7f5e180fbaf0 msgr2=0x7f5e18102960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 shutdown_connections 2026-03-10T06:20:03.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:03.774+0000 7f5e1cc4b700 1 -- 192.168.123.104:0/2224625025 wait complete. 
2026-03-10T06:20:03.821 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T06:20:03.982 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:20:04.271 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.268+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/985482529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4102760 msgr2=0x7fc6a4102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:04.271 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.268+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/985482529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4102760 0x7fc6a4102b70 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fc694009b00 tx=0x7fc694009e10 comp rx=0 tx=0).stop 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/985482529 shutdown_connections 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/985482529 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4103960 0x7fc6a4103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/985482529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4102760 0x7fc6a4102b70 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.272 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/985482529 >> 192.168.123.104:0/985482529 conn(0x7fc6a40fdcf0 msgr2=0x7fc6a4100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/985482529 shutdown_connections 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.270+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/985482529 wait complete. 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 Processor -- start 2026-03-10T06:20:04.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 -- start start 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 0x7fc6a4198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 0x7fc6a4198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6a4198b80 con 0x7fc6a4103960 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6ac1d2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6a4198cc0 con 0x7fc6a4102760 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6a976d700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 0x7fc6a4198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6a9f6e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 0x7fc6a4198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6a9f6e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 0x7fc6a4198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:60462/0 (socket says 192.168.123.104:60462) 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.271+0000 7fc6a9f6e700 1 -- 192.168.123.104:0/2762883644 learned_addr learned my addr 192.168.123.104:0/2762883644 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6a976d700 1 -- 192.168.123.104:0/2762883644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 msgr2=0x7fc6a4198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6a976d700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 0x7fc6a4198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6a976d700 1 -- 192.168.123.104:0/2762883644 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6940097e0 con 0x7fc6a4103960 2026-03-10T06:20:04.274 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6a976d700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 0x7fc6a4198560 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7fc6a000eab0 tx=0x7fc6a000edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:04.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc6a000cb20 con 0x7fc6a4103960 2026-03-10T06:20:04.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc6a000cc80 con 0x7fc6a4103960 2026-03-10T06:20:04.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc6a0018860 con 0x7fc6a4103960 2026-03-10T06:20:04.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6a419d770 con 0x7fc6a4103960 2026-03-10T06:20:04.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.272+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6a40754f0 con 0x7fc6a4103960 2026-03-10T06:20:04.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.274+0000 7fc6ac1d2700 1 -- 
192.168.123.104:0/2762883644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc6a4066e40 con 0x7fc6a4103960 2026-03-10T06:20:04.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.275+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc6a00189c0 con 0x7fc6a4103960 2026-03-10T06:20:04.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.278+0000 7fc69affd700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 0x7fc69006ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.278+0000 7fc6a9f6e700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 0x7fc69006ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:04.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.279+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fc6a0014070 con 0x7fc6a4103960 2026-03-10T06:20:04.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.279+0000 7fc6a9f6e700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 0x7fc69006ec00 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fc694006010 tx=0x7fc69401a040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:04.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.279+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc6a005a270 con 0x7fc6a4103960 2026-03-10T06:20:04.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.387+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7fc6a40757d0 con 0x7fc6a4103960 2026-03-10T06:20:04.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.387+0000 7fc69affd700 1 -- 192.168.123.104:0/2762883644 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v15) v1 ==== 135+0+0 (secure 0 0 0) 0x7fc6a0059e00 con 0x7fc6a4103960 2026-03-10T06:20:04.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 msgr2=0x7fc69006ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:04.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 0x7fc69006ec00 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fc694006010 tx=0x7fc69401a040 comp rx=0 tx=0).stop 2026-03-10T06:20:04.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 msgr2=0x7fc6a4198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 0x7fc6a4198560 secure :-1 s=READY pgs=289 cs=0 
l=1 rev1=1 crypto rx=0x7fc6a000eab0 tx=0x7fc6a000edc0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 shutdown_connections 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fc69006c750 0x7fc69006ec00 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6a4102760 0x7fc6a4198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 --2- 192.168.123.104:0/2762883644 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc6a4103960 0x7fc6a4198560 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 >> 192.168.123.104:0/2762883644 conn(0x7fc6a40fdcf0 msgr2=0x7fc6a4106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 shutdown_connections 2026-03-10T06:20:04.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.389+0000 7fc6ac1d2700 1 -- 192.168.123.104:0/2762883644 wait complete. 
2026-03-10T06:20:04.469 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr' 2026-03-10T06:20:04.648 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 -- 192.168.123.104:0/1360218791 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590102760 msgr2=0x7f8590102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 --2- 192.168.123.104:0/1360218791 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590102760 0x7f8590102b70 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f8578009b50 tx=0x7f8578009e60 comp rx=0 tx=0).stop 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 -- 192.168.123.104:0/1360218791 shutdown_connections 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 --2- 192.168.123.104:0/1360218791 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8590103960 0x7f8590103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 --2- 192.168.123.104:0/1360218791 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590102760 0x7f8590102b70 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.949+0000 7f8596269700 1 -- 192.168.123.104:0/1360218791 >> 192.168.123.104:0/1360218791 conn(0x7f85900fdcf0 msgr2=0x7f8590100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:20:04.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.950+0000 7f8596269700 1 -- 192.168.123.104:0/1360218791 shutdown_connections 2026-03-10T06:20:04.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.950+0000 7f8596269700 1 -- 192.168.123.104:0/1360218791 wait complete. 2026-03-10T06:20:04.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.950+0000 7f8596269700 1 Processor -- start 2026-03-10T06:20:04.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.950+0000 7f8596269700 1 -- start start 2026-03-10T06:20:04.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f8596269700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8590103960 0x7f8590198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f8596269700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f858f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f858f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:51058/0 (socket says 192.168.123.104:51058) 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f858f7fe700 1 -- 192.168.123.104:0/538629821 learned_addr learned my addr 192.168.123.104:0/538629821 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f8596269700 1 -- 192.168.123.104:0/538629821 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8590198d80 con 0x7f8590198880 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.951+0000 7f8596269700 1 -- 192.168.123.104:0/538629821 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8590198ef0 con 0x7f8590103960 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858f7fe700 1 -- 192.168.123.104:0/538629821 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8590103960 msgr2=0x7f8590198340 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858f7fe700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8590103960 0x7f8590198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:20:04.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858f7fe700 1 -- 192.168.123.104:0/538629821 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85780097e0 con 0x7f8590198880 2026-03-10T06:20:04.954 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858f7fe700 1 --2- 192.168.123.104:0/538629821 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7f858000ed70 tx=0x7f858000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:04.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f858000cd70 con 0x7f8590198880 2026-03-10T06:20:04.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8580010910 con 0x7f8590198880 2026-03-10T06:20:04.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8580018980 con 0x7f8590198880 2026-03-10T06:20:04.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f8596269700 1 -- 192.168.123.104:0/538629821 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f859019de90 con 0x7f8590198880 2026-03-10T06:20:04.955 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.952+0000 7f8596269700 1 -- 192.168.123.104:0/538629821 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f859019e3b0 con 0x7f8590198880 2026-03-10T06:20:04.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.954+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8580010a80 con 0x7f8590198880 2026-03-10T06:20:04.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.954+0000 7f8596269700 1 -- 
192.168.123.104:0/538629821 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8590066e40 con 0x7f8590198880 2026-03-10T06:20:04.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.954+0000 7f858d7fa700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 0x7f857c06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:20:04.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.954+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f8580014070 con 0x7f8590198880 2026-03-10T06:20:04.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.957+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f858005a580 con 0x7f8590198880 2026-03-10T06:20:04.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.957+0000 7f858ffff700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 0x7f857c06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:20:04.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:04.957+0000 7f858ffff700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 0x7f857c06eae0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f857800b5c0 tx=0x7f8578005fd0 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:20:05.090 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:20:05.088+0000 7f8596269700 1 -- 
192.168.123.104:0/538629821 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f85901082b0 con 0x7f857c06c630 2026-03-10T06:20:05.129 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:05 vm04.local ceph-mon[51058]: pgmap v86: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T06:20:05.129 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:05 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:20:05.129 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:05 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:05 vm06.local ceph-mon[58974]: pgmap v86: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T06:20:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:05 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:20:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:05 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:06 vm06.local ceph-mon[58974]: from='client.14572 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:20:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:20:06 vm04.local ceph-mon[51058]: from='client.14572 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:20:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:07 vm04.local ceph-mon[51058]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 1.7 KiB/s wr, 6 op/s 2026-03-10T06:20:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:07 vm06.local ceph-mon[58974]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 1.7 KiB/s wr, 6 op/s 2026-03-10T06:20:09.643 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:09 vm04.local ceph-mon[51058]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-10T06:20:09.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:09 vm06.local ceph-mon[58974]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-10T06:20:11.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:11 vm04.local ceph-mon[51058]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 3 op/s 2026-03-10T06:20:11.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:11 vm06.local ceph-mon[58974]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 3 op/s 2026-03-10T06:20:12.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:12 vm06.local ceph-mon[58974]: pgmap v90: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-10T06:20:12.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:12 vm04.local ceph-mon[51058]: 
pgmap v90: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-10T06:20:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:15 vm06.local ceph-mon[58974]: pgmap v91: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-10T06:20:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:15 vm04.local ceph-mon[51058]: pgmap v91: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-10T06:20:17.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:17 vm06.local ceph-mon[58974]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 3 op/s 2026-03-10T06:20:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:17 vm04.local ceph-mon[51058]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 3 op/s 2026-03-10T06:20:18.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:18 vm06.local ceph-mon[58974]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-10T06:20:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:18 vm04.local ceph-mon[51058]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-10T06:20:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:19 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:19 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T06:20:21.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:21 vm04.local ceph-mon[51058]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 597 B/s wr, 3 op/s 2026-03-10T06:20:21.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:21 vm06.local ceph-mon[58974]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 597 B/s wr, 3 op/s 2026-03-10T06:20:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:23 vm04.local ceph-mon[51058]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:23.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:23 vm06.local ceph-mon[58974]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:25.569 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:25 vm04.local ceph-mon[51058]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:25 vm06.local ceph-mon[58974]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:26.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:26 vm06.local ceph-mon[58974]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:26.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:26 vm04.local ceph-mon[51058]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-10T06:20:29.156 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:29 vm04.local ceph-mon[51058]: pgmap v98: 65 pgs: 65 
active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:29 vm06.local ceph-mon[58974]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:31.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:31 vm06.local ceph-mon[58974]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:31.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:31 vm04.local ceph-mon[51058]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:33.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:32 vm04.local ceph-mon[51058]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:33.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:32 vm06.local ceph-mon[58974]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:34.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:34 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:34 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:35.179 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:35 vm06.local ceph-mon[58974]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:35 vm04.local ceph-mon[51058]: pgmap v101: 65 
pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:36.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:36 vm06.local ceph-mon[58974]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:36.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:36 vm04.local ceph-mon[51058]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:39 vm06.local ceph-mon[58974]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:39 vm04.local ceph-mon[51058]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:41.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:41 vm06.local ceph-mon[58974]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:41.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:41 vm04.local ceph-mon[51058]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:43.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:43 vm04.local ceph-mon[51058]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:43 vm06.local ceph-mon[58974]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:45 vm04.local ceph-mon[51058]: pgmap v106: 65 pgs: 65 
active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:45 vm06.local ceph-mon[58974]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:46.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:46 vm06.local ceph-mon[58974]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:46 vm04.local ceph-mon[51058]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:48 vm06.local ceph-mon[58974]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:48 vm04.local ceph-mon[51058]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:20:50.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:50 vm04.local ceph-mon[51058]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:50 vm06.local ceph-mon[58974]: pgmap 
v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:53 vm06.local ceph-mon[58974]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:53.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:53 vm04.local ceph-mon[51058]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:55.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:55 vm04.local ceph-mon[51058]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:55 vm06.local ceph-mon[58974]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:56.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:56 vm04.local ceph-mon[51058]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:56 vm06.local ceph-mon[58974]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:20:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:57 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:20:58.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:57 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:20:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:20:59 vm06.local ceph-mon[58974]: 
pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:20:59.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:20:59 vm04.local ceph-mon[51058]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.116+0000 7f858d7fa700 1 -- 192.168.123.104:0/538629821 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f85901082b0 con 0x7f857c06c630 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 msgr2=0x7f857c06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 0x7f857c06eae0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f857800b5c0 tx=0x7f8578005fd0 comp rx=0 tx=0).stop 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 msgr2=0x7f859019d8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto 
rx=0x7f858000ed70 tx=0x7f858000c5b0 comp rx=0 tx=0).stop 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 shutdown_connections 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f857c06c630 0x7f857c06eae0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8590103960 0x7f8590198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 --2- 192.168.123.104:0/538629821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8590198880 0x7f859019d8f0 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.121+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 >> 192.168.123.104:0/538629821 conn(0x7f85900fdcf0 msgr2=0x7f8590106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:00.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.123+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 shutdown_connections 2026-03-10T06:21:00.127 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:00.123+0000 7f8586ffd700 1 -- 192.168.123.104:0/538629821 wait complete. 
2026-03-10T06:21:00.237 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done' 2026-03-10T06:21:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T06:21:00.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:00 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:21:00.371 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:00 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:00.664 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config 
/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.215+0000 7f26d40ed700 1 -- 192.168.123.104:0/2777139743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 msgr2=0x7f26cc1033c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.215+0000 7f26d40ed700 1 --2- 192.168.123.104:0/2777139743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc1033c0 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f26c8009b00 tx=0x7f26c8009e10 comp rx=0 tx=0).stop 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 -- 192.168.123.104:0/2777139743 shutdown_connections 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 --2- 192.168.123.104:0/2777139743 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 0x7f26cc105ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 --2- 192.168.123.104:0/2777139743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc1033c0 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 -- 192.168.123.104:0/2777139743 >> 192.168.123.104:0/2777139743 conn(0x7f26cc0fa9e0 msgr2=0x7f26cc0fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 -- 192.168.123.104:0/2777139743 shutdown_connections 2026-03-10T06:21:01.221 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 -- 192.168.123.104:0/2777139743 wait complete. 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 Processor -- start 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.217+0000 7f26d40ed700 1 -- start start 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d40ed700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d40ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 0x7f26cc198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d40ed700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26cc198b50 con 0x7f26cc100fe0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d40ed700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26cc198c90 con 0x7f26cc103900 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:47566/0 (socket says 192.168.123.104:47566) 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 -- 192.168.123.104:0/2197586185 learned_addr learned my addr 192.168.123.104:0/2197586185 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1688700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 0x7f26cc198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 -- 192.168.123.104:0/2197586185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 msgr2=0x7f26cc198530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 0x7f26cc198530 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 -- 192.168.123.104:0/2197586185 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26c80097e0 con 0x7f26cc100fe0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.218+0000 7f26d1e89700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto 
rx=0x7f26c8005850 tx=0x7f26c8004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.219+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26c801d070 con 0x7f26cc100fe0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.219+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f26c8022470 con 0x7f26cc100fe0 2026-03-10T06:21:01.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.219+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26c800f670 con 0x7f26cc100fe0 2026-03-10T06:21:01.226 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.220+0000 7f26d40ed700 1 -- 192.168.123.104:0/2197586185 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26cc19d6e0 con 0x7f26cc100fe0 2026-03-10T06:21:01.226 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.220+0000 7f26d40ed700 1 -- 192.168.123.104:0/2197586185 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26cc19db50 con 0x7f26cc100fe0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.220+0000 7f26d40ed700 1 -- 192.168.123.104:0/2197586185 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f26cc0fc5c0 con 0x7f26cc100fe0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.221+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 
0x7f26c800baa0 con 0x7f26cc100fe0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.221+0000 7f26c2ffd700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 0x7f26b806e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.222+0000 7f26d1688700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 0x7f26b806e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.222+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f26c808ce50 con 0x7f26cc100fe0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.225+0000 7f26d1688700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 0x7f26b806e9d0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f26bc005950 tx=0x7f26bc00b410 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.227 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.225+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f26c805b0e0 con 0x7f26cc100fe0 2026-03-10T06:21:01.389 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:01 vm04.local ceph-mon[51058]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:21:01.389 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:01 vm04.local ceph-mon[51058]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:21:01.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.387+0000 7f26d40ed700 1 -- 192.168.123.104:0/2197586185 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f26cc061190 con 0x7f26b806c520 2026-03-10T06:21:01.395 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.391+0000 7f26c2ffd700 1 -- 192.168.123.104:0/2197586185 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+351 (secure 0 0 0) 0x7f26cc061190 con 0x7f26b806c520 2026-03-10T06:21:01.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 msgr2=0x7f26b806e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 0x7f26b806e9d0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f26bc005950 tx=0x7f26bc00b410 comp rx=0 tx=0).stop 2026-03-10T06:21:01.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 msgr2=0x7f26cc197ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 secure :-1 s=READY pgs=293 cs=0 l=1 
rev1=1 crypto rx=0x7f26c8005850 tx=0x7f26c8004990 comp rx=0 tx=0).stop 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 shutdown_connections 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f26b806c520 0x7f26b806e9d0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f26cc100fe0 0x7f26cc197ff0 unknown :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 --2- 192.168.123.104:0/2197586185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f26cc103900 0x7f26cc198530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 >> 192.168.123.104:0/2197586185 conn(0x7f26cc0fa9e0 msgr2=0x7f26cc0ff660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 shutdown_connections 2026-03-10T06:21:01.399 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.396+0000 7f26c0ff9700 1 -- 192.168.123.104:0/2197586185 wait complete. 
2026-03-10T06:21:01.411 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:21:01.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.523+0000 7f35c9e0a700 1 -- 192.168.123.104:0/2627513510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 msgr2=0x7f35c410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.525 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.523+0000 7f35c9e0a700 1 --2- 192.168.123.104:0/2627513510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c410be90 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7f35b0009b00 tx=0x7f35b0009e10 comp rx=0 tx=0).stop 2026-03-10T06:21:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.525+0000 7f35c9e0a700 1 -- 192.168.123.104:0/2627513510 shutdown_connections 2026-03-10T06:21:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.525+0000 7f35c9e0a700 1 --2- 192.168.123.104:0/2627513510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c410be90 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.525+0000 7f35c9e0a700 1 --2- 192.168.123.104:0/2627513510 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 0x7f35c4071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.525+0000 7f35c9e0a700 1 -- 192.168.123.104:0/2627513510 >> 192.168.123.104:0/2627513510 conn(0x7f35c406d1a0 msgr2=0x7f35c406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.526+0000 7f35c9e0a700 1 -- 192.168.123.104:0/2627513510 shutdown_connections 2026-03-10T06:21:01.530 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.526+0000 7f35c9e0a700 1 -- 192.168.123.104:0/2627513510 wait complete. 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 Processor -- start 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 -- start start 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 0x7f35c41af9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35c41b24e0 con 0x7f35c4072440 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c9e0a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35c41b2620 con 0x7f35c4071a60 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c8e08700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 0x7f35c41af9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:47590/0 (socket says 192.168.123.104:47590) 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.527+0000 7f35c3fff700 1 -- 192.168.123.104:0/1694898560 learned_addr learned my addr 192.168.123.104:0/1694898560 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c3fff700 1 -- 192.168.123.104:0/1694898560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 msgr2=0x7f35c41af9c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c3fff700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 0x7f35c41af9c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c3fff700 1 -- 192.168.123.104:0/1694898560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35b00097e0 con 0x7f35c4072440 2026-03-10T06:21:01.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c3fff700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7f35b000bb20 
tx=0x7f35b000bb50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35b001d070 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f35b0022470 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35b000f650 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c9e0a700 1 -- 192.168.123.104:0/1694898560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f35c41b2870 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.528+0000 7f35c9e0a700 1 -- 192.168.123.104:0/1694898560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f35c41b2d60 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.529+0000 7f35c9e0a700 1 -- 192.168.123.104:0/1694898560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f35a8005320 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.530+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f35b0004bc0 con 
0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.531+0000 7f35c1ffb700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 0x7f35b406eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.531+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f35b008cc60 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.533+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f35b005af20 con 0x7f35c4072440 2026-03-10T06:21:01.536 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.533+0000 7f35c8e08700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 0x7f35b406eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.536+0000 7f35c8e08700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 0x7f35b406eb40 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f35c4072f50 tx=0x7f35b8009450 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:01 vm06.local ceph-mon[58974]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:21:01.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:01 vm06.local ceph-mon[58974]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:21:01.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.721+0000 7f35c9e0a700 1 -- 192.168.123.104:0/1694898560 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f35a8000bf0 con 0x7f35b406c690 2026-03-10T06:21:01.725 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.722+0000 7f35c1ffb700 1 -- 192.168.123.104:0/1694898560 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+351 (secure 0 0 0) 0x7f35a8000bf0 con 0x7f35b406c690 2026-03-10T06:21:01.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.729+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 msgr2=0x7f35b406eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.729+0000 7f35af7fe700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 0x7f35b406eb40 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f35c4072f50 tx=0x7f35b8009450 comp rx=0 tx=0).stop 2026-03-10T06:21:01.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.729+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 msgr2=0x7f35c41b1f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.729+0000 7f35af7fe700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 secure :-1 s=READY pgs=295 cs=0 l=1 
rev1=1 crypto rx=0x7f35b000bb20 tx=0x7f35b000bb50 comp rx=0 tx=0).stop 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.731+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 shutdown_connections 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.731+0000 7f35af7fe700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f35b406c690 0x7f35b406eb40 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.731+0000 7f35af7fe700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35c4071a60 0x7f35c41af9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.731+0000 7f35af7fe700 1 --2- 192.168.123.104:0/1694898560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f35c4072440 0x7f35c41b1f10 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.731+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 >> 192.168.123.104:0/1694898560 conn(0x7f35c406d1a0 msgr2=0x7f35c410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.732+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 shutdown_connections 2026-03-10T06:21:01.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.732+0000 7f35af7fe700 1 -- 192.168.123.104:0/1694898560 wait complete. 
2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.891+0000 7f6e315fb700 1 -- 192.168.123.104:0/3933132843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c103980 msgr2=0x7f6e2c103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.891+0000 7f6e315fb700 1 --2- 192.168.123.104:0/3933132843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c103980 0x7f6e2c103dd0 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f6e1c009b00 tx=0x7f6e1c009e10 comp rx=0 tx=0).stop 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 -- 192.168.123.104:0/3933132843 shutdown_connections 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 --2- 192.168.123.104:0/3933132843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c103980 0x7f6e2c103dd0 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 --2- 192.168.123.104:0/3933132843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c102780 0x7f6e2c102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 -- 192.168.123.104:0/3933132843 >> 192.168.123.104:0/3933132843 conn(0x7f6e2c0fdd10 msgr2=0x7f6e2c100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 -- 192.168.123.104:0/3933132843 shutdown_connections 2026-03-10T06:21:01.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.893+0000 7f6e315fb700 1 -- 192.168.123.104:0/3933132843 
wait complete. 2026-03-10T06:21:01.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.894+0000 7f6e315fb700 1 Processor -- start 2026-03-10T06:21:01.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.894+0000 7f6e315fb700 1 -- start start 2026-03-10T06:21:01.897 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.895+0000 7f6e315fb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 0x7f6e2c1981e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e315fb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 0x7f6e2c19d790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e315fb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e2c198c20 con 0x7f6e2c198720 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e315fb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e2c198d90 con 0x7f6e2c103980 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2a7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 0x7f6e2c19d790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 0x7f6e2c1981e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2a7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 0x7f6e2c19d790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:47608/0 (socket says 192.168.123.104:47608) 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2a7fc700 1 -- 192.168.123.104:0/2155253625 learned_addr learned my addr 192.168.123.104:0/2155253625 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2affd700 1 -- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 msgr2=0x7f6e2c19d790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2affd700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 0x7f6e2c19d790 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2affd700 1 -- 192.168.123.104:0/2155253625 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e1c0097e0 con 0x7f6e2c103980 2026-03-10T06:21:01.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.896+0000 7f6e2affd700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 0x7f6e2c1981e0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f6e2c1022d0 tx=0x7f6e2000b960 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.901 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.897+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e200117e0 con 0x7f6e2c103980 2026-03-10T06:21:01.902 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.897+0000 7f6e315fb700 1 -- 192.168.123.104:0/2155253625 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e2c19dd30 con 0x7f6e2c103980 2026-03-10T06:21:01.902 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.897+0000 7f6e315fb700 1 -- 192.168.123.104:0/2155253625 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e2c19e1f0 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.899+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6e20011e20 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.899+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e20010e80 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.901+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6e20011940 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.901+0000 7f6e13fff700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 0x7f6e1406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.901+0000 
7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f6e2008b0f0 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.901+0000 7f6e315fb700 1 -- 192.168.123.104:0/2155253625 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e2c066e40 con 0x7f6e2c103980 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.902+0000 7f6e2a7fc700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 0x7f6e1406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:01.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.902+0000 7f6e2a7fc700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 0x7f6e1406ec50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f6e1c009fd0 tx=0x7f6e1c004f60 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:01.912 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:01.906+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6e20059380 con 0x7f6e2c103980 2026-03-10T06:21:02.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.045+0000 7f6e315fb700 1 -- 192.168.123.104:0/2155253625 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6e2c1082d0 con 0x7f6e1406c7a0 2026-03-10T06:21:02.056 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED 
AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (2m) 64s ago 3m 22.6M - 0.25.0 c8568f914cd2 3d98d9c97afc 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (3m) 64s ago 3m 8002k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (2m) 65s ago 2m 8288k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (3m) 64s ago 3m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (2m) 65s ago 2m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (2m) 64s ago 3m 81.2M - 9.4.7 954c08fa6188 888c399470c8 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (71s) 64s ago 71s 17.0M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (69s) 64s ago 69s 14.1M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (68s) 65s ago 68s 14.0M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (71s) 65s ago 70s 14.9M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:9283,8765,8443 running (4m) 64s ago 4m 499M - 18.2.0 dc2bc1663786 90f53ab8e17a 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (2m) 65s ago 2m 445M - 18.2.0 dc2bc1663786 db76c25cd8f7 2026-03-10T06:21:02.057 
INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (4m) 64s ago 4m 49.2M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (2m) 65s ago 2m 44.7M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (3m) 64s ago 3m 12.3M - 1.5.0 0da6a335fe13 f563a35e96ab 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 65s ago 2m 15.0M - 1.5.0 0da6a335fe13 3304cc389738 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (2m) 64s ago 2m 46.2M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (2m) 64s ago 2m 47.0M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (2m) 64s ago 2m 45.9M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (117s) 65s ago 117s 44.5M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (107s) 65s ago 107s 43.4M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (98s) 65s ago 98s 45.0M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:21:02.057 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (2m) 64s ago 3m 38.7M - 2.43.0 a07b618ecd1d 5d3ae08adc2a 2026-03-10T06:21:02.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.053+0000 7f6e13fff700 1 -- 192.168.123.104:0/2155253625 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f6e2c1082d0 con 0x7f6e1406c7a0 2026-03-10T06:21:02.058 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.056+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 msgr2=0x7f6e1406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.056+0000 7f6e11ffb700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 0x7f6e1406ec50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f6e1c009fd0 tx=0x7f6e1c004f60 comp rx=0 tx=0).stop 2026-03-10T06:21:02.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.056+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 msgr2=0x7f6e2c1981e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.056+0000 7f6e11ffb700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 0x7f6e2c1981e0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f6e2c1022d0 tx=0x7f6e2000b960 comp rx=0 tx=0).stop 2026-03-10T06:21:02.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.057+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 shutdown_connections 2026-03-10T06:21:02.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.057+0000 7f6e11ffb700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f6e1406c7a0 0x7f6e1406ec50 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.057+0000 7f6e11ffb700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e2c103980 0x7f6e2c1981e0 unknown :-1 s=CLOSED 
pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.057+0000 7f6e11ffb700 1 --2- 192.168.123.104:0/2155253625 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6e2c198720 0x7f6e2c19d790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.057+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 >> 192.168.123.104:0/2155253625 conn(0x7f6e2c0fdd10 msgr2=0x7f6e2c106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:02.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.060+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 shutdown_connections 2026-03-10T06:21:02.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.061+0000 7f6e11ffb700 1 -- 192.168.123.104:0/2155253625 wait complete. 2026-03-10T06:21:02.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 -- 192.168.123.104:0/125535818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 msgr2=0x7f36541081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 --2- 192.168.123.104:0/125535818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f36541081c0 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7f3648009b00 tx=0x7f3648009e10 comp rx=0 tx=0).stop 2026-03-10T06:21:02.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 -- 192.168.123.104:0/125535818 shutdown_connections 2026-03-10T06:21:02.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 --2- 192.168.123.104:0/125535818 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f36541081c0 unknown :-1 
s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 --2- 192.168.123.104:0/125535818 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3654071db0 0x7f36540721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 -- 192.168.123.104:0/125535818 >> 192.168.123.104:0/125535818 conn(0x7f365406d3e0 msgr2=0x7f365406f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.164+0000 7f365ae34700 1 -- 192.168.123.104:0/125535818 shutdown_connections 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.167+0000 7f365ae34700 1 -- 192.168.123.104:0/125535818 wait complete. 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.168+0000 7f365ae34700 1 Processor -- start 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f365ae34700 1 -- start start 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f365ae34700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3654071db0 0x7f3654116a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f365ae34700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f365ae34700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f3654117570 con 0x7f3654107d50 2026-03-10T06:21:02.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f365ae34700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36541b29e0 con 0x7f3654071db0 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:47624/0 (socket says 192.168.123.104:47624) 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 -- 192.168.123.104:0/1397697958 learned_addr learned my addr 192.168.123.104:0/1397697958 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 -- 192.168.123.104:0/1397697958 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3654071db0 msgr2=0x7f3654116a10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3654071db0 0x7f3654116a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 
1 -- 192.168.123.104:0/1397697958 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f36480097e0 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3653fff700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f36480049c0 tx=0x7f3648004aa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.169+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f364801d070 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.170+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f364800bd10 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.170+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f364800f940 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.170+0000 7f365ae34700 1 -- 192.168.123.104:0/1397697958 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f36541b2b80 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.170+0000 7f365ae34700 1 -- 192.168.123.104:0/1397697958 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f36541b2fa0 con 0x7f3654107d50 2026-03-10T06:21:02.174 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.172+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f364800faa0 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.172+0000 7f3651ffb700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 0x7f363c06e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.172+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f364808d0d0 con 0x7f3654107d50 2026-03-10T06:21:02.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.172+0000 7f3658bd0700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 0x7f363c06e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:02.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.172+0000 7f365ae34700 1 -- 192.168.123.104:0/1397697958 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3654066e40 con 0x7f3654107d50 2026-03-10T06:21:02.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.176+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f36480577d0 con 0x7f3654107d50 2026-03-10T06:21:02.183 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.176+0000 7f3658bd0700 1 --2- 192.168.123.104:0/1397697958 >> 
[v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 0x7f363c06e930 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f36541ae950 tx=0x7f3644006d20 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:21:02.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.331+0000 7f365ae34700 1 -- 
192.168.123.104:0/1397697958 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f36541b33e0 con 0x7f3654107d50 2026-03-10T06:21:02.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.331+0000 7f3651ffb700 1 -- 192.168.123.104:0/1397697958 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f3648027070 con 0x7f3654107d50 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 msgr2=0x7f363c06e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 0x7f363c06e930 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f36541ae950 tx=0x7f3644006d20 comp rx=0 tx=0).stop 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 msgr2=0x7f3654116f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f36480049c0 tx=0x7f3648004aa0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 shutdown_connections 2026-03-10T06:21:02.336 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f363c06c480 0x7f363c06e930 secure :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f36541ae950 tx=0x7f3644006d20 comp rx=0 tx=0).stop 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3654071db0 0x7f3654116a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 --2- 192.168.123.104:0/1397697958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3654107d50 0x7f3654116f50 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 >> 192.168.123.104:0/1397697958 conn(0x7f365406d3e0 msgr2=0x7f365410af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 shutdown_connections 2026-03-10T06:21:02.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.334+0000 7f363b7fe700 1 -- 192.168.123.104:0/1397697958 wait complete. 
2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 -- 192.168.123.104:0/4040801115 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc072360 msgr2=0x7f42fc0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 --2- 192.168.123.104:0/4040801115 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc072360 0x7f42fc0770e0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f42f4009f20 tx=0x7f42f4009f50 comp rx=0 tx=0).stop 2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 -- 192.168.123.104:0/4040801115 shutdown_connections 2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 --2- 192.168.123.104:0/4040801115 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc072360 0x7f42fc0770e0 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 --2- 192.168.123.104:0/4040801115 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc071980 0x7f42fc071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.409+0000 7f430359a700 1 -- 192.168.123.104:0/4040801115 >> 192.168.123.104:0/4040801115 conn(0x7f42fc06d1a0 msgr2=0x7f42fc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 -- 192.168.123.104:0/4040801115 shutdown_connections 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 -- 192.168.123.104:0/4040801115 
wait complete. 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 Processor -- start 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 -- start start 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc082a10 0x7f42fc082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42fc1b2a90 con 0x7f42fc071980 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f430359a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42fc1b2bd0 con 0x7f42fc082a10 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.410+0000 7f4301336700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:47642/0 (socket says 192.168.123.104:47642) 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 -- 192.168.123.104:0/2344074355 learned_addr learned my addr 192.168.123.104:0/2344074355 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4300b35700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc082a10 0x7f42fc082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 -- 192.168.123.104:0/2344074355 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc082a10 msgr2=0x7f42fc082e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc082a10 0x7f42fc082e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 -- 192.168.123.104:0/2344074355 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42f400c9f0 con 0x7f42fc071980 2026-03-10T06:21:02.413 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.411+0000 7f4301336700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f42f8009fa0 tx=0x7f42f800ba50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T06:21:02.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.412+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42f8010b20 con 0x7f42fc071980 2026-03-10T06:21:02.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.412+0000 7f430359a700 1 -- 192.168.123.104:0/2344074355 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42fc1b2d10 con 0x7f42fc071980 2026-03-10T06:21:02.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.412+0000 7f430359a700 1 -- 192.168.123.104:0/2344074355 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42fc1b31c0 con 0x7f42fc071980 2026-03-10T06:21:02.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.412+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f42f8010c80 con 0x7f42fc071980 2026-03-10T06:21:02.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.412+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42f8016710 con 0x7f42fc071980 2026-03-10T06:21:02.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.413+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f42f8016950 con 0x7f42fc071980 2026-03-10T06:21:02.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.414+0000 7f42f27fc700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 0x7f42e806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:02.416 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.414+0000 7f4300b35700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 0x7f42e806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:02.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:02 vm04.local ceph-mon[51058]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:21:02.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:02 vm04.local ceph-mon[51058]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:02.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:02 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:02.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:02 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:02.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:02 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:02.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.414+0000 7f4300b35700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 0x7f42e806ec50 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f42f4009ef0 tx=0x7f42f4006040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:02.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.414+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 
5276+0+0 (secure 0 0 0) 0x7f42f808eb70 con 0x7f42fc071980 2026-03-10T06:21:02.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.417+0000 7f430359a700 1 -- 192.168.123.104:0/2344074355 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f42e0005320 con 0x7f42fc071980 2026-03-10T06:21:02.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.420+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f42f8052720 con 0x7f42fc071980 2026-03-10T06:21:02.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.537+0000 7f430359a700 1 -- 192.168.123.104:0/2344074355 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f42e0000bf0 con 0x7f42e806c7a0 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [], 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "0/2 daemons upgraded", 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm06", 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:21:02.540 
INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:21:02.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.538+0000 7f42f27fc700 1 -- 192.168.123.104:0/2344074355 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f42e0000bf0 con 0x7f42e806c7a0 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.539+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 msgr2=0x7f42e806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.539+0000 7f42e7fff700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 0x7f42e806ec50 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f42f4009ef0 tx=0x7f42f4006040 comp rx=0 tx=0).stop 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 msgr2=0x7f42fc0824d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f42f8009fa0 tx=0x7f42f800ba50 comp rx=0 tx=0).stop 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 shutdown_connections 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f42e806c7a0 
0x7f42e806ec50 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071980 0x7f42fc0824d0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 --2- 192.168.123.104:0/2344074355 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f42fc082a10 0x7f42fc082e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:02.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 >> 192.168.123.104:0/2344074355 conn(0x7f42fc06d1a0 msgr2=0x7f42fc076390 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:02.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 shutdown_connections 2026-03-10T06:21:02.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:02.540+0000 7f42e7fff700 1 -- 192.168.123.104:0/2344074355 wait complete. 
2026-03-10T06:21:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:02 vm06.local ceph-mon[58974]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:21:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:02 vm06.local ceph-mon[58974]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:02 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:02 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:02 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: Upgrade: Need to upgrade myself (mgr.vm04.exdvdb) 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 
vm06.local ceph-mon[58974]: from='client.24355 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm06 2026-03-10T06:21:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:03 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/1397697958' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: Upgrade: Need to upgrade myself (mgr.vm04.exdvdb) 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: from='client.24355 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:03.678 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm06 2026-03-10T06:21:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:03 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/1397697958' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.691 INFO:tasks.workunit.client.1.vm06.stderr: git switch -c 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr:Or undo this operation with: 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr: git switch - 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr: 2026-03-10T06:21:03.692 INFO:tasks.workunit.client.1.vm06.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T06:21:03.697 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-10T06:21:03.754 INFO:tasks.workunit.client.1.vm06.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T06:21:03.756 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T06:21:03.756 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T06:21:03.798 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T06:21:03.834 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T06:21:03.864 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T06:21:03.865 
INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T06:21:03.865 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T06:21:03.900 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T06:21:03.904 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-10T06:21:03.904 DEBUG:teuthology.orchestra.run.vm06:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-10T06:21:03.962 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-10T06:21:03.964 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-10T06:21:03.964 DEBUG:teuthology.orchestra.run.vm06:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-10T06:21:04.028 INFO:tasks.workunit.client.1.vm06.stderr:+ mkdir -p fsstress 2026-03-10T06:21:04.030 INFO:tasks.workunit.client.1.vm06.stderr:+ pushd fsstress 2026-03-10T06:21:04.031 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T06:21:04.031 INFO:tasks.workunit.client.1.vm06.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T06:21:05.060 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:04 vm04.local ceph-mon[51058]: 
from='client.14592 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:05.060 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:04 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:21:05.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:04 vm06.local ceph-mon[58974]: from='client.14592 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:21:05.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:04 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:21:06.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:05 vm06.local ceph-mon[58974]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:06.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:05 vm04.local ceph-mon[51058]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:07.024 INFO:tasks.workunit.client.1.vm06.stderr:+ tar xzf ltp-full.tgz 2026-03-10T06:21:07.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:06 vm06.local ceph-mon[58974]: pgmap v117: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:21:07.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:06 vm04.local ceph-mon[51058]: pgmap v117: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:21:09.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:09 vm06.local ceph-mon[58974]: pgmap v118: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 
B/s rd, 1 op/s 2026-03-10T06:21:09.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:09 vm04.local ceph-mon[51058]: pgmap v118: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:21:11.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:11 vm06.local ceph-mon[58974]: pgmap v119: 65 pgs: 65 active+clean; 14 MiB data, 201 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.1 MiB/s wr, 36 op/s 2026-03-10T06:21:11.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:11 vm04.local ceph-mon[51058]: pgmap v119: 65 pgs: 65 active+clean; 14 MiB data, 201 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.1 MiB/s wr, 36 op/s 2026-03-10T06:21:13.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:12 vm06.local ceph-mon[58974]: pgmap v120: 65 pgs: 65 active+clean; 14 MiB data, 201 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.1 MiB/s wr, 35 op/s 2026-03-10T06:21:13.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:12 vm04.local ceph-mon[51058]: pgmap v120: 65 pgs: 65 active+clean; 14 MiB data, 201 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.1 MiB/s wr, 35 op/s 2026-03-10T06:21:15.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:15 vm04.local ceph-mon[51058]: pgmap v121: 65 pgs: 65 active+clean; 14 MiB data, 207 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.2 MiB/s wr, 41 op/s 2026-03-10T06:21:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:15 vm06.local ceph-mon[58974]: pgmap v121: 65 pgs: 65 active+clean; 14 MiB data, 207 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.2 MiB/s wr, 41 op/s 2026-03-10T06:21:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:17 vm06.local ceph-mon[58974]: pgmap v122: 65 pgs: 65 active+clean; 26 MiB data, 233 MiB used, 120 GiB / 120 GiB avail; 683 KiB/s rd, 2.1 MiB/s wr, 87 op/s 2026-03-10T06:21:17.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:17 vm04.local ceph-mon[51058]: pgmap v122: 65 
pgs: 65 active+clean; 26 MiB data, 233 MiB used, 120 GiB / 120 GiB avail; 683 KiB/s rd, 2.1 MiB/s wr, 87 op/s 2026-03-10T06:21:18.109 INFO:tasks.workunit.client.0.vm04.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T06:21:18.109 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.109 INFO:tasks.workunit.client.0.vm04.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T06:21:18.109 INFO:tasks.workunit.client.0.vm04.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: git switch -c 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:Or undo this operation with: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: git switch - 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr: 2026-03-10T06:21:18.110 INFO:tasks.workunit.client.0.vm04.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T06:21:18.115 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-10T06:21:18.138 INFO:tasks.workunit.client.0.vm04.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T06:21:18.140 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T06:21:18.140 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T06:21:18.221 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T06:21:18.258 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T06:21:18.290 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T06:21:18.291 
INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T06:21:18.302 INFO:tasks.workunit.client.0.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T06:21:18.319 INFO:tasks.workunit.client.0.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T06:21:18.327 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:21:18.327 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-10T06:21:18.387 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-10T06:21:18.389 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-10T06:21:18.389 DEBUG:teuthology.orchestra.run.vm04:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-10T06:21:18.453 INFO:tasks.workunit.client.0.vm04.stderr:+ mkdir -p fsstress 2026-03-10T06:21:18.455 INFO:tasks.workunit.client.0.vm04.stderr:+ pushd fsstress 2026-03-10T06:21:18.456 INFO:tasks.workunit.client.0.vm04.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T06:21:18.456 INFO:tasks.workunit.client.0.vm04.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T06:21:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:18 vm06.local ceph-mon[58974]: pgmap v123: 
65 pgs: 65 active+clean; 26 MiB data, 233 MiB used, 120 GiB / 120 GiB avail; 683 KiB/s rd, 2.1 MiB/s wr, 87 op/s 2026-03-10T06:21:19.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:18 vm04.local ceph-mon[51058]: pgmap v123: 65 pgs: 65 active+clean; 26 MiB data, 233 MiB used, 120 GiB / 120 GiB avail; 683 KiB/s rd, 2.1 MiB/s wr, 87 op/s 2026-03-10T06:21:19.987 INFO:tasks.workunit.client.0.vm04.stderr:+ tar xzf ltp-full.tgz 2026-03-10T06:21:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:19 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:21:20.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:19 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:21:21.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:20 vm06.local ceph-mon[58974]: pgmap v124: 65 pgs: 65 active+clean; 47 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.9 MiB/s wr, 146 op/s 2026-03-10T06:21:21.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:20 vm04.local ceph-mon[51058]: pgmap v124: 65 pgs: 65 active+clean; 47 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.9 MiB/s wr, 146 op/s 2026-03-10T06:21:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:23 vm04.local ceph-mon[51058]: pgmap v125: 65 pgs: 65 active+clean; 47 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 2.8 MiB/s wr, 111 op/s 2026-03-10T06:21:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:23 vm06.local ceph-mon[58974]: pgmap v125: 65 pgs: 65 active+clean; 47 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 2.8 MiB/s wr, 111 op/s 2026-03-10T06:21:25.217 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:24 vm04.local ceph-mon[51058]: pgmap v126: 65 pgs: 65 
active+clean; 47 MiB data, 316 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 2.8 MiB/s wr, 119 op/s 2026-03-10T06:21:25.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:24 vm06.local ceph-mon[58974]: pgmap v126: 65 pgs: 65 active+clean; 47 MiB data, 316 MiB used, 120 GiB / 120 GiB avail; 1.3 MiB/s rd, 2.8 MiB/s wr, 119 op/s 2026-03-10T06:21:28.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:27 vm06.local ceph-mon[58974]: pgmap v127: 65 pgs: 65 active+clean; 69 MiB data, 393 MiB used, 120 GiB / 120 GiB avail; 2.0 MiB/s rd, 4.6 MiB/s wr, 191 op/s 2026-03-10T06:21:28.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:27 vm04.local ceph-mon[51058]: pgmap v127: 65 pgs: 65 active+clean; 69 MiB data, 393 MiB used, 120 GiB / 120 GiB avail; 2.0 MiB/s rd, 4.6 MiB/s wr, 191 op/s 2026-03-10T06:21:29.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:29 vm06.local ceph-mon[58974]: pgmap v128: 65 pgs: 65 active+clean; 69 MiB data, 393 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.7 MiB/s wr, 144 op/s 2026-03-10T06:21:29.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:29 vm04.local ceph-mon[51058]: pgmap v128: 65 pgs: 65 active+clean; 69 MiB data, 393 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.7 MiB/s wr, 144 op/s 2026-03-10T06:21:31.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:31 vm06.local ceph-mon[58974]: pgmap v129: 65 pgs: 65 active+clean; 90 MiB data, 452 MiB used, 120 GiB / 120 GiB avail; 3.0 MiB/s rd, 5.4 MiB/s wr, 211 op/s 2026-03-10T06:21:31.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:31 vm04.local ceph-mon[51058]: pgmap v129: 65 pgs: 65 active+clean; 90 MiB data, 452 MiB used, 120 GiB / 120 GiB avail; 3.0 MiB/s rd, 5.4 MiB/s wr, 211 op/s 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 -- 192.168.123.104:0/1624675473 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98072330 msgr2=0x7f9c980770b0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/1624675473 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98072330 0x7f9c980770b0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f9c9000b600 tx=0x7f9c9000b910 comp rx=0 tx=0).stop 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 -- 192.168.123.104:0/1624675473 shutdown_connections 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/1624675473 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98072330 0x7f9c980770b0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/1624675473 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98071950 0x7f9c98071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:32.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.680+0000 7f9c9e68c700 1 -- 192.168.123.104:0/1624675473 >> 192.168.123.104:0/1624675473 conn(0x7f9c9806d1a0 msgr2=0x7f9c9806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:32.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 -- 192.168.123.104:0/1624675473 shutdown_connections 2026-03-10T06:21:32.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 -- 192.168.123.104:0/1624675473 wait complete. 
2026-03-10T06:21:32.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 Processor -- start 2026-03-10T06:21:32.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 -- start start 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.681+0000 7f9c9e68c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98082dc0 0x7f9c981b2a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9e68c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c98083230 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9e68c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c980833a0 con 0x7f9c98082dc0 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:41214/0 (socket says 192.168.123.104:41214) 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 -- 192.168.123.104:0/4257983344 learned_addr learned my addr 192.168.123.104:0/4257983344 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9ce89700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98082dc0 0x7f9c981b2a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 -- 192.168.123.104:0/4257983344 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98082dc0 msgr2=0x7f9c981b2a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98082dc0 0x7f9c981b2a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 -- 192.168.123.104:0/4257983344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c9000b050 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c9d68a700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f9c9400ba70 tx=0x7f9c9400bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c9400c700 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9c9400cd40 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.682+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c94012340 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.683+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c981b2fd0 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.683+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c981bbb90 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.687+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c9807c920 con 0x7f9c98071950 2026-03-10T06:21:32.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.692+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9c94014440 con 0x7f9c98071950 2026-03-10T06:21:32.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.693+0000 
7f9c8e7fc700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 0x7f9c8406eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:32.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.693+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9c9408b150 con 0x7f9c98071950 2026-03-10T06:21:32.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.693+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9c940593e0 con 0x7f9c98071950 2026-03-10T06:21:32.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.696+0000 7f9c9ce89700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 0x7f9c8406eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:32.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.698+0000 7f9c9ce89700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 0x7f9c8406eb90 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f9c9000bd90 tx=0x7f9c90006040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:32.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.872+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9c98061190 con 0x7f9c8406c6e0 2026-03-10T06:21:32.874 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.874+0000 7f9c8e7fc700 1 -- 192.168.123.104:0/4257983344 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f9c98061190 con 0x7f9c8406c6e0 2026-03-10T06:21:32.881 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.880+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 msgr2=0x7f9c8406eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:32.881 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.880+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 0x7f9c8406eb90 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f9c9000bd90 tx=0x7f9c90006040 comp rx=0 tx=0).stop 2026-03-10T06:21:32.881 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 msgr2=0x7f9c98082880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:32.881 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f9c9400ba70 tx=0x7f9c9400bd80 comp rx=0 tx=0).stop 2026-03-10T06:21:32.882 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 shutdown_connections 2026-03-10T06:21:32.882 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f9c8406c6e0 0x7f9c8406eb90 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T06:21:32.882 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c98071950 0x7f9c98082880 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:32.882 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 --2- 192.168.123.104:0/4257983344 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c98082dc0 0x7f9c981b2a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:32.882 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.881+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 >> 192.168.123.104:0/4257983344 conn(0x7f9c9806d1a0 msgr2=0x7f9c98076590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:32.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.882+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 shutdown_connections 2026-03-10T06:21:32.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:32.882+0000 7f9c9e68c700 1 -- 192.168.123.104:0/4257983344 wait complete. 
2026-03-10T06:21:32.896 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 -- 192.168.123.104:0/4120485971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c072360 msgr2=0x7f351c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 --2- 192.168.123.104:0/4120485971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c072360 0x7f351c0770e0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f351400d5b0 tx=0x7f351400d8c0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 -- 192.168.123.104:0/4120485971 shutdown_connections 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 --2- 192.168.123.104:0/4120485971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c072360 0x7f351c0770e0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 --2- 192.168.123.104:0/4120485971 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 0x7f351c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.007 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 -- 192.168.123.104:0/4120485971 >> 192.168.123.104:0/4120485971 conn(0x7f351c06d1a0 msgr2=0x7f351c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 -- 192.168.123.104:0/4120485971 shutdown_connections 2026-03-10T06:21:33.008 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.006+0000 7f3524590700 1 -- 192.168.123.104:0/4120485971 wait complete. 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 Processor -- start 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 -- start start 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 0x7f351c0824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f351c083e50 con 0x7f351c071980 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3524590700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f351c12dd80 con 0x7f351c0829e0 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3521b2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3521b2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39532/0 (socket says 192.168.123.104:39532) 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f3521b2b700 1 -- 192.168.123.104:0/1795079353 learned_addr learned my addr 192.168.123.104:0/1795079353 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:33.008 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.007+0000 7f352232c700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 0x7f351c0824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.008+0000 7f3521b2b700 1 -- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 msgr2=0x7f351c0824a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.008+0000 7f3521b2b700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 0x7f351c0824a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.008+0000 7f3521b2b700 1 -- 192.168.123.104:0/1795079353 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f351400d260 con 0x7f351c0829e0 2026-03-10T06:21:33.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.008+0000 7f3521b2b700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto 
rx=0x7f3514000f80 tx=0x7f35140046c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:33.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.008+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f351400dea0 con 0x7f351c0829e0 2026-03-10T06:21:33.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.009+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f351c12dfa0 con 0x7f351c0829e0 2026-03-10T06:21:33.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.009+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f351c12e4f0 con 0x7f351c0829e0 2026-03-10T06:21:33.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.009+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3514007aa0 con 0x7f351c0829e0 2026-03-10T06:21:33.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.009+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3514022b40 con 0x7f351c0829e0 2026-03-10T06:21:33.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.011+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f351401f070 con 0x7f351c0829e0 2026-03-10T06:21:33.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.011+0000 7f35137fe700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f350806c6d0 0x7f350806eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.012+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f351408e0f0 con 0x7f351c0829e0 2026-03-10T06:21:33.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.012+0000 7f352232c700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f350806c6d0 0x7f350806eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.014+0000 7f35117fa700 1 -- 192.168.123.104:0/1795079353 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3500005320 con 0x7f351c0829e0 2026-03-10T06:21:33.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.014+0000 7f352232c700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f350806c6d0 0x7f350806eb80 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f351800afd0 tx=0x7f351800c040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:33.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.017+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f351405c300 con 0x7f351c0829e0 2026-03-10T06:21:33.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.184+0000 7f35117fa700 1 -- 192.168.123.104:0/1795079353 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}) v1 -- 0x7f3500000bf0 con 0x7f350806c6d0 2026-03-10T06:21:33.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.185+0000 7f35137fe700 1 -- 192.168.123.104:0/1795079353 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f3500000bf0 con 0x7f350806c6d0 2026-03-10T06:21:33.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.189+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f350806c6d0 msgr2=0x7f350806eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.189+0000 7f3524590700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f350806c6d0 0x7f350806eb80 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f351800afd0 tx=0x7f351800c040 comp rx=0 tx=0).stop 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 msgr2=0x7f351c082e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f3514000f80 tx=0x7f35140046c0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 shutdown_connections 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] 
conn(0x7f350806c6d0 0x7f350806eb80 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f351c071980 0x7f351c0824a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 --2- 192.168.123.104:0/1795079353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f351c0829e0 0x7f351c082e50 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.190+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 >> 192.168.123.104:0/1795079353 conn(0x7f351c06d1a0 msgr2=0x7f351c076470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.192+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 shutdown_connections 2026-03-10T06:21:33.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.193+0000 7f3524590700 1 -- 192.168.123.104:0/1795079353 wait complete. 
2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 -- 192.168.123.104:0/1421278071 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14072360 msgr2=0x7fed140770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 --2- 192.168.123.104:0/1421278071 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14072360 0x7fed140770e0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fed0c00d5b0 tx=0x7fed0c00d8c0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 -- 192.168.123.104:0/1421278071 shutdown_connections 2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 --2- 192.168.123.104:0/1421278071 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14072360 0x7fed140770e0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 --2- 192.168.123.104:0/1421278071 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14071980 0x7fed14071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 -- 192.168.123.104:0/1421278071 >> 192.168.123.104:0/1421278071 conn(0x7fed1406d1a0 msgr2=0x7fed1406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.307 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 -- 192.168.123.104:0/1421278071 shutdown_connections 2026-03-10T06:21:33.307 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.305+0000 7fed1905e700 1 -- 192.168.123.104:0/1421278071 
wait complete. 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 Processor -- start 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 -- start start 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14071980 0x7fed140824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed141b2a90 con 0x7fed14082a10 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.306+0000 7fed1905e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed141b2bd0 con 0x7fed14071980 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.307+0000 7fed1259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.307+0000 7fed1259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:41260/0 (socket says 192.168.123.104:41260) 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.307+0000 7fed1259c700 1 -- 192.168.123.104:0/329626997 learned_addr learned my addr 192.168.123.104:0/329626997 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:33.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.307+0000 7fed12d9d700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14071980 0x7fed140824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.308+0000 7fed1259c700 1 -- 192.168.123.104:0/329626997 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14071980 msgr2=0x7fed140824d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.308+0000 7fed1259c700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14071980 0x7fed140824d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.308+0000 7fed1259c700 1 -- 192.168.123.104:0/329626997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fed040082d0 con 0x7fed14082a10 2026-03-10T06:21:33.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.308+0000 7fed1259c700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fed0c004780 tx=0x7fed0c004860 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:21:33.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.309+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed0c007980 con 0x7fed14082a10 2026-03-10T06:21:33.311 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.309+0000 7fed1905e700 1 -- 192.168.123.104:0/329626997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fed0c00d260 con 0x7fed14082a10 2026-03-10T06:21:33.311 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.309+0000 7fed1905e700 1 -- 192.168.123.104:0/329626997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fed141b2ff0 con 0x7fed14082a10 2026-03-10T06:21:33.311 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.310+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fed0c003f80 con 0x7fed14082a10 2026-03-10T06:21:33.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.312+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed0c0193f0 con 0x7fed14082a10 2026-03-10T06:21:33.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.312+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fed0c019590 con 0x7fed14082a10 2026-03-10T06:21:33.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.313+0000 7fecfbfff700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 0x7fecfc06ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.314 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.314+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fed0c0902f0 con 0x7fed14082a10 2026-03-10T06:21:33.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.314+0000 7fed12d9d700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 0x7fecfc06ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.315 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.314+0000 7fed1905e700 1 -- 192.168.123.104:0/329626997 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fed00005320 con 0x7fed14082a10 2026-03-10T06:21:33.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.317+0000 7fed12d9d700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 0x7fecfc06ed20 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fed0400afd0 tx=0x7fed0400c040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:33.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.318+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fed0c05e580 con 0x7fed14082a10 2026-03-10T06:21:33.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.466+0000 7fed1905e700 1 -- 192.168.123.104:0/329626997 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fed00000bf0 con 0x7fecfc06c870 2026-03-10T06:21:33.477 
INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (3m) 96s ago 4m 22.6M - 0.25.0 c8568f914cd2 3d98d9c97afc 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (4m) 96s ago 4m 8002k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (3m) 97s ago 3m 8288k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (4m) 96s ago 4m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (3m) 97s ago 3m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (3m) 96s ago 3m 81.2M - 9.4.7 954c08fa6188 888c399470c8 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (103s) 96s ago 103s 17.0M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (101s) 96s ago 101s 14.1M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (100s) 97s ago 100s 14.0M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (102s) 97s ago 102s 14.9M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:9283,8765,8443 running (4m) 96s ago 4m 499M - 18.2.0 dc2bc1663786 90f53ab8e17a 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (3m) 97s ago 3m 
445M - 18.2.0 dc2bc1663786 db76c25cd8f7 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (4m) 96s ago 4m 49.2M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (3m) 97s ago 3m 44.7M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (4m) 96s ago 4m 12.3M - 1.5.0 0da6a335fe13 f563a35e96ab 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 97s ago 3m 15.0M - 1.5.0 0da6a335fe13 3304cc389738 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (2m) 96s ago 2m 46.2M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (2m) 96s ago 2m 47.0M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (2m) 96s ago 2m 45.9M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (2m) 97s ago 2m 44.5M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (2m) 97s ago 2m 43.4M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (2m) 97s ago 2m 45.0M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (3m) 96s ago 3m 38.7M - 2.43.0 a07b618ecd1d 5d3ae08adc2a 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.474+0000 7fecfbfff700 1 -- 192.168.123.104:0/329626997 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fed00000bf0 con 0x7fecfc06c870 
2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 msgr2=0x7fecfc06ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 0x7fecfc06ed20 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fed0400afd0 tx=0x7fed0400c040 comp rx=0 tx=0).stop 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 msgr2=0x7fed14082e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fed0c004780 tx=0x7fed0c004860 comp rx=0 tx=0).stop 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 shutdown_connections 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fecfc06c870 0x7fecfc06ed20 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed14071980 0x7fed140824d0 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.478 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 --2- 192.168.123.104:0/329626997 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fed14082a10 0x7fed14082e80 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.478 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 >> 192.168.123.104:0/329626997 conn(0x7fed1406d1a0 msgr2=0x7fed14076450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.478 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 shutdown_connections 2026-03-10T06:21:33.478 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.476+0000 7fecf9ffb700 1 -- 192.168.123.104:0/329626997 wait complete. 2026-03-10T06:21:33.576 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:33 vm04.local ceph-mon[51058]: pgmap v130: 65 pgs: 65 active+clean; 90 MiB data, 452 MiB used, 120 GiB / 120 GiB avail; 2.3 MiB/s rd, 3.6 MiB/s wr, 152 op/s 2026-03-10T06:21:33.576 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.575+0000 7fac8d172700 1 -- 192.168.123.104:0/308395282 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 msgr2=0x7fac881008d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.577 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.575+0000 7fac8d172700 1 --2- 192.168.123.104:0/308395282 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac881008d0 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fac70009b00 tx=0x7fac70009e10 comp rx=0 tx=0).stop 2026-03-10T06:21:33.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.578+0000 7fac8d172700 1 -- 192.168.123.104:0/308395282 shutdown_connections 
2026-03-10T06:21:33.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.578+0000 7fac8d172700 1 --2- 192.168.123.104:0/308395282 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac881008d0 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.578+0000 7fac8d172700 1 --2- 192.168.123.104:0/308395282 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac880ff5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.578+0000 7fac8d172700 1 -- 192.168.123.104:0/308395282 >> 192.168.123.104:0/308395282 conn(0x7fac880fa7f0 msgr2=0x7fac880fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.579 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.579+0000 7fac8d172700 1 -- 192.168.123.104:0/308395282 shutdown_connections 2026-03-10T06:21:33.580 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.579+0000 7fac8d172700 1 -- 192.168.123.104:0/308395282 wait complete. 
2026-03-10T06:21:33.580 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.579+0000 7fac8d172700 1 Processor -- start 2026-03-10T06:21:33.580 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.580+0000 7fac8d172700 1 -- start start 2026-03-10T06:21:33.581 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.580+0000 7fac8d172700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac8819c150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.581 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.580+0000 7fac8d172700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac8819c690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.580+0000 7fac8d172700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac8819ccb0 con 0x7fac88100460 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.580+0000 7fac8d172700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac8819cdf0 con 0x7fac880ff1c0 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.581+0000 7fac86d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac8819c150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.581+0000 7fac86d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac8819c150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:39570/0 (socket says 192.168.123.104:39570) 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.581+0000 7fac86d9d700 1 -- 192.168.123.104:0/3427746706 learned_addr learned my addr 192.168.123.104:0/3427746706 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:33.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.582+0000 7fac7ffff700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac8819c690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.582+0000 7fac7ffff700 1 -- 192.168.123.104:0/3427746706 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 msgr2=0x7fac8819c150 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.582+0000 7fac7ffff700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac8819c150 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.582+0000 7fac7ffff700 1 -- 192.168.123.104:0/3427746706 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac78009710 con 0x7fac88100460 2026-03-10T06:21:33.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.583+0000 7fac7ffff700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac8819c690 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7fac70009ad0 tx=0x7fac7000bb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:21:33.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.583+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac7001d070 con 0x7fac88100460 2026-03-10T06:21:33.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.583+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac700097e0 con 0x7fac88100460 2026-03-10T06:21:33.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.583+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac881a1ba0 con 0x7fac88100460 2026-03-10T06:21:33.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.584+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac8804ea50 con 0x7fac88100460 2026-03-10T06:21:33.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.585+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fac7000f460 con 0x7fac88100460 2026-03-10T06:21:33.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.585+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac7000e5f0 con 0x7fac88100460 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.586+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fac7000f5d0 con 0x7fac88100460 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.587+0000 
7fac84d99700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 0x7fac7406e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.587+0000 7fac86d9d700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 0x7fac7406e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.587+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fac7005b240 con 0x7fac88100460 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.588+0000 7fac86d9d700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 0x7fac7406e930 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fac78009fd0 tx=0x7fac78009450 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:33.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.591+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fac7005b010 con 0x7fac88100460 2026-03-10T06:21:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:33 vm06.local ceph-mon[58974]: pgmap v130: 65 pgs: 65 active+clean; 90 MiB data, 452 MiB used, 120 GiB / 120 GiB avail; 2.3 MiB/s rd, 3.6 MiB/s wr, 152 op/s 2026-03-10T06:21:33.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.778+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fac881a1ea0 con 0x7fac88100460 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.779+0000 7fac84d99700 1 -- 192.168.123.104:0/3427746706 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fac70027240 con 0x7fac88100460 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout: } 
2026-03-10T06:21:33.780 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:21:33.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 msgr2=0x7fac7406e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 0x7fac7406e930 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fac78009fd0 tx=0x7fac78009450 comp rx=0 tx=0).stop 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 msgr2=0x7fac8819c690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac8819c690 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7fac70009ad0 tx=0x7fac7000bb20 comp rx=0 tx=0).stop 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 shutdown_connections 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7fac7406c480 0x7fac7406e930 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 --2- 192.168.123.104:0/3427746706 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fac880ff1c0 0x7fac8819c150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 --2- 192.168.123.104:0/3427746706 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac88100460 0x7fac8819c690 secure :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7fac70009ad0 tx=0x7fac7000bb20 comp rx=0 tx=0).stop 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 >> 192.168.123.104:0/3427746706 conn(0x7fac880fa7f0 msgr2=0x7fac88068490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 shutdown_connections 2026-03-10T06:21:33.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.782+0000 7fac8d172700 1 -- 192.168.123.104:0/3427746706 wait complete. 
2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 -- 192.168.123.104:0/421976145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc072360 msgr2=0x7f54cc0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 --2- 192.168.123.104:0/421976145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc072360 0x7f54cc0770e0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f54c4009230 tx=0x7f54c4009260 comp rx=0 tx=0).stop 2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 -- 192.168.123.104:0/421976145 shutdown_connections 2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 --2- 192.168.123.104:0/421976145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc072360 0x7f54cc0770e0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 --2- 192.168.123.104:0/421976145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc071980 0x7f54cc071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.888+0000 7f54d0f1d700 1 -- 192.168.123.104:0/421976145 >> 192.168.123.104:0/421976145 conn(0x7f54cc06d1a0 msgr2=0x7f54cc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 -- 192.168.123.104:0/421976145 shutdown_connections 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 -- 192.168.123.104:0/421976145 wait 
complete. 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 Processor -- start 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 -- start start 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc071980 0x7f54cc0824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54cc083e50 con 0x7f54cc0829e0 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54d0f1d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54cc12dd80 con 0x7f54cc071980 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54c9d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54c9d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I 
am v2:192.168.123.104:43856/0 (socket says 192.168.123.104:43856) 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.889+0000 7f54c9d9b700 1 -- 192.168.123.104:0/3200034999 learned_addr learned my addr 192.168.123.104:0/3200034999 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:21:33.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.890+0000 7f54ca59c700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc071980 0x7f54cc0824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.890+0000 7f54c9d9b700 1 -- 192.168.123.104:0/3200034999 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc071980 msgr2=0x7f54cc0824a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:33.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.890+0000 7f54c9d9b700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc071980 0x7f54cc0824a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:33.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.890+0000 7f54c9d9b700 1 -- 192.168.123.104:0/3200034999 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54c4008ee0 con 0x7f54cc0829e0 2026-03-10T06:21:33.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.890+0000 7f54c9d9b700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f54c4004740 tx=0x7f54c4004820 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:21:33.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.892+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54c401d070 con 0x7f54cc0829e0 2026-03-10T06:21:33.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.892+0000 7f54d0f1d700 1 -- 192.168.123.104:0/3200034999 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f54cc12dfa0 con 0x7f54cc0829e0 2026-03-10T06:21:33.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.892+0000 7f54d0f1d700 1 -- 192.168.123.104:0/3200034999 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f54cc12e490 con 0x7f54cc0829e0 2026-03-10T06:21:33.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.892+0000 7f54d0f1d700 1 -- 192.168.123.104:0/3200034999 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f54cc04ea50 con 0x7f54cc0829e0 2026-03-10T06:21:33.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.895+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f54c400ece0 con 0x7f54cc0829e0 2026-03-10T06:21:33.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.895+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54c4016b40 con 0x7f54cc0829e0 2026-03-10T06:21:33.899 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.898+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f54c4016ca0 con 0x7f54cc0829e0 2026-03-10T06:21:33.899 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.898+0000 
7f54bb7fe700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 0x7f54b406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:21:33.900 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.899+0000 7f54ca59c700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 0x7f54b406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:21:33.900 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.899+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f54c4012070 con 0x7f54cc0829e0 2026-03-10T06:21:33.901 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.901+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f54c405bb20 con 0x7f54cc0829e0 2026-03-10T06:21:33.904 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:33.904+0000 7f54ca59c700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 0x7f54b406eb80 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f54bc00b3c0 tx=0x7f54bc00d040 comp rx=0 tx=0).ready entity=mgr.14241 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:21:34.034 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.033+0000 7f54d0f1d700 1 -- 192.168.123.104:0/3200034999 --> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f54cc07c760 con 0x7f54b406c6d0 2026-03-10T06:21:34.038 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.038+0000 7f54bb7fe700 1 -- 192.168.123.104:0/3200034999 <== mgr.14241 v2:192.168.123.104:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f54cc07c760 con 0x7f54b406c6d0 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [], 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "0/2 daemons upgraded", 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm06", 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:21:34.039 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 msgr2=0x7f54b406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 0x7f54b406eb80 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f54bc00b3c0 tx=0x7f54bc00d040 comp rx=0 tx=0).stop 2026-03-10T06:21:34.043 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 msgr2=0x7f54cc082e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f54c4004740 tx=0x7f54c4004820 comp rx=0 tx=0).stop 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 shutdown_connections 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:6800/2,v1:192.168.123.104:6801/2] conn(0x7f54b406c6d0 0x7f54b406eb80 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f54cc071980 0x7f54cc0824a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 --2- 192.168.123.104:0/3200034999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54cc0829e0 0x7f54cc082e50 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 >> 192.168.123.104:0/3200034999 conn(0x7f54cc06d1a0 msgr2=0x7f54cc076470 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 shutdown_connections
2026-03-10T06:21:34.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:21:34.042+0000 7f54b97fa700 1 -- 192.168.123.104:0/3200034999 wait complete.
2026-03-10T06:21:35.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:34 vm04.local ceph-mon[51058]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:34 vm04.local ceph-mon[51058]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:34 vm04.local ceph-mon[51058]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:34 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/3427746706' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:21:35.279 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:34 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:21:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:34 vm06.local ceph-mon[58974]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:34 vm06.local ceph-mon[58974]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:34 vm06.local ceph-mon[58974]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:34 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/3427746706' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:21:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:34 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:21:36.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:35 vm06.local ceph-mon[58974]: from='client.14612 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:36.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:35 vm06.local ceph-mon[58974]: pgmap v131: 65 pgs: 65 active+clean; 92 MiB data, 509 MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 3.8 MiB/s wr, 160 op/s
2026-03-10T06:21:36.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:35 vm04.local ceph-mon[51058]: from='client.14612 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:21:36.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:35 vm04.local ceph-mon[51058]: pgmap v131: 65 pgs: 65 active+clean; 92 MiB data, 509 MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 3.8 MiB/s wr, 160 op/s
2026-03-10T06:21:41.361 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: pgmap v132: 65 pgs: 65 active+clean; 118 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 3.0 MiB/s rd, 6.0 MiB/s wr, 261 op/s
2026-03-10T06:21:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:40 vm04.local ceph-mon[51058]: pgmap v132: 65 pgs: 65 active+clean; 118 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 3.0 MiB/s rd, 6.0 MiB/s wr, 261 op/s
2026-03-10T06:21:41.986 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: pgmap v133: 65 pgs: 65 active+clean; 118 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 4.1 MiB/s wr, 183 op/s
2026-03-10T06:21:41.987 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: pgmap v134: 65 pgs: 65 active+clean; 128 MiB data, 747 MiB used, 119 GiB / 120 GiB avail; 3.3 MiB/s rd, 5.0 MiB/s wr, 234 op/s
2026-03-10T06:21:41.987 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:41.987 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:21:41.987 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:21:41.987 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:41 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: pgmap v133: 65 pgs: 65 active+clean; 118 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 4.1 MiB/s wr, 183 op/s
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: pgmap v134: 65 pgs: 65 active+clean; 128 MiB data, 747 MiB used, 119 GiB / 120 GiB avail; 3.3 MiB/s rd, 5.0 MiB/s wr, 234 op/s
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:21:42.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:41 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:21:43.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:42 vm06.local ceph-mon[58974]: Upgrade: Updating mgr.vm06.wwotdr
2026-03-10T06:21:43.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:42 vm06.local ceph-mon[58974]: Deploying daemon mgr.vm06.wwotdr on vm06
2026-03-10T06:21:43.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:42 vm06.local ceph-mon[58974]: pgmap v135: 65 pgs: 65 active+clean; 128 MiB data, 747 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.2 MiB/s wr, 167 op/s
2026-03-10T06:21:43.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:42 vm04.local ceph-mon[51058]: Upgrade: Updating mgr.vm06.wwotdr
2026-03-10T06:21:43.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:42 vm04.local ceph-mon[51058]: Deploying daemon mgr.vm06.wwotdr on vm06
2026-03-10T06:21:43.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:42 vm04.local ceph-mon[51058]: pgmap v135: 65 pgs: 65 active+clean; 128 MiB data, 747 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.2 MiB/s wr, 167 op/s
2026-03-10T06:21:44.866 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:44 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:44.866 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:44 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:44.866 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:44 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:44.866 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:44 vm06.local ceph-mon[58974]: pgmap v136: 65 pgs: 65 active+clean; 128 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.3 MiB/s wr, 172 op/s
2026-03-10T06:21:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:44 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:44 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:44 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:44 vm04.local ceph-mon[51058]: pgmap v136: 65 pgs: 65 active+clean; 128 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.3 MiB/s wr, 172 op/s
2026-03-10T06:21:46.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:46 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:46.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:46 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:46.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:46 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:46.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:46 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:47.512 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:47 vm06.local ceph-mon[58974]: pgmap v137: 65 pgs: 65 active+clean; 142 MiB data, 863 MiB used, 119 GiB / 120 GiB avail; 2.6 MiB/s rd, 4.2 MiB/s wr, 245 op/s
2026-03-10T06:21:47.512 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:47 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:47.513 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:47 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:47 vm04.local ceph-mon[51058]: pgmap v137: 65 pgs: 65 active+clean; 142 MiB data, 863 MiB used, 119 GiB / 120 GiB avail; 2.6 MiB/s rd, 4.2 MiB/s wr, 245 op/s
2026-03-10T06:21:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:47 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:47 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.785 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: pgmap v138: 65 pgs: 65 active+clean; 142 MiB data, 863 MiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 2.0 MiB/s wr, 137 op/s
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr fail", "who": "vm04.exdvdb"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: osdmap e39: 6 total, 6 up, 6 in
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "mgr fail", "who": "vm04.exdvdb"}]': finished
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: mgrmap e20: vm06.wwotdr(active, starting, since 0.0112237s)
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: Active manager daemon vm06.wwotdr restarted
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: Activating manager daemon vm06.wwotdr
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: osdmap e40: 6 total, 6 up, 6 in
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: mgrmap e21: vm06.wwotdr(active, starting, since 0.00925926s)
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T06:21:49.786 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:49 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T06:21:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: pgmap v138: 65 pgs: 65 active+clean; 142 MiB data, 863 MiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 2.0 MiB/s wr, 137 op/s
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb'
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr fail", "who": "vm04.exdvdb"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: osdmap e39: 6 total, 6 up, 6 in
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.14241 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "mgr fail", "who": "vm04.exdvdb"}]': finished
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: mgrmap e20: vm06.wwotdr(active, starting, since 0.0112237s)
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: Active manager daemon vm06.wwotdr restarted
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: Activating manager daemon vm06.wwotdr
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: osdmap e40: 6 total, 6 up, 6 in
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: mgrmap e21: vm06.wwotdr(active, starting, since 0.00925926s)
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T06:21:49.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:49 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T06:21:50.805 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: Manager daemon vm06.wwotdr is now available
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/trash_purge_schedule"}]: dispatch
2026-03-10T06:21:50.806 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:50 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/trash_purge_schedule"}]: dispatch
2026-03-10T06:21:50.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: Manager daemon vm06.wwotdr is now available
2026-03-10T06:21:50.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/trash_purge_schedule"}]: dispatch
2026-03-10T06:21:50.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:50 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.wwotdr/trash_purge_schedule"}]: dispatch
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Migrating agent root cert to cert store
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Migrating agent root key to cert store
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Checking for cert/key for grafana.vm04
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Migrating grafana.vm04 cert to cert store
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Migrating grafana.vm04 key to cert store
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Deploying cephadm binary to vm06
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: Deploying cephadm binary to vm04
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: mgrmap e22: vm06.wwotdr(active, since 1.54937s)
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: pgmap v3: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:52.025 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: [10/Mar/2026:06:21:50] ENGINE Bus STARTING
2026-03-10T06:21:52.026 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: [10/Mar/2026:06:21:51] ENGINE Serving on http://192.168.123.106:8765
2026-03-10T06:21:52.026 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: pgmap v4: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:52.026 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: [10/Mar/2026:06:21:51] ENGINE Serving on https://192.168.123.106:7150
2026-03-10T06:21:52.026 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: [10/Mar/2026:06:21:51] ENGINE Bus STARTED
2026-03-10T06:21:52.026 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:51 vm06.local ceph-mon[58974]: [10/Mar/2026:06:21:51] ENGINE Client ('192.168.123.106', 52586) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Migrating agent root cert to cert store
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Migrating agent root key to cert store
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Checking for cert/key for grafana.vm04
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Migrating grafana.vm04 cert to cert store
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Migrating grafana.vm04 key to cert store
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Deploying cephadm binary to vm06
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: Deploying cephadm binary to vm04
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: mgrmap e22: vm06.wwotdr(active, since 1.54937s)
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: pgmap v3: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: [10/Mar/2026:06:21:50] ENGINE Bus STARTING
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: [10/Mar/2026:06:21:51] ENGINE Serving on http://192.168.123.106:8765
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: pgmap v4: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: [10/Mar/2026:06:21:51] ENGINE Serving on https://192.168.123.106:7150
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: [10/Mar/2026:06:21:51] ENGINE Bus STARTED
2026-03-10T06:21:52.157 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:51 vm04.local ceph-mon[51058]: [10/Mar/2026:06:21:51] ENGINE Client ('192.168.123.106', 52586) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T06:21:53.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:53 vm06.local ceph-mon[58974]: mgrmap e23: vm06.wwotdr(active, since 2s)
2026-03-10T06:21:53.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:53 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:53.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:53 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:53.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:53 vm04.local ceph-mon[51058]: mgrmap e23: vm06.wwotdr(active, since 2s)
2026-03-10T06:21:53.436 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:53 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:53.436 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:53 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: pgmap v5: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: mgrmap e24: vm06.wwotdr(active, since 5s)
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:54 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: pgmap v5: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail
2026-03-10T06:21:54.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr'
2026-03-10T06:21:54.927
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: mgrmap e24: vm06.wwotdr(active, since 5s) 2026-03-10T06:21:54.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:54.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:54 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 
cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: pgmap v6: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:21:55.714 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:21:55.714 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:55 vm04.local ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' 
entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: pgmap v6: 65 pgs: 65 active+clean; 161 MiB data, 941 MiB used, 119 GiB / 120 GiB avail 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:21:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:55 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:21:56.972 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: Reconfiguring prometheus.vm04 (dependencies changed)... 
2026-03-10T06:21:56.972 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:56 vm04.local ceph-mon[51058]: Reconfiguring daemon prometheus.vm04 on vm04 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: Reconfiguring prometheus.vm04 (dependencies changed)... 
2026-03-10T06:21:57.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:56 vm06.local ceph-mon[58974]: Reconfiguring daemon prometheus.vm04 on vm04 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: pgmap v7: 65 pgs: 65 active+clean; 174 MiB data, 999 MiB used, 119 GiB / 120 GiB avail; 994 KiB/s rd, 2.1 MiB/s wr, 182 op/s 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: Standby manager daemon vm04.exdvdb started 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/crt"}]: dispatch 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/key"}]: dispatch 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.? 
192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:57.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:57 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: pgmap v7: 65 pgs: 65 active+clean; 174 MiB data, 999 MiB used, 119 GiB / 120 GiB avail; 994 KiB/s rd, 2.1 MiB/s wr, 182 op/s 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: Standby manager daemon vm04.exdvdb started 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/crt"}]: dispatch 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/key"}]: dispatch 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.? 
192.168.123.104:0/2' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:57 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:21:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: mgrmap e25: vm06.wwotdr(active, since 8s), standbys: vm04.exdvdb 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: Upgrade: Need to upgrade myself (mgr.vm06.wwotdr) 2026-03-10T06:21:58.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:21:58 vm04.local ceph-mon[51058]: Upgrade: Need to upgrade myself (mgr.vm06.wwotdr) 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: from='mgr.24377 
192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: mgrmap e25: vm06.wwotdr(active, since 8s), standbys: vm04.exdvdb 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: Upgrade: Need to upgrade myself (mgr.vm06.wwotdr) 2026-03-10T06:21:59.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:21:58 vm06.local ceph-mon[58974]: Upgrade: Need to upgrade myself (mgr.vm06.wwotdr) 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: Upgrade: Updating mgr.vm04.exdvdb 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:00.342 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: Deploying daemon mgr.vm04.exdvdb on vm04 2026-03-10T06:22:00.342 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:00 vm04.local ceph-mon[51058]: pgmap v8: 65 pgs: 65 active+clean; 174 MiB data, 999 MiB used, 119 GiB / 120 GiB avail; 759 KiB/s rd, 1.6 MiB/s wr, 139 op/s 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: Upgrade: Updating mgr.vm04.exdvdb 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", 
"osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: Deploying daemon mgr.vm04.exdvdb on vm04 2026-03-10T06:22:00.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:00 vm06.local ceph-mon[58974]: pgmap v8: 65 pgs: 65 active+clean; 174 MiB data, 999 MiB used, 119 GiB / 120 GiB avail; 759 KiB/s rd, 1.6 MiB/s wr, 139 op/s 2026-03-10T06:22:01.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:01 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:01.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:01 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:01.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:01 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:01.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:01 vm04.local ceph-mon[51058]: pgmap v9: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1006 KiB/s rd, 2.9 MiB/s wr, 233 op/s 2026-03-10T06:22:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:01 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:01 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 
2026-03-10T06:22:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:01 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:01 vm06.local ceph-mon[58974]: pgmap v9: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1006 KiB/s rd, 2.9 MiB/s wr, 233 op/s 2026-03-10T06:22:02.426 INFO:tasks.workunit.client.1.vm06.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T06:22:02.432 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T06:22:02.432 INFO:tasks.workunit.client.1.vm06.stderr:+ make 2026-03-10T06:22:03.041 INFO:tasks.workunit.client.1.vm06.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T06:22:03.314 INFO:tasks.workunit.client.1.vm06.stderr:++ readlink -f fsstress 2026-03-10T06:22:03.316 INFO:tasks.workunit.client.1.vm06.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T06:22:03.316 INFO:tasks.workunit.client.1.vm06.stderr:+ popd 2026-03-10T06:22:03.317 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T06:22:03.317 INFO:tasks.workunit.client.1.vm06.stderr:+ popd 2026-03-10T06:22:03.317 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp 2026-03-10T06:22:03.317 INFO:tasks.workunit.client.1.vm06.stderr:++ mktemp -d -p . 
2026-03-10T06:22:03.322 INFO:tasks.workunit.client.1.vm06.stderr:+ T=./tmp.NoDCIoppvH 2026-03-10T06:22:03.322 INFO:tasks.workunit.client.1.vm06.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.NoDCIoppvH -l 1 -n 1000 -p 10 -v 2026-03-10T06:22:03.329 INFO:tasks.workunit.client.1.vm06.stdout:seed = 1772797550 2026-03-10T06:22:03.333 INFO:tasks.workunit.client.1.vm06.stdout:0/0: dwrite - no filename 2026-03-10T06:22:03.333 INFO:tasks.workunit.client.1.vm06.stdout:0/1: truncate - no filename 2026-03-10T06:22:03.336 INFO:tasks.workunit.client.1.vm06.stdout:1/0: link - no file 2026-03-10T06:22:03.336 INFO:tasks.workunit.client.1.vm06.stdout:1/1: chown . 1064337899 1 2026-03-10T06:22:03.336 INFO:tasks.workunit.client.1.vm06.stdout:1/2: write - no filename 2026-03-10T06:22:03.343 INFO:tasks.workunit.client.1.vm06.stdout:0/2: mkdir d0 0 2026-03-10T06:22:03.343 INFO:tasks.workunit.client.1.vm06.stdout:0/3: link - no file 2026-03-10T06:22:03.345 INFO:tasks.workunit.client.1.vm06.stdout:3/0: dread - no filename 2026-03-10T06:22:03.345 INFO:tasks.workunit.client.1.vm06.stdout:3/1: dwrite - no filename 2026-03-10T06:22:03.346 INFO:tasks.workunit.client.1.vm06.stdout:2/0: symlink l0 0 2026-03-10T06:22:03.346 INFO:tasks.workunit.client.1.vm06.stdout:2/1: dread - no filename 2026-03-10T06:22:03.346 INFO:tasks.workunit.client.1.vm06.stdout:2/2: readlink l0 0 2026-03-10T06:22:03.347 INFO:tasks.workunit.client.1.vm06.stdout:0/4: creat d0/f1 x:0 0 0 2026-03-10T06:22:03.348 INFO:tasks.workunit.client.1.vm06.stdout:4/0: chown . 
8544 1 2026-03-10T06:22:03.349 INFO:tasks.workunit.client.1.vm06.stdout:0/5: truncate d0/f1 786723 0 2026-03-10T06:22:03.349 INFO:tasks.workunit.client.1.vm06.stdout:0/6: rename d0 to d0/d2 22 2026-03-10T06:22:03.350 INFO:tasks.workunit.client.1.vm06.stdout:5/0: dread - no filename 2026-03-10T06:22:03.352 INFO:tasks.workunit.client.1.vm06.stdout:3/2: creat f0 x:0 0 0 2026-03-10T06:22:03.353 INFO:tasks.workunit.client.1.vm06.stdout:3/3: write f0 [803324,124797] 0 2026-03-10T06:22:03.354 INFO:tasks.workunit.client.1.vm06.stdout:3/4: rmdir - no directory 2026-03-10T06:22:03.358 INFO:tasks.workunit.client.1.vm06.stdout:6/0: write - no filename 2026-03-10T06:22:03.358 INFO:tasks.workunit.client.1.vm06.stdout:6/1: dwrite - no filename 2026-03-10T06:22:03.358 INFO:tasks.workunit.client.1.vm06.stdout:6/2: link - no file 2026-03-10T06:22:03.358 INFO:tasks.workunit.client.1.vm06.stdout:6/3: write - no filename 2026-03-10T06:22:03.358 INFO:tasks.workunit.client.1.vm06.stdout:4/1: creat f0 x:0 0 0 2026-03-10T06:22:03.364 INFO:tasks.workunit.client.1.vm06.stdout:3/5: creat f1 x:0 0 0 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:7/0: write - no filename 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:7/1: dread - no filename 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:7/2: rename - no filename 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:7/3: rmdir - no directory 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:7/4: fdatasync - no filename 2026-03-10T06:22:03.365 INFO:tasks.workunit.client.1.vm06.stdout:3/6: write f0 [674447,84062] 0 2026-03-10T06:22:03.370 INFO:tasks.workunit.client.1.vm06.stdout:5/1: mkdir d0 0 2026-03-10T06:22:03.370 INFO:tasks.workunit.client.1.vm06.stdout:5/2: write - no filename 2026-03-10T06:22:03.370 INFO:tasks.workunit.client.1.vm06.stdout:4/2: creat f1 x:0 0 0 2026-03-10T06:22:03.373 INFO:tasks.workunit.client.1.vm06.stdout:8/0: write - no filename 
2026-03-10T06:22:03.373 INFO:tasks.workunit.client.1.vm06.stdout:8/1: rename - no filename 2026-03-10T06:22:03.377 INFO:tasks.workunit.client.1.vm06.stdout:3/7: dwrite f0 [0,4194304] 0 2026-03-10T06:22:03.378 INFO:tasks.workunit.client.1.vm06.stdout:3/8: readlink - no filename 2026-03-10T06:22:03.389 INFO:tasks.workunit.client.1.vm06.stdout:6/4: getdents . 0 2026-03-10T06:22:03.389 INFO:tasks.workunit.client.1.vm06.stdout:4/3: creat f2 x:0 0 0 2026-03-10T06:22:03.390 INFO:tasks.workunit.client.1.vm06.stdout:4/4: readlink - no filename 2026-03-10T06:22:03.390 INFO:tasks.workunit.client.1.vm06.stdout:7/5: symlink l0 0 2026-03-10T06:22:03.390 INFO:tasks.workunit.client.1.vm06.stdout:7/6: rmdir - no directory 2026-03-10T06:22:03.390 INFO:tasks.workunit.client.1.vm06.stdout:4/5: truncate f2 25717 0 2026-03-10T06:22:03.390 INFO:tasks.workunit.client.1.vm06.stdout:3/9: symlink l2 0 2026-03-10T06:22:03.391 INFO:tasks.workunit.client.1.vm06.stdout:7/7: symlink l1 0 2026-03-10T06:22:03.393 INFO:tasks.workunit.client.1.vm06.stdout:4/6: unlink f1 0 2026-03-10T06:22:03.393 INFO:tasks.workunit.client.1.vm06.stdout:4/7: truncate f0 209819 0 2026-03-10T06:22:03.394 INFO:tasks.workunit.client.1.vm06.stdout:3/10: mknod c3 0 2026-03-10T06:22:03.394 INFO:tasks.workunit.client.1.vm06.stdout:3/11: dread - f1 zero size 2026-03-10T06:22:03.394 INFO:tasks.workunit.client.1.vm06.stdout:5/3: rmdir d0 0 2026-03-10T06:22:03.395 INFO:tasks.workunit.client.1.vm06.stdout:6/5: mknod c0 0 2026-03-10T06:22:03.396 INFO:tasks.workunit.client.1.vm06.stdout:9/0: creat f0 x:0 0 0 2026-03-10T06:22:03.399 INFO:tasks.workunit.client.1.vm06.stdout:3/12: mknod c4 0 2026-03-10T06:22:03.399 INFO:tasks.workunit.client.1.vm06.stdout:5/4: mknod c1 0 2026-03-10T06:22:03.399 INFO:tasks.workunit.client.1.vm06.stdout:5/5: dwrite - no filename 2026-03-10T06:22:03.399 INFO:tasks.workunit.client.1.vm06.stdout:5/6: dwrite - no filename 2026-03-10T06:22:03.400 INFO:tasks.workunit.client.1.vm06.stdout:6/6: unlink c0 0 
2026-03-10T06:22:03.400 INFO:tasks.workunit.client.1.vm06.stdout:6/7: chown . 475060 1 2026-03-10T06:22:03.400 INFO:tasks.workunit.client.1.vm06.stdout:6/8: write - no filename 2026-03-10T06:22:03.401 INFO:tasks.workunit.client.1.vm06.stdout:9/1: mknod c1 0 2026-03-10T06:22:03.406 INFO:tasks.workunit.client.1.vm06.stdout:4/8: link f0 f3 0 2026-03-10T06:22:03.406 INFO:tasks.workunit.client.1.vm06.stdout:3/13: mknod c5 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:5/7: creat f2 x:0 0 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:9/2: mknod c2 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:4/9: mknod c4 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:3/14: dwrite f1 [0,4194304] 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:5/8: write f2 [186463,86210] 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:3/15: mkdir d6 0 2026-03-10T06:22:03.424 INFO:tasks.workunit.client.1.vm06.stdout:9/3: write f0 [369238,125729] 0 2026-03-10T06:22:03.426 INFO:tasks.workunit.client.1.vm06.stdout:4/10: write f3 [484695,77016] 0 2026-03-10T06:22:03.430 INFO:tasks.workunit.client.1.vm06.stdout:4/11: dwrite f3 [0,4194304] 0 2026-03-10T06:22:03.453 INFO:tasks.workunit.client.1.vm06.stdout:3/16: mknod d6/c7 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:4/12: dwrite f0 [0,4194304] 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:4/13: stat f0 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:3/17: mkdir d6/d8 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:3/18: symlink d6/l9 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:3/19: write f1 [3711123,65596] 0 2026-03-10T06:22:03.454 INFO:tasks.workunit.client.1.vm06.stdout:3/20: rename f1 to d6/d8/fa 0 2026-03-10T06:22:03.515 INFO:tasks.workunit.client.1.vm06.stdout:2/3: getdents . 
0 2026-03-10T06:22:03.515 INFO:tasks.workunit.client.1.vm06.stdout:2/4: dwrite - no filename 2026-03-10T06:22:03.515 INFO:tasks.workunit.client.1.vm06.stdout:0/7: truncate d0/f1 543898 0 2026-03-10T06:22:03.521 INFO:tasks.workunit.client.1.vm06.stdout:0/8: mknod d0/c3 0 2026-03-10T06:22:03.589 INFO:tasks.workunit.client.1.vm06.stdout:5/9: fdatasync f2 0 2026-03-10T06:22:03.589 INFO:tasks.workunit.client.1.vm06.stdout:5/10: chown f2 363730752 1 2026-03-10T06:22:03.590 INFO:tasks.workunit.client.1.vm06.stdout:5/11: symlink l3 0 2026-03-10T06:22:03.591 INFO:tasks.workunit.client.1.vm06.stdout:5/12: chown c1 31402012 1 2026-03-10T06:22:03.593 INFO:tasks.workunit.client.1.vm06.stdout:5/13: link c1 c4 0 2026-03-10T06:22:03.595 INFO:tasks.workunit.client.1.vm06.stdout:5/14: creat f5 x:0 0 0 2026-03-10T06:22:03.595 INFO:tasks.workunit.client.1.vm06.stdout:5/15: chown f5 201491554 1 2026-03-10T06:22:03.597 INFO:tasks.workunit.client.1.vm06.stdout:3/21: fdatasync d6/d8/fa 0 2026-03-10T06:22:03.597 INFO:tasks.workunit.client.1.vm06.stdout:5/16: creat f6 x:0 0 0 2026-03-10T06:22:03.603 INFO:tasks.workunit.client.1.vm06.stdout:5/17: dwrite f5 [0,4194304] 0 2026-03-10T06:22:03.603 INFO:tasks.workunit.client.1.vm06.stdout:5/18: truncate f6 1045307 0 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:7/8: getdents . 
0 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:7/9: dwrite - no filename 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:7/10: dwrite - no filename 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:7/11: dread - no filename 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:5/19: chown f5 3017339 1 2026-03-10T06:22:03.614 INFO:tasks.workunit.client.1.vm06.stdout:5/20: dread f6 [0,4194304] 0 2026-03-10T06:22:03.620 INFO:tasks.workunit.client.1.vm06.stdout:5/21: dread f5 [0,4194304] 0 2026-03-10T06:22:03.626 INFO:tasks.workunit.client.1.vm06.stdout:5/22: read f2 [131067,84770] 0 2026-03-10T06:22:03.626 INFO:tasks.workunit.client.1.vm06.stdout:5/23: read f2 [23910,55136] 0 2026-03-10T06:22:03.627 INFO:tasks.workunit.client.1.vm06.stdout:6/9: getdents . 0 2026-03-10T06:22:03.628 INFO:tasks.workunit.client.1.vm06.stdout:6/10: dread - no filename 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:1/3: sync 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:1/4: write - no filename 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:1/5: rename - no filename 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:8/2: sync 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:1/6: fsync - no filename 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:2/5: sync 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:2/6: write - no filename 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:9/4: sync 2026-03-10T06:22:03.674 INFO:tasks.workunit.client.1.vm06.stdout:3/22: sync 2026-03-10T06:22:03.675 INFO:tasks.workunit.client.1.vm06.stdout:4/14: sync 2026-03-10T06:22:03.675 INFO:tasks.workunit.client.1.vm06.stdout:9/5: write f0 [539379,50630] 0 2026-03-10T06:22:03.675 INFO:tasks.workunit.client.1.vm06.stdout:9/6: rmdir - no directory 2026-03-10T06:22:03.676 INFO:tasks.workunit.client.1.vm06.stdout:4/15: fsync f0 
0 2026-03-10T06:22:03.680 INFO:tasks.workunit.client.1.vm06.stdout:9/7: sync 2026-03-10T06:22:03.694 INFO:tasks.workunit.client.1.vm06.stdout:0/9: truncate d0/f1 547803 0 2026-03-10T06:22:03.694 INFO:tasks.workunit.client.1.vm06.stdout:0/10: stat d0/c3 0 2026-03-10T06:22:03.980 INFO:tasks.workunit.client.1.vm06.stdout:7/12: creat f2 x:0 0 0 2026-03-10T06:22:03.983 INFO:tasks.workunit.client.1.vm06.stdout:5/24: creat f7 x:0 0 0 2026-03-10T06:22:03.984 INFO:tasks.workunit.client.1.vm06.stdout:6/11: creat f1 x:0 0 0 2026-03-10T06:22:04.001 INFO:tasks.workunit.client.1.vm06.stdout:3/23: creat d6/d8/fb x:0 0 0 2026-03-10T06:22:04.001 INFO:tasks.workunit.client.1.vm06.stdout:3/24: chown d6/d8 7 1 2026-03-10T06:22:04.009 INFO:tasks.workunit.client.1.vm06.stdout:4/16: mknod c5 0 2026-03-10T06:22:04.009 INFO:tasks.workunit.client.1.vm06.stdout:4/17: rmdir - no directory 2026-03-10T06:22:04.009 INFO:tasks.workunit.client.1.vm06.stdout:4/18: readlink - no filename 2026-03-10T06:22:04.010 INFO:tasks.workunit.client.1.vm06.stdout:9/8: unlink c2 0 2026-03-10T06:22:04.012 INFO:tasks.workunit.client.1.vm06.stdout:9/9: dread f0 [0,4194304] 0 2026-03-10T06:22:04.013 INFO:tasks.workunit.client.1.vm06.stdout:9/10: dread f0 [0,4194304] 0 2026-03-10T06:22:04.025 INFO:tasks.workunit.client.1.vm06.stdout:7/13: unlink f2 0 2026-03-10T06:22:04.025 INFO:tasks.workunit.client.1.vm06.stdout:7/14: truncate - no filename 2026-03-10T06:22:04.045 INFO:tasks.workunit.client.1.vm06.stdout:5/25: chown c1 62 1 2026-03-10T06:22:04.050 INFO:tasks.workunit.client.1.vm06.stdout:1/7: creat f0 x:0 0 0 2026-03-10T06:22:04.053 INFO:tasks.workunit.client.1.vm06.stdout:6/12: mknod c2 0 2026-03-10T06:22:04.058 INFO:tasks.workunit.client.1.vm06.stdout:6/13: dwrite f1 [0,4194304] 0 2026-03-10T06:22:04.067 INFO:tasks.workunit.client.1.vm06.stdout:6/14: dwrite f1 [0,4194304] 0 2026-03-10T06:22:04.073 INFO:tasks.workunit.client.1.vm06.stdout:8/3: creat f0 x:0 0 0 2026-03-10T06:22:04.076 
INFO:tasks.workunit.client.1.vm06.stdout:2/7: getdents . 0 2026-03-10T06:22:04.077 INFO:tasks.workunit.client.1.vm06.stdout:2/8: dread - no filename 2026-03-10T06:22:04.077 INFO:tasks.workunit.client.1.vm06.stdout:2/9: chown l0 16904 1 2026-03-10T06:22:04.081 INFO:tasks.workunit.client.1.vm06.stdout:4/19: creat f6 x:0 0 0 2026-03-10T06:22:04.087 INFO:tasks.workunit.client.1.vm06.stdout:2/10: sync 2026-03-10T06:22:04.087 INFO:tasks.workunit.client.1.vm06.stdout:2/11: read - no filename 2026-03-10T06:22:04.087 INFO:tasks.workunit.client.1.vm06.stdout:2/12: write - no filename 2026-03-10T06:22:04.087 INFO:tasks.workunit.client.1.vm06.stdout:2/13: fsync - no filename 2026-03-10T06:22:04.099 INFO:tasks.workunit.client.1.vm06.stdout:9/11: symlink l3 0 2026-03-10T06:22:04.118 INFO:tasks.workunit.client.1.vm06.stdout:7/15: rename l1 to l3 0 2026-03-10T06:22:04.118 INFO:tasks.workunit.client.1.vm06.stdout:7/16: dwrite - no filename 2026-03-10T06:22:04.118 INFO:tasks.workunit.client.1.vm06.stdout:7/17: chown l0 20364980 1 2026-03-10T06:22:04.118 INFO:tasks.workunit.client.1.vm06.stdout:7/18: chown l0 63933 1 2026-03-10T06:22:04.121 INFO:tasks.workunit.client.1.vm06.stdout:5/26: mkdir d8 0 2026-03-10T06:22:04.126 INFO:tasks.workunit.client.1.vm06.stdout:1/8: rename f0 to f1 0 2026-03-10T06:22:04.151 INFO:tasks.workunit.client.1.vm06.stdout:8/4: mkdir d1 0 2026-03-10T06:22:04.151 INFO:tasks.workunit.client.1.vm06.stdout:8/5: write f0 [777777,108747] 0 2026-03-10T06:22:04.153 INFO:tasks.workunit.client.1.vm06.stdout:8/6: sync 2026-03-10T06:22:04.182 INFO:tasks.workunit.client.1.vm06.stdout:7/19: creat f4 x:0 0 0 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- 192.168.123.104:0/1462351819 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc072360 msgr2=0x7ff4dc0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.189 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 --2- 192.168.123.104:0/1462351819 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc072360 0x7ff4dc0770e0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7ff4d8009230 tx=0x7ff4d8009260 comp rx=0 tx=0).stop 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- 192.168.123.104:0/1462351819 shutdown_connections 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 --2- 192.168.123.104:0/1462351819 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc072360 0x7ff4dc0770e0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 --2- 192.168.123.104:0/1462351819 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc071980 0x7ff4dc071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- 192.168.123.104:0/1462351819 >> 192.168.123.104:0/1462351819 conn(0x7ff4dc06d1a0 msgr2=0x7ff4dc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- 192.168.123.104:0/1462351819 shutdown_connections 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- 192.168.123.104:0/1462351819 wait complete. 
2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 Processor -- start 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- start start 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc0829e0 0x7ff4dc082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4dc083e50 con 0x7ff4dc0829e0 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.186+0000 7ff4e49b6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4dc12dd80 con 0x7ff4dc071980 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:55552/0 (socket says 192.168.123.104:55552) 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 -- 192.168.123.104:0/3869625284 learned_addr learned my addr 192.168.123.104:0/3869625284 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e1f51700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc0829e0 0x7ff4dc082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 -- 192.168.123.104:0/3869625284 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc0829e0 msgr2=0x7ff4dc082e50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.189 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc0829e0 0x7ff4dc082e50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 -- 192.168.123.104:0/3869625284 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff4d8008ee0 con 0x7ff4dc071980 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e2752700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7ff4d000bfd0 tx=0x7ff4d0009d70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4d0010040 con 0x7ff4dc071980 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e49b6700 1 -- 192.168.123.104:0/3869625284 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff4dc12e000 con 0x7ff4dc071980 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.187+0000 7ff4e49b6700 1 -- 192.168.123.104:0/3869625284 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff4dc12e550 con 0x7ff4dc071980 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.188+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff4d000ec20 con 0x7ff4dc071980 2026-03-10T06:22:04.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.188+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4d0014e40 con 0x7ff4dc071980 2026-03-10T06:22:04.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.193+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7ff4d0014590 con 0x7ff4dc071980 2026-03-10T06:22:04.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.194+0000 7ff4cf7fe700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 0x7ff4c8074420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.195 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.194+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7ff4d0092480 con 0x7ff4dc071980 2026-03-10T06:22:04.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.194+0000 7ff4e49b6700 1 -- 192.168.123.104:0/3869625284 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff4c0005320 con 0x7ff4dc071980 2026-03-10T06:22:04.196 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.195+0000 7ff4e1f51700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 0x7ff4c8074420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.197 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.196+0000 7ff4e1f51700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 0x7ff4c8074420 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff4d8009200 tx=0x7ff4d800c920 comp rx=0 tx=0).ready entity=mgr.24377 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:04.202 INFO:tasks.workunit.client.1.vm06.stdout:1/9: dwrite f1 [0,4194304] 0 2026-03-10T06:22:04.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.203+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7ff4d005b1e0 con 0x7ff4dc071980 2026-03-10T06:22:04.242 INFO:tasks.workunit.client.1.vm06.stdout:7/20: symlink l5 0 2026-03-10T06:22:04.246 INFO:tasks.workunit.client.1.vm06.stdout:7/21: dwrite f4 [0,4194304] 0 2026-03-10T06:22:04.248 
INFO:tasks.workunit.client.1.vm06.stdout:7/22: dread f4 [0,4194304] 0 2026-03-10T06:22:04.252 INFO:tasks.workunit.client.1.vm06.stdout:7/23: dwrite f4 [0,4194304] 0 2026-03-10T06:22:04.253 INFO:tasks.workunit.client.1.vm06.stdout:5/27: mkdir d8/d9 0 2026-03-10T06:22:04.284 INFO:tasks.workunit.client.1.vm06.stdout:8/7: creat d1/f2 x:0 0 0 2026-03-10T06:22:04.285 INFO:tasks.workunit.client.1.vm06.stdout:3/25: getdents d6 0 2026-03-10T06:22:04.286 INFO:tasks.workunit.client.1.vm06.stdout:3/26: write f0 [3607699,129838] 0 2026-03-10T06:22:04.305 INFO:tasks.workunit.client.1.vm06.stdout:4/20: link c5 c7 0 2026-03-10T06:22:04.309 INFO:tasks.workunit.client.1.vm06.stdout:4/21: dwrite f2 [0,4194304] 0 2026-03-10T06:22:04.324 INFO:tasks.workunit.client.1.vm06.stdout:2/14: link l0 l1 0 2026-03-10T06:22:04.343 INFO:tasks.workunit.client.1.vm06.stdout:7/24: symlink l6 0 2026-03-10T06:22:04.356 INFO:tasks.workunit.client.1.vm06.stdout:7/25: sync 2026-03-10T06:22:04.372 INFO:tasks.workunit.client.1.vm06.stdout:3/27: mkdir d6/dc 0 2026-03-10T06:22:04.377 INFO:tasks.workunit.client.1.vm06.stdout:3/28: dwrite d6/d8/fb [0,4194304] 0 2026-03-10T06:22:04.409 INFO:tasks.workunit.client.1.vm06.stdout:6/15: truncate f1 4146590 0 2026-03-10T06:22:04.409 INFO:tasks.workunit.client.1.vm06.stdout:9/12: getdents . 
0 2026-03-10T06:22:04.435 INFO:tasks.workunit.client.1.vm06.stdout:7/26: fsync f4 0 2026-03-10T06:22:04.439 INFO:tasks.workunit.client.1.vm06.stdout:7/27: dwrite f4 [0,4194304] 0 2026-03-10T06:22:04.446 INFO:tasks.workunit.client.1.vm06.stdout:0/11: dwrite d0/f1 [0,4194304] 0 2026-03-10T06:22:04.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.467+0000 7ff4e49b6700 1 -- 192.168.123.104:0/3869625284 --> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff4c0000bf0 con 0x7ff4c8071f70 2026-03-10T06:22:04.468 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:04 vm04.local ceph-mon[51058]: pgmap v10: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 878 KiB/s rd, 2.5 MiB/s wr, 203 op/s 2026-03-10T06:22:04.468 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:04 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.468 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:04 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.468 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:04 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.468 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:04 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.470 INFO:tasks.workunit.client.1.vm06.stdout:8/8: fsync d1/f2 0 2026-03-10T06:22:04.471 INFO:tasks.workunit.client.1.vm06.stdout:3/29: creat d6/d8/fd x:0 0 0 2026-03-10T06:22:04.471 INFO:tasks.workunit.client.1.vm06.stdout:8/9: write f0 [1650110,14390] 0 2026-03-10T06:22:04.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.469+0000 7ff4cf7fe700 1 -- 192.168.123.104:0/3869625284 <== mgr.24377 v2:192.168.123.106:6828/1426890327 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 
0x7ff4c0000bf0 con 0x7ff4c8071f70 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 msgr2=0x7ff4c8074420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 0x7ff4c8074420 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff4d8009200 tx=0x7ff4d800c920 comp rx=0 tx=0).stop 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 msgr2=0x7ff4dc0824a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7ff4d000bfd0 tx=0x7ff4d0009d70 comp rx=0 tx=0).stop 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 shutdown_connections 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7ff4c8071f70 0x7ff4c8074420 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 --2- 192.168.123.104:0/3869625284 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff4dc071980 0x7ff4dc0824a0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 --2- 192.168.123.104:0/3869625284 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff4dc0829e0 0x7ff4dc082e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 >> 192.168.123.104:0/3869625284 conn(0x7ff4dc06d1a0 msgr2=0x7ff4dc076470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 shutdown_connections 2026-03-10T06:22:04.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.473+0000 7ff4cd7fa700 1 -- 192.168.123.104:0/3869625284 wait complete. 
2026-03-10T06:22:04.480 INFO:tasks.workunit.client.1.vm06.stdout:2/15: rename l0 to l2 0 2026-03-10T06:22:04.480 INFO:tasks.workunit.client.1.vm06.stdout:2/16: write - no filename 2026-03-10T06:22:04.492 INFO:tasks.workunit.client.1.vm06.stdout:5/28: write f5 [1775310,48266] 0 2026-03-10T06:22:04.493 INFO:tasks.workunit.client.1.vm06.stdout:5/29: write f7 [918482,21874] 0 2026-03-10T06:22:04.497 INFO:tasks.workunit.client.1.vm06.stdout:5/30: dwrite f2 [0,4194304] 0 2026-03-10T06:22:04.499 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:22:04.536 INFO:tasks.workunit.client.1.vm06.stdout:9/13: creat f4 x:0 0 0 2026-03-10T06:22:04.536 INFO:tasks.workunit.client.1.vm06.stdout:1/10: truncate f1 489357 0 2026-03-10T06:22:04.537 INFO:tasks.workunit.client.1.vm06.stdout:7/28: symlink l7 0 2026-03-10T06:22:04.538 INFO:tasks.workunit.client.1.vm06.stdout:9/14: dread f0 [0,4194304] 0 2026-03-10T06:22:04.538 INFO:tasks.workunit.client.1.vm06.stdout:7/29: write f4 [5192663,41344] 0 2026-03-10T06:22:04.544 INFO:tasks.workunit.client.1.vm06.stdout:0/12: rename d0/f1 to d0/f4 0 2026-03-10T06:22:04.544 INFO:tasks.workunit.client.1.vm06.stdout:9/15: dwrite f4 [0,4194304] 0 2026-03-10T06:22:04.546 INFO:tasks.workunit.client.1.vm06.stdout:9/16: chown l3 59 1 2026-03-10T06:22:04.546 INFO:tasks.workunit.client.1.vm06.stdout:9/17: chown c1 154 1 2026-03-10T06:22:04.547 INFO:tasks.workunit.client.1.vm06.stdout:9/18: dread f0 [0,4194304] 0 2026-03-10T06:22:04.555 INFO:tasks.workunit.client.1.vm06.stdout:4/22: dwrite f2 [4194304,4194304] 0 2026-03-10T06:22:04.559 INFO:tasks.workunit.client.1.vm06.stdout:8/10: symlink d1/l3 0 2026-03-10T06:22:04.561 INFO:tasks.workunit.client.1.vm06.stdout:3/30: creat d6/d8/fe x:0 0 0 2026-03-10T06:22:04.568 INFO:tasks.workunit.client.1.vm06.stdout:4/23: dwrite f3 [0,4194304] 0 2026-03-10T06:22:04.578 INFO:tasks.workunit.client.1.vm06.stdout:4/24: dwrite f3 [4194304,4194304] 0 2026-03-10T06:22:04.579 
INFO:tasks.workunit.client.1.vm06.stdout:4/25: stat c4 0 2026-03-10T06:22:04.579 INFO:tasks.workunit.client.1.vm06.stdout:4/26: write f3 [999229,123739] 0 2026-03-10T06:22:04.596 INFO:tasks.workunit.client.1.vm06.stdout:6/16: chown f1 1262742 1 2026-03-10T06:22:04.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:04 vm06.local ceph-mon[58974]: pgmap v10: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 878 KiB/s rd, 2.5 MiB/s wr, 203 op/s 2026-03-10T06:22:04.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:04 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:04 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:04 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:04 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:04.622 INFO:tasks.workunit.client.1.vm06.stdout:1/11: symlink l2 0 2026-03-10T06:22:04.668 INFO:tasks.workunit.client.1.vm06.stdout:0/13: creat d0/f5 x:0 0 0 2026-03-10T06:22:04.677 INFO:tasks.workunit.client.1.vm06.stdout:9/19: mknod c5 0 2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 -- 192.168.123.104:0/3260020347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 msgr2=0x7f0c3010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 --2- 192.168.123.104:0/3260020347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c3010beb0 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f0c24009b00 tx=0x7f0c24009e10 comp rx=0 tx=0).stop 
2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 -- 192.168.123.104:0/3260020347 shutdown_connections 2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 --2- 192.168.123.104:0/3260020347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c3010beb0 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 --2- 192.168.123.104:0/3260020347 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.678 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.677+0000 7f0c35988700 1 -- 192.168.123.104:0/3260020347 >> 192.168.123.104:0/3260020347 conn(0x7f0c3006d1a0 msgr2=0x7f0c3006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 -- 192.168.123.104:0/3260020347 shutdown_connections 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 -- 192.168.123.104:0/3260020347 wait complete. 
2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 Processor -- start 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 -- start start 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30116a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c30116f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c301175a0 con 0x7f0c30072470 2026-03-10T06:22:04.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.678+0000 7f0c35988700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c301b2740 con 0x7f0c30071a90 2026-03-10T06:22:04.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30116a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30116a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:55572/0 (socket says 192.168.123.104:55572) 2026-03-10T06:22:04.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2effd700 1 -- 192.168.123.104:0/3693478530 learned_addr learned my addr 192.168.123.104:0/3693478530 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:04.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2e7fc700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c30116f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.680 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2e7fc700 1 -- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 msgr2=0x7f0c30116a40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2e7fc700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30116a40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2e7fc700 1 -- 192.168.123.104:0/3693478530 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c240097e0 con 0x7f0c30072470 2026-03-10T06:22:04.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.680+0000 7f0c2e7fc700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c30116f80 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f0c24000c00 tx=0x7f0c24004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:22:04.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.683+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c2401d070 con 0x7f0c30072470 2026-03-10T06:22:04.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.683+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c301b28e0 con 0x7f0c30072470 2026-03-10T06:22:04.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.683+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c301b2d80 con 0x7f0c30072470 2026-03-10T06:22:04.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.683+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0c24022470 con 0x7f0c30072470 2026-03-10T06:22:04.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.683+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c2400f650 con 0x7f0c30072470 2026-03-10T06:22:04.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.686+0000 7f0c167fc700 1 -- 192.168.123.104:0/3693478530 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c3004ea50 con 0x7f0c30072470 2026-03-10T06:22:04.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.687+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f0c24022ae0 con 0x7f0c30072470 2026-03-10T06:22:04.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.687+0000 
7f0c34986700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 0x7f0c180744d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:04.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.687+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f0c24093cf0 con 0x7f0c30072470 2026-03-10T06:22:04.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.688+0000 7f0c2effd700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 0x7f0c180744d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:04.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.689+0000 7f0c2effd700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 0x7f0c180744d0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0c20005950 tx=0x7f0c2000b500 comp rx=0 tx=0).ready entity=mgr.24377 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:04.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.700+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f0c2405ca60 con 0x7f0c30072470 2026-03-10T06:22:04.705 INFO:tasks.workunit.client.1.vm06.stdout:3/31: symlink d6/lf 0 2026-03-10T06:22:04.720 INFO:tasks.workunit.client.1.vm06.stdout:8/11: rmdir d1 39 2026-03-10T06:22:04.726 INFO:tasks.workunit.client.1.vm06.stdout:4/27: chown c7 3518 1 2026-03-10T06:22:04.727 INFO:tasks.workunit.client.1.vm06.stdout:4/28: write f6 
[382407,5114] 0 2026-03-10T06:22:04.727 INFO:tasks.workunit.client.1.vm06.stdout:4/29: write f0 [8449385,110778] 0 2026-03-10T06:22:04.761 INFO:tasks.workunit.client.1.vm06.stdout:6/17: dread f1 [0,4194304] 0 2026-03-10T06:22:04.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.964+0000 7f0c167fc700 1 -- 192.168.123.104:0/3693478530 --> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0c30061b80 con 0x7f0c18072020 2026-03-10T06:22:04.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.967+0000 7f0c34986700 1 -- 192.168.123.104:0/3693478530 <== mgr.24377 v2:192.168.123.106:6828/1426890327 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 0x7f0c30061b80 con 0x7f0c18072020 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 msgr2=0x7f0c180744d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 0x7f0c180744d0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0c20005950 tx=0x7f0c2000b500 comp rx=0 tx=0).stop 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 msgr2=0x7f0c30116f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 --2- 192.168.123.104:0/3693478530 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c30116f80 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f0c24000c00 tx=0x7f0c24004990 comp rx=0 tx=0).stop 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 shutdown_connections 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f0c18072020 0x7f0c180744d0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0c30071a90 0x7f0c30116a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 --2- 192.168.123.104:0/3693478530 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0c30072470 0x7f0c30116f80 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 >> 192.168.123.104:0/3693478530 conn(0x7f0c3006d1a0 msgr2=0x7f0c3010b480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 shutdown_connections 2026-03-10T06:22:04.979 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:04.976+0000 7f0c35988700 1 -- 192.168.123.104:0/3693478530 wait complete. 
2026-03-10T06:22:05.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.103+0000 7fa68cc5e700 1 -- 192.168.123.104:0/3645261802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688071a60 msgr2=0x7fa688071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.103+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/3645261802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688071a60 0x7fa688071e70 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7fa678009b00 tx=0x7fa678009e10 comp rx=0 tx=0).stop 2026-03-10T06:22:05.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 -- 192.168.123.104:0/3645261802 shutdown_connections 2026-03-10T06:22:05.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/3645261802 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688072440 0x7fa68810be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/3645261802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688071a60 0x7fa688071e70 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 -- 192.168.123.104:0/3645261802 >> 192.168.123.104:0/3645261802 conn(0x7fa68806d1a0 msgr2=0x7fa68806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 -- 192.168.123.104:0/3645261802 shutdown_connections 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 -- 192.168.123.104:0/3645261802 
wait complete. 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.105+0000 7fa68cc5e700 1 Processor -- start 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa68cc5e700 1 -- start start 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa68cc5e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa68cc5e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688072440 0x7fa6881a4ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa68cc5e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6881a54e0 con 0x7fa688072440 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa68cc5e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6881a5620 con 0x7fa688071a60 2026-03-10T06:22:05.108 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.109 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.104:55594/0 (socket says 192.168.123.104:55594) 2026-03-10T06:22:05.109 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 -- 192.168.123.104:0/2402264232 learned_addr learned my addr 192.168.123.104:0/2402264232 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 -- 192.168.123.104:0/2402264232 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688072440 msgr2=0x7fa6881a4ec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688072440 0x7fa6881a4ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 -- 192.168.123.104:0/2402264232 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6780097e0 con 0x7fa688071a60 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa6877fe700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa67800bb40 tx=0x7fa67800bc20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.107+0000 7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa67801d070 con 0x7fa688071a60 2026-03-10T06:22:05.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.108+0000 
7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa678022470 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.108+0000 7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa67800f650 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.108+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa68810f5c0 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.108+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa68810fab0 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.108+0000 7fa66e7fc700 1 -- 192.168.123.104:0/2402264232 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa6680052f0 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.109+0000 7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7fa678022a80 con 0x7fa688071a60 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.109+0000 7fa684ff9700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 0x7fa670074480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.109+0000 7fa684ff9700 1 -- 
192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fa6780942b0 con 0x7fa688071a60 2026-03-10T06:22:05.145 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.112+0000 7fa686ffd700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 0x7fa670074480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.145 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.114+0000 7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fa67805d020 con 0x7fa688071a60 2026-03-10T06:22:05.145 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.114+0000 7fa686ffd700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 0x7fa670074480 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa67c005950 tx=0x7fa67c0058e0 comp rx=0 tx=0).ready entity=mgr.24377 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:05.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.342+0000 7fa66e7fc700 1 -- 192.168.123.104:0/2402264232 --> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa668000bc0 con 0x7fa670071fd0 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (3m) 2s ago 4m 25.1M - 0.25.0 c8568f914cd2 3d98d9c97afc 2026-03-10T06:22:05.355 
INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (4m) 2s ago 4m 8308k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (3m) 12s ago 3m 8577k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (4m) 2s ago 4m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (3m) 12s ago 3m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (3m) 2s ago 4m 88.4M - 9.4.7 954c08fa6188 888c399470c8 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (2m) 2s ago 2m 211M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (2m) 2s ago 2m 15.8M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (2m) 12s ago 2m 15.6M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (2m) 12s ago 2m 176M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (4s) 2s ago 5m 57.0M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (22s) 12s ago 3m 535M - 19.2.3-678-ge911bdeb 654f31e6858e 573c8d485029 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (5m) 2s ago 5m 49.8M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (3m) 12s ago 3m 47.7M 2048M 18.2.0 
dc2bc1663786 826078cd5cc7 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (4m) 2s ago 4m 12.7M - 1.5.0 0da6a335fe13 f563a35e96ab 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 12s ago 3m 15.2M - 1.5.0 0da6a335fe13 3304cc389738 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (3m) 2s ago 3m 106M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (3m) 2s ago 3m 122M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:22:05.355 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (3m) 2s ago 3m 103M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:22:05.356 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (3m) 12s ago 3m 111M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:22:05.356 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (2m) 12s ago 2m 113M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:22:05.356 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (2m) 12s ago 2m 101M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:22:05.356 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (8s) 2s ago 4m 38.6M - 2.43.0 a07b618ecd1d f68b5d792ea4 2026-03-10T06:22:05.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.352+0000 7fa684ff9700 1 -- 192.168.123.104:0/2402264232 <== mgr.24377 v2:192.168.123.106:6828/1426890327 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fa668000bc0 con 0x7fa670071fd0 2026-03-10T06:22:05.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.356+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 msgr2=0x7fa670074480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T06:22:05.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.356+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 0x7fa670074480 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa67c005950 tx=0x7fa67c0058e0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.357+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 msgr2=0x7fa6881a4980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.357+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa67800bb40 tx=0x7fa67800bc20 comp rx=0 tx=0).stop 2026-03-10T06:22:05.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 shutdown_connections 2026-03-10T06:22:05.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fa670071fd0 0x7fa670074480 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/2402264232 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa688071a60 0x7fa6881a4980 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 --2- 192.168.123.104:0/2402264232 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa688072440 0x7fa6881a4ec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 >> 192.168.123.104:0/2402264232 conn(0x7fa68806d1a0 msgr2=0x7fa68810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.358+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 shutdown_connections 2026-03-10T06:22:05.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.359+0000 7fa68cc5e700 1 -- 192.168.123.104:0/2402264232 wait complete. 2026-03-10T06:22:05.462 INFO:tasks.workunit.client.1.vm06.stdout:5/31: symlink d8/d9/la 0 2026-03-10T06:22:05.489 INFO:tasks.workunit.client.1.vm06.stdout:1/12: symlink l3 0 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- 192.168.123.104:0/1215120602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0072360 msgr2=0x7f55a00770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 --2- 192.168.123.104:0/1215120602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0072360 0x7f55a00770e0 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f5590008790 tx=0x7f5590008aa0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- 192.168.123.104:0/1215120602 shutdown_connections 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 --2- 192.168.123.104:0/1215120602 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0072360 0x7f55a00770e0 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 --2- 192.168.123.104:0/1215120602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a0071980 0x7f55a0071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- 192.168.123.104:0/1215120602 >> 192.168.123.104:0/1215120602 conn(0x7f55a006d1a0 msgr2=0x7f55a006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- 192.168.123.104:0/1215120602 shutdown_connections 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- 192.168.123.104:0/1215120602 wait complete. 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 Processor -- start 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- start start 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a00829d0 0x7f55a0082e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55a0083e40 con 
0x7f55a0071980 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.503+0000 7f55a559a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55a01b2a90 con 0x7f55a00829d0 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:53340/0 (socket says 192.168.123.104:53340) 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 -- 192.168.123.104:0/958638431 learned_addr learned my addr 192.168.123.104:0/958638431 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559e7fc700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a00829d0 0x7f55a0082e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 -- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a00829d0 msgr2=0x7f55a0082e40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.506 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a00829d0 0x7f55a0082e40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 -- 192.168.123.104:0/958638431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5590008440 con 0x7f55a0071980 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f559effd700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f559800f4d0 tx=0x7f559800f7e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:05.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.504+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5598010040 con 0x7f55a0071980 2026-03-10T06:22:05.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.505+0000 7f55a559a700 1 -- 192.168.123.104:0/958638431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55a01b2c30 con 0x7f55a0071980 2026-03-10T06:22:05.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.505+0000 7f55a559a700 1 -- 192.168.123.104:0/958638431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55a01b3180 con 0x7f55a0071980 2026-03-10T06:22:05.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.506+0000 7f55a559a700 1 -- 192.168.123.104:0/958638431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55a007c8b0 con 0x7f55a0071980 2026-03-10T06:22:05.507 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:05 vm04.local ceph-mon[51058]: from='client.24409 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:05.507 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:05 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:05.516 INFO:tasks.workunit.client.1.vm06.stdout:0/14: mknod d0/c6 0 2026-03-10T06:22:05.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.515+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5598009bf0 con 0x7f55a0071980 2026-03-10T06:22:05.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.515+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55980158e0 con 0x7f55a0071980 2026-03-10T06:22:05.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.517+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f559800b3e0 con 0x7f55a0071980 2026-03-10T06:22:05.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.517+0000 7f5587fff700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 0x7f5588074420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.517+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(40..40 src has 
1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f55980937e0 con 0x7f55a0071980 2026-03-10T06:22:05.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.518+0000 7f559e7fc700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 0x7f5588074420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.518+0000 7f559e7fc700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 0x7f5588074420 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f559000f7b0 tx=0x7f5590019040 comp rx=0 tx=0).ready entity=mgr.24377 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:05.520 INFO:tasks.workunit.client.1.vm06.stdout:9/20: creat f6 x:0 0 0 2026-03-10T06:22:05.524 INFO:tasks.workunit.client.1.vm06.stdout:3/32: unlink d6/lf 0 2026-03-10T06:22:05.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.521+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f559805c5c0 con 0x7f55a0071980 2026-03-10T06:22:05.528 INFO:tasks.workunit.client.1.vm06.stdout:3/33: dwrite d6/d8/fa [0,4194304] 0 2026-03-10T06:22:05.545 INFO:tasks.workunit.client.1.vm06.stdout:0/15: unlink d0/c3 0 2026-03-10T06:22:05.551 INFO:tasks.workunit.client.1.vm06.stdout:9/21: symlink l7 0 2026-03-10T06:22:05.551 INFO:tasks.workunit.client.1.vm06.stdout:3/34: creat d6/f10 x:0 0 0 2026-03-10T06:22:05.552 INFO:tasks.workunit.client.1.vm06.stdout:3/35: dread d6/d8/fb [0,4194304] 0 2026-03-10T06:22:05.552 INFO:tasks.workunit.client.1.vm06.stdout:3/36: read - d6/d8/fd zero size 2026-03-10T06:22:05.553 
INFO:tasks.workunit.client.1.vm06.stdout:8/12: creat d1/f4 x:0 0 0 2026-03-10T06:22:05.555 INFO:tasks.workunit.client.1.vm06.stdout:4/30: link f2 f8 0 2026-03-10T06:22:05.563 INFO:tasks.workunit.client.1.vm06.stdout:2/17: getdents . 0 2026-03-10T06:22:05.563 INFO:tasks.workunit.client.1.vm06.stdout:2/18: dwrite - no filename 2026-03-10T06:22:05.564 INFO:tasks.workunit.client.1.vm06.stdout:8/13: dwrite d1/f4 [0,4194304] 0 2026-03-10T06:22:05.564 INFO:tasks.workunit.client.1.vm06.stdout:8/14: fdatasync d1/f2 0 2026-03-10T06:22:05.596 INFO:tasks.workunit.client.1.vm06.stdout:3/37: rename d6/c7 to d6/d8/c11 0 2026-03-10T06:22:05.603 INFO:tasks.workunit.client.1.vm06.stdout:4/31: creat f9 x:0 0 0 2026-03-10T06:22:05.603 INFO:tasks.workunit.client.1.vm06.stdout:4/32: write f0 [711486,61326] 0 2026-03-10T06:22:05.612 INFO:tasks.workunit.client.1.vm06.stdout:2/19: stat l2 0 2026-03-10T06:22:05.612 INFO:tasks.workunit.client.1.vm06.stdout:2/20: stat l1 0 2026-03-10T06:22:05.613 INFO:tasks.workunit.client.1.vm06.stdout:8/15: creat d1/f5 x:0 0 0 2026-03-10T06:22:05.613 INFO:tasks.workunit.client.1.vm06.stdout:8/16: chown d1 763920 1 2026-03-10T06:22:05.614 INFO:tasks.workunit.client.1.vm06.stdout:8/17: write f0 [1169652,127589] 0 2026-03-10T06:22:05.614 INFO:tasks.workunit.client.1.vm06.stdout:8/18: write d1/f5 [530606,75006] 0 2026-03-10T06:22:05.616 INFO:tasks.workunit.client.1.vm06.stdout:0/16: link d0/f4 d0/f7 0 2026-03-10T06:22:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:05 vm06.local ceph-mon[58974]: from='client.24409 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:05 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:05.637 INFO:tasks.workunit.client.1.vm06.stdout:3/38: rename d6/d8/fa to 
d6/dc/f12 0
2026-03-10T06:22:05.651 INFO:tasks.workunit.client.1.vm06.stdout:4/33: dwrite f8 [4194304,4194304] 0
2026-03-10T06:22:05.688 INFO:tasks.workunit.client.1.vm06.stdout:2/21: rename l1 to l3 0
2026-03-10T06:22:05.688 INFO:tasks.workunit.client.1.vm06.stdout:2/22: fdatasync - no filename
2026-03-10T06:22:05.688 INFO:tasks.workunit.client.1.vm06.stdout:4/34: dread - f9 zero size
2026-03-10T06:22:05.688 INFO:tasks.workunit.client.1.vm06.stdout:4/35: dread f8 [4194304,4194304] 0
2026-03-10T06:22:05.688 INFO:tasks.workunit.client.1.vm06.stdout:4/36: chown c7 29 1
2026-03-10T06:22:05.689 INFO:tasks.workunit.client.1.vm06.stdout:8/19: symlink d1/l6 0
2026-03-10T06:22:05.689 INFO:tasks.workunit.client.1.vm06.stdout:8/20: chown d1/f4 214 1
2026-03-10T06:22:05.689 INFO:tasks.workunit.client.1.vm06.stdout:8/21: truncate d1/f5 1310769 0
2026-03-10T06:22:05.689 INFO:tasks.workunit.client.1.vm06.stdout:0/17: rename d0/f4 to d0/f8 0
2026-03-10T06:22:05.689 INFO:tasks.workunit.client.1.vm06.stdout:8/22: dwrite d1/f5 [0,4194304] 0
2026-03-10T06:22:05.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.724+0000 7f55a559a700 1 -- 192.168.123.104:0/958638431 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f55a004ea50 con 0x7f55a0071980
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.724+0000 7f5587fff700 1 -- 192.168.123.104:0/958638431 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+770 (secure 0 0 0) 0x7f559805bd10 con 0x7f55a0071980
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:{
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    "mon": {
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    },
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    "mgr": {
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1,
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    },
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    "osd": {
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    },
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    "mds": {
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    },
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    "overall": {
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 13,
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:    }
2026-03-10T06:22:05.740 INFO:teuthology.orchestra.run.vm04.stdout:}
2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.726+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 msgr2=0x7f5588074420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.726+0000
7f5585ffb700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 0x7f5588074420 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f559000f7b0 tx=0x7f5590019040 comp rx=0 tx=0).stop 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 msgr2=0x7f55a0082490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f559800f4d0 tx=0x7f559800f7e0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 shutdown_connections 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7f5588071f70 0x7f5588074420 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f55a0071980 0x7f55a0082490 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 --2- 192.168.123.104:0/958638431 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55a00829d0 0x7f55a0082e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 >> 192.168.123.104:0/958638431 conn(0x7f55a006d1a0 msgr2=0x7f55a0076460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 shutdown_connections 2026-03-10T06:22:05.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.727+0000 7f5585ffb700 1 -- 192.168.123.104:0/958638431 wait complete. 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.794+0000 7fd9e942b700 1 -- 192.168.123.104:0/203338989 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4072360 msgr2=0x7fd9e40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.794+0000 7fd9e942b700 1 --2- 192.168.123.104:0/203338989 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4072360 0x7fd9e40770e0 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7fd9dc00d3f0 tx=0x7fd9dc00d700 comp rx=0 tx=0).stop 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- 192.168.123.104:0/203338989 shutdown_connections 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 --2- 192.168.123.104:0/203338989 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4072360 0x7fd9e40770e0 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 --2- 192.168.123.104:0/203338989 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4071980 0x7fd9e4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- 192.168.123.104:0/203338989 >> 192.168.123.104:0/203338989 conn(0x7fd9e406d1a0 msgr2=0x7fd9e406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- 192.168.123.104:0/203338989 shutdown_connections 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- 192.168.123.104:0/203338989 wait complete. 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 Processor -- start 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- start start 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4071980 0x7fd9e4131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9e4131d90 con 0x7fd9e4071980 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e942b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9e4131ed0 con 0x7fd9e4131890 2026-03-10T06:22:05.796 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e27fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e27fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:55644/0 (socket says 192.168.123.104:55644) 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.795+0000 7fd9e27fc700 1 -- 192.168.123.104:0/1358945547 learned_addr learned my addr 192.168.123.104:0/1358945547 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e27fc700 1 -- 192.168.123.104:0/1358945547 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4071980 msgr2=0x7fd9e4131350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e27fc700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4071980 0x7fd9e4131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e27fc700 1 -- 192.168.123.104:0/1358945547 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd9dc007ed0 con 0x7fd9e4131890 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e27fc700 1 --2- 
192.168.123.104:0/1358945547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fd9dc003c30 tx=0x7fd9dc003d10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:05.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd9dc01c070 con 0x7fd9e4131890 2026-03-10T06:22:05.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e942b700 1 -- 192.168.123.104:0/1358945547 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd9e407fa60 con 0x7fd9e4131890 2026-03-10T06:22:05.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.796+0000 7fd9e942b700 1 -- 192.168.123.104:0/1358945547 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd9e407ff00 con 0x7fd9e4131890 2026-03-10T06:22:05.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.797+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd9dc004370 con 0x7fd9e4131890 2026-03-10T06:22:05.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.797+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd9dc017910 con 0x7fd9e4131890 2026-03-10T06:22:05.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.797+0000 7fd9e942b700 1 -- 192.168.123.104:0/1358945547 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd9d0005320 con 0x7fd9e4131890 2026-03-10T06:22:05.798 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.798+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7fd9dc017a70 con 0x7fd9e4131890 2026-03-10T06:22:05.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.798+0000 7fd9cbfff700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fd9cc071f70 0x7fd9cc074420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:05.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.798+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fd9dc013070 con 0x7fd9e4131890 2026-03-10T06:22:05.800 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.799+0000 7fd9e2ffd700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fd9cc071f70 0x7fd9cc074420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:05.801 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.800+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fd9dc05d7f0 con 0x7fd9e4131890 2026-03-10T06:22:05.801 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.800+0000 7fd9e2ffd700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fd9cc071f70 0x7fd9cc074420 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fd9d40098a0 tx=0x7fd9d4006d90 comp rx=0 tx=0).ready entity=mgr.24377 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:22:05.925 INFO:tasks.workunit.client.1.vm06.stdout:7/30: truncate f4 4826633 0
2026-03-10T06:22:05.939 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.938+0000 7fd9e942b700 1 -- 192.168.123.104:0/1358945547 --> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd9d0000bf0 con 0x7fd9cc071f70
2026-03-10T06:22:05.940 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.940+0000 7fd9cbfff700 1 -- 192.168.123.104:0/1358945547 <== mgr.24377 v2:192.168.123.106:6828/1426890327 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 0x7fd9d0000bf0 con 0x7fd9cc071f70
2026-03-10T06:22:05.940 INFO:teuthology.orchestra.run.vm04.stdout:{
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "in_progress": true,
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "which": "Upgrading daemons of type(s) mgr",
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "services_complete": [
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:        "mgr"
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    ],
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "progress": "2/2 daemons upgraded",
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "message": "Currently upgrading mgr daemons",
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:    "is_paused": false
2026-03-10T06:22:05.941 INFO:teuthology.orchestra.run.vm04.stdout:}
2026-03-10T06:22:05.944 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327]
conn(0x7fd9cc071f70 msgr2=0x7fd9cc074420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.944 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fd9cc071f70 0x7fd9cc074420 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fd9d40098a0 tx=0x7fd9d4006d90 comp rx=0 tx=0).stop 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 msgr2=0x7fd9e407f520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fd9dc003c30 tx=0x7fd9dc003d10 comp rx=0 tx=0).stop 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 shutdown_connections 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:6828/1426890327,v1:192.168.123.106:6829/1426890327] conn(0x7fd9cc071f70 0x7fd9cc074420 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd9e4071980 0x7fd9e4131350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.945 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 --2- 192.168.123.104:0/1358945547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd9e4131890 0x7fd9e407f520 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 >> 192.168.123.104:0/1358945547 conn(0x7fd9e406d1a0 msgr2=0x7fd9e4076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 shutdown_connections 2026-03-10T06:22:05.945 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:05.944+0000 7fd9c9ffb700 1 -- 192.168.123.104:0/1358945547 wait complete. 2026-03-10T06:22:05.952 INFO:tasks.workunit.client.1.vm06.stdout:6/18: dwrite f1 [0,4194304] 0 2026-03-10T06:22:05.953 INFO:tasks.workunit.client.1.vm06.stdout:6/19: write f1 [1307548,8458] 0 2026-03-10T06:22:05.975 INFO:tasks.workunit.client.1.vm06.stdout:5/32: truncate f7 382586 0 2026-03-10T06:22:06.081 INFO:tasks.workunit.client.1.vm06.stdout:4/37: creat fa x:0 0 0 2026-03-10T06:22:06.081 INFO:tasks.workunit.client.1.vm06.stdout:4/38: read f3 [7381850,106030] 0 2026-03-10T06:22:06.081 INFO:tasks.workunit.client.1.vm06.stdout:0/18: creat d0/f9 x:0 0 0 2026-03-10T06:22:06.082 INFO:tasks.workunit.client.1.vm06.stdout:8/23: mkdir d1/d7 0 2026-03-10T06:22:06.083 INFO:tasks.workunit.client.1.vm06.stdout:1/13: unlink f1 0 2026-03-10T06:22:06.083 INFO:tasks.workunit.client.1.vm06.stdout:1/14: chown l2 970469880 1 2026-03-10T06:22:06.083 INFO:tasks.workunit.client.1.vm06.stdout:1/15: dwrite - no filename 2026-03-10T06:22:06.083 INFO:tasks.workunit.client.1.vm06.stdout:1/16: dwrite - no filename 2026-03-10T06:22:06.085 INFO:tasks.workunit.client.1.vm06.stdout:6/20: creat f3 x:0 0 0 2026-03-10T06:22:06.085 
INFO:tasks.workunit.client.1.vm06.stdout:6/21: stat f3 0 2026-03-10T06:22:06.085 INFO:tasks.workunit.client.1.vm06.stdout:6/22: rmdir - no directory 2026-03-10T06:22:06.085 INFO:tasks.workunit.client.1.vm06.stdout:6/23: stat f1 0 2026-03-10T06:22:06.085 INFO:tasks.workunit.client.1.vm06.stdout:6/24: dread - f3 zero size 2026-03-10T06:22:06.085 INFO:tasks.workunit.client.1.vm06.stdout:5/33: mkdir d8/db 0 2026-03-10T06:22:06.086 INFO:tasks.workunit.client.1.vm06.stdout:6/25: truncate f3 665473 0 2026-03-10T06:22:06.086 INFO:tasks.workunit.client.1.vm06.stdout:9/22: getdents . 0 2026-03-10T06:22:06.086 INFO:tasks.workunit.client.1.vm06.stdout:9/23: rmdir - no directory 2026-03-10T06:22:06.087 INFO:tasks.workunit.client.1.vm06.stdout:9/24: chown f0 376653828 1 2026-03-10T06:22:06.087 INFO:tasks.workunit.client.1.vm06.stdout:3/39: unlink d6/f10 0 2026-03-10T06:22:06.087 INFO:tasks.workunit.client.1.vm06.stdout:2/23: rename l2 to l4 0 2026-03-10T06:22:06.087 INFO:tasks.workunit.client.1.vm06.stdout:2/24: dread - no filename 2026-03-10T06:22:06.091 INFO:tasks.workunit.client.1.vm06.stdout:6/26: dwrite f3 [0,4194304] 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:6/27: dread f3 [0,4194304] 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:6/28: truncate f3 4746088 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:4/39: mknod cb 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:0/19: rename d0/f7 to d0/fa 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:1/17: symlink l4 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:9/25: unlink c5 0 2026-03-10T06:22:06.108 INFO:tasks.workunit.client.1.vm06.stdout:9/26: fdatasync f6 0 2026-03-10T06:22:06.112 INFO:tasks.workunit.client.1.vm06.stdout:2/25: creat f5 x:0 0 0 2026-03-10T06:22:06.113 INFO:tasks.workunit.client.1.vm06.stdout:9/27: dread f4 [0,4194304] 0 2026-03-10T06:22:06.113 INFO:tasks.workunit.client.1.vm06.stdout:9/28: 
write f4 [508055,53630] 0 2026-03-10T06:22:06.123 INFO:tasks.workunit.client.1.vm06.stdout:6/29: mknod c4 0 2026-03-10T06:22:06.128 INFO:tasks.workunit.client.1.vm06.stdout:6/30: dwrite f1 [0,4194304] 0 2026-03-10T06:22:06.145 INFO:tasks.workunit.client.1.vm06.stdout:0/20: mkdir d0/db 0 2026-03-10T06:22:06.148 INFO:tasks.workunit.client.1.vm06.stdout:1/18: creat f5 x:0 0 0 2026-03-10T06:22:06.163 INFO:tasks.workunit.client.1.vm06.stdout:2/26: mknod c6 0 2026-03-10T06:22:06.166 INFO:tasks.workunit.client.1.vm06.stdout:9/29: mknod c8 0 2026-03-10T06:22:06.166 INFO:tasks.workunit.client.1.vm06.stdout:9/30: chown l7 17 1 2026-03-10T06:22:06.182 INFO:tasks.workunit.client.1.vm06.stdout:6/31: mknod c5 0 2026-03-10T06:22:06.184 INFO:tasks.workunit.client.1.vm06.stdout:6/32: dread f1 [0,4194304] 0 2026-03-10T06:22:06.185 INFO:tasks.workunit.client.1.vm06.stdout:1/19: symlink l6 0 2026-03-10T06:22:06.188 INFO:tasks.workunit.client.1.vm06.stdout:2/27: creat f7 x:0 0 0 2026-03-10T06:22:06.189 INFO:tasks.workunit.client.1.vm06.stdout:9/31: rename f0 to f9 0 2026-03-10T06:22:06.193 INFO:tasks.workunit.client.1.vm06.stdout:0/21: symlink d0/db/lc 0 2026-03-10T06:22:06.193 INFO:tasks.workunit.client.1.vm06.stdout:6/33: mkdir d6 0 2026-03-10T06:22:06.199 INFO:tasks.workunit.client.1.vm06.stdout:1/20: creat f7 x:0 0 0 2026-03-10T06:22:06.199 INFO:tasks.workunit.client.1.vm06.stdout:1/21: stat f7 0 2026-03-10T06:22:06.200 INFO:tasks.workunit.client.1.vm06.stdout:1/22: write f5 [181921,86392] 0 2026-03-10T06:22:06.200 INFO:tasks.workunit.client.1.vm06.stdout:2/28: creat f8 x:0 0 0 2026-03-10T06:22:06.201 INFO:tasks.workunit.client.1.vm06.stdout:1/23: truncate f5 955247 0 2026-03-10T06:22:06.203 INFO:tasks.workunit.client.1.vm06.stdout:0/22: mkdir d0/dd 0 2026-03-10T06:22:06.204 INFO:tasks.workunit.client.1.vm06.stdout:7/31: dwrite f4 [0,4194304] 0 2026-03-10T06:22:06.206 INFO:tasks.workunit.client.1.vm06.stdout:7/32: write f4 [1615352,82519] 0 2026-03-10T06:22:06.214 
INFO:tasks.workunit.client.1.vm06.stdout:1/24: dwrite f5 [0,4194304] 0 2026-03-10T06:22:06.215 INFO:tasks.workunit.client.1.vm06.stdout:1/25: fdatasync f7 0 2026-03-10T06:22:06.215 INFO:tasks.workunit.client.1.vm06.stdout:9/32: sync 2026-03-10T06:22:06.215 INFO:tasks.workunit.client.1.vm06.stdout:1/26: write f7 [849581,116113] 0 2026-03-10T06:22:06.215 INFO:tasks.workunit.client.1.vm06.stdout:6/34: dwrite f3 [0,4194304] 0 2026-03-10T06:22:06.219 INFO:tasks.workunit.client.1.vm06.stdout:7/33: dread f4 [0,4194304] 0 2026-03-10T06:22:06.220 INFO:tasks.workunit.client.1.vm06.stdout:1/27: dwrite f5 [0,4194304] 0 2026-03-10T06:22:06.220 INFO:tasks.workunit.client.1.vm06.stdout:1/28: write f7 [1705544,21438] 0 2026-03-10T06:22:06.227 INFO:tasks.workunit.client.1.vm06.stdout:0/23: mknod d0/db/ce 0 2026-03-10T06:22:06.227 INFO:tasks.workunit.client.1.vm06.stdout:0/24: stat d0/f9 0 2026-03-10T06:22:06.270 INFO:tasks.workunit.client.1.vm06.stdout:0/25: creat d0/ff x:0 0 0 2026-03-10T06:22:06.273 INFO:tasks.workunit.client.1.vm06.stdout:6/35: mkdir d6/d7 0 2026-03-10T06:22:06.273 INFO:tasks.workunit.client.1.vm06.stdout:0/26: read - d0/ff zero size 2026-03-10T06:22:06.276 INFO:tasks.workunit.client.1.vm06.stdout:7/34: rename l3 to l8 0 2026-03-10T06:22:06.277 INFO:tasks.workunit.client.1.vm06.stdout:1/29: dread f7 [0,4194304] 0 2026-03-10T06:22:06.286 INFO:tasks.workunit.client.1.vm06.stdout:7/35: creat f9 x:0 0 0 2026-03-10T06:22:06.297 INFO:tasks.workunit.client.1.vm06.stdout:1/30: unlink l6 0 2026-03-10T06:22:06.305 INFO:tasks.workunit.client.1.vm06.stdout:6/36: symlink d6/d7/l8 0 2026-03-10T06:22:06.305 INFO:tasks.workunit.client.1.vm06.stdout:6/37: stat f3 0 2026-03-10T06:22:06.308 INFO:tasks.workunit.client.1.vm06.stdout:7/36: fdatasync f4 0 2026-03-10T06:22:06.309 INFO:tasks.workunit.client.1.vm06.stdout:7/37: read f4 [4694050,94574] 0 2026-03-10T06:22:06.311 INFO:tasks.workunit.client.1.vm06.stdout:6/38: symlink d6/l9 0 2026-03-10T06:22:06.314 
INFO:tasks.workunit.client.1.vm06.stdout:1/31: dwrite f7 [0,4194304] 0 2026-03-10T06:22:06.314 INFO:tasks.workunit.client.1.vm06.stdout:1/32: rmdir - no directory 2026-03-10T06:22:06.317 INFO:tasks.workunit.client.1.vm06.stdout:7/38: creat fa x:0 0 0 2026-03-10T06:22:06.326 INFO:tasks.workunit.client.1.vm06.stdout:1/33: dread f7 [0,4194304] 0 2026-03-10T06:22:06.326 INFO:tasks.workunit.client.1.vm06.stdout:1/34: rmdir - no directory 2026-03-10T06:22:06.328 INFO:tasks.workunit.client.1.vm06.stdout:7/39: mknod cb 0 2026-03-10T06:22:06.329 INFO:tasks.workunit.client.1.vm06.stdout:6/39: dwrite f1 [0,4194304] 0 2026-03-10T06:22:06.334 INFO:tasks.workunit.client.1.vm06.stdout:6/40: dread f3 [4194304,4194304] 0 2026-03-10T06:22:06.337 INFO:tasks.workunit.client.1.vm06.stdout:1/35: creat f8 x:0 0 0 2026-03-10T06:22:06.337 INFO:tasks.workunit.client.1.vm06.stdout:1/36: stat l3 0 2026-03-10T06:22:06.339 INFO:tasks.workunit.client.1.vm06.stdout:7/40: mknod cc 0 2026-03-10T06:22:06.343 INFO:tasks.workunit.client.1.vm06.stdout:6/41: mknod d6/ca 0 2026-03-10T06:22:06.345 INFO:tasks.workunit.client.1.vm06.stdout:1/37: mkdir d9 0 2026-03-10T06:22:06.346 INFO:tasks.workunit.client.1.vm06.stdout:1/38: read f5 [322756,50304] 0 2026-03-10T06:22:06.346 INFO:tasks.workunit.client.1.vm06.stdout:1/39: read f7 [3433289,95569] 0 2026-03-10T06:22:06.346 INFO:tasks.workunit.client.1.vm06.stdout:1/40: write f5 [1189081,55841] 0 2026-03-10T06:22:06.357 INFO:tasks.workunit.client.1.vm06.stdout:6/42: dwrite f3 [0,4194304] 0 2026-03-10T06:22:06.362 INFO:tasks.workunit.client.1.vm06.stdout:7/41: sync 2026-03-10T06:22:06.363 INFO:tasks.workunit.client.1.vm06.stdout:7/42: dread f4 [4194304,4194304] 0 2026-03-10T06:22:06.370 INFO:tasks.workunit.client.1.vm06.stdout:1/41: getdents d9 0 2026-03-10T06:22:06.370 INFO:tasks.workunit.client.1.vm06.stdout:1/42: chown f7 126532316 1 2026-03-10T06:22:06.372 INFO:tasks.workunit.client.1.vm06.stdout:6/43: creat d6/d7/fb x:0 0 0 2026-03-10T06:22:06.373 
INFO:tasks.workunit.client.1.vm06.stdout:6/44: truncate d6/d7/fb 587033 0 2026-03-10T06:22:06.375 INFO:tasks.workunit.client.1.vm06.stdout:7/43: creat fd x:0 0 0 2026-03-10T06:22:06.382 INFO:tasks.workunit.client.1.vm06.stdout:7/44: dwrite fd [0,4194304] 0 2026-03-10T06:22:06.383 INFO:tasks.workunit.client.1.vm06.stdout:6/45: creat d6/fc x:0 0 0 2026-03-10T06:22:06.383 INFO:tasks.workunit.client.1.vm06.stdout:1/43: symlink d9/la 0 2026-03-10T06:22:06.392 INFO:tasks.workunit.client.1.vm06.stdout:7/45: rename l7 to le 0 2026-03-10T06:22:06.393 INFO:tasks.workunit.client.1.vm06.stdout:6/46: dwrite d6/fc [0,4194304] 0 2026-03-10T06:22:06.408 INFO:tasks.workunit.client.1.vm06.stdout:7/46: mknod cf 0 2026-03-10T06:22:06.410 INFO:tasks.workunit.client.1.vm06.stdout:6/47: mkdir d6/dd 0 2026-03-10T06:22:06.410 INFO:tasks.workunit.client.1.vm06.stdout:6/48: stat c2 0 2026-03-10T06:22:06.413 INFO:tasks.workunit.client.1.vm06.stdout:7/47: dwrite fa [0,4194304] 0 2026-03-10T06:22:06.420 INFO:tasks.workunit.client.1.vm06.stdout:7/48: creat f10 x:0 0 0 2026-03-10T06:22:06.423 INFO:tasks.workunit.client.1.vm06.stdout:6/49: mknod d6/dd/ce 0 2026-03-10T06:22:06.426 INFO:tasks.workunit.client.1.vm06.stdout:7/49: symlink l11 0 2026-03-10T06:22:06.431 INFO:tasks.workunit.client.1.vm06.stdout:7/50: dwrite fd [0,4194304] 0 2026-03-10T06:22:06.436 INFO:tasks.workunit.client.1.vm06.stdout:7/51: creat f12 x:0 0 0 2026-03-10T06:22:06.436 INFO:tasks.workunit.client.1.vm06.stdout:7/52: dread - f12 zero size 2026-03-10T06:22:06.447 INFO:tasks.workunit.client.1.vm06.stdout:3/40: write d6/d8/fb [351671,36407] 0 2026-03-10T06:22:06.450 INFO:tasks.workunit.client.1.vm06.stdout:3/41: write d6/dc/f12 [2884075,21847] 0 2026-03-10T06:22:06.453 INFO:tasks.workunit.client.1.vm06.stdout:4/40: truncate f3 3125769 0 2026-03-10T06:22:06.453 INFO:tasks.workunit.client.1.vm06.stdout:8/24: truncate d1/f4 3965144 0 2026-03-10T06:22:06.454 INFO:tasks.workunit.client.1.vm06.stdout:3/42: mkdir d6/dc/d13 0 
2026-03-10T06:22:06.457 INFO:tasks.workunit.client.1.vm06.stdout:7/53: sync 2026-03-10T06:22:06.462 INFO:tasks.workunit.client.1.vm06.stdout:7/54: dwrite fa [0,4194304] 0 2026-03-10T06:22:06.467 INFO:tasks.workunit.client.1.vm06.stdout:7/55: dwrite f12 [0,4194304] 0 2026-03-10T06:22:06.488 INFO:tasks.workunit.client.1.vm06.stdout:2/29: fsync f7 0 2026-03-10T06:22:06.488 INFO:tasks.workunit.client.1.vm06.stdout:2/30: write f7 [641961,10689] 0 2026-03-10T06:22:06.489 INFO:tasks.workunit.client.1.vm06.stdout:2/31: dread - f8 zero size 2026-03-10T06:22:06.489 INFO:tasks.workunit.client.1.vm06.stdout:2/32: rmdir - no directory 2026-03-10T06:22:06.491 INFO:tasks.workunit.client.1.vm06.stdout:7/56: unlink l11 0 2026-03-10T06:22:06.492 INFO:tasks.workunit.client.1.vm06.stdout:4/41: getdents . 0 2026-03-10T06:22:06.495 INFO:tasks.workunit.client.1.vm06.stdout:2/33: creat f9 x:0 0 0 2026-03-10T06:22:06.497 INFO:tasks.workunit.client.1.vm06.stdout:9/33: truncate f9 939813 0 2026-03-10T06:22:06.499 INFO:tasks.workunit.client.1.vm06.stdout:7/57: creat f13 x:0 0 0 2026-03-10T06:22:06.501 INFO:tasks.workunit.client.1.vm06.stdout:9/34: dwrite f6 [0,4194304] 0 2026-03-10T06:22:06.501 INFO:tasks.workunit.client.1.vm06.stdout:9/35: rmdir - no directory 2026-03-10T06:22:06.504 INFO:tasks.workunit.client.1.vm06.stdout:2/34: mkdir da 0 2026-03-10T06:22:06.506 INFO:tasks.workunit.client.1.vm06.stdout:0/27: getdents d0/db 0 2026-03-10T06:22:06.506 INFO:tasks.workunit.client.1.vm06.stdout:0/28: dread - d0/f5 zero size 2026-03-10T06:22:06.509 INFO:tasks.workunit.client.1.vm06.stdout:9/36: dwrite f6 [0,4194304] 0 2026-03-10T06:22:06.516 INFO:tasks.workunit.client.1.vm06.stdout:4/42: link f9 fc 0 2026-03-10T06:22:06.518 INFO:tasks.workunit.client.1.vm06.stdout:3/43: fdatasync d6/dc/f12 0 2026-03-10T06:22:06.520 INFO:tasks.workunit.client.1.vm06.stdout:5/34: link f7 d8/db/fc 0 2026-03-10T06:22:06.529 INFO:tasks.workunit.client.1.vm06.stdout:9/37: sync 2026-03-10T06:22:06.532 
INFO:tasks.workunit.client.1.vm06.stdout:3/44: symlink d6/d8/l14 0 2026-03-10T06:22:06.533 INFO:tasks.workunit.client.1.vm06.stdout:3/45: chown d6/dc/f12 173809 1 2026-03-10T06:22:06.533 INFO:tasks.workunit.client.1.vm06.stdout:4/43: dwrite fc [0,4194304] 0 2026-03-10T06:22:06.534 INFO:tasks.workunit.client.1.vm06.stdout:3/46: truncate d6/d8/fe 168881 0 2026-03-10T06:22:06.540 INFO:tasks.workunit.client.1.vm06.stdout:2/35: rename l4 to da/lb 0 2026-03-10T06:22:06.540 INFO:tasks.workunit.client.1.vm06.stdout:3/47: dread d6/dc/f12 [0,4194304] 0 2026-03-10T06:22:06.541 INFO:tasks.workunit.client.1.vm06.stdout:3/48: write d6/d8/fe [420276,100262] 0 2026-03-10T06:22:06.541 INFO:tasks.workunit.client.1.vm06.stdout:3/49: write d6/dc/f12 [5093924,79464] 0 2026-03-10T06:22:06.543 INFO:tasks.workunit.client.1.vm06.stdout:3/50: read d6/d8/fe [507270,12388] 0 2026-03-10T06:22:06.544 INFO:tasks.workunit.client.1.vm06.stdout:0/29: creat d0/dd/f10 x:0 0 0 2026-03-10T06:22:06.544 INFO:tasks.workunit.client.1.vm06.stdout:0/30: truncate d0/f9 917581 0 2026-03-10T06:22:06.545 INFO:tasks.workunit.client.1.vm06.stdout:0/31: dread - d0/f5 zero size 2026-03-10T06:22:06.545 INFO:tasks.workunit.client.1.vm06.stdout:0/32: chown d0/db/ce 95 1 2026-03-10T06:22:06.566 INFO:tasks.workunit.client.1.vm06.stdout:3/51: creat d6/d8/f15 x:0 0 0 2026-03-10T06:22:06.566 INFO:tasks.workunit.client.1.vm06.stdout:3/52: chown f0 187 1 2026-03-10T06:22:06.568 INFO:tasks.workunit.client.1.vm06.stdout:0/33: fdatasync d0/f8 0 2026-03-10T06:22:06.568 INFO:tasks.workunit.client.1.vm06.stdout:0/34: stat d0/db/ce 0 2026-03-10T06:22:06.572 INFO:tasks.workunit.client.1.vm06.stdout:9/38: fdatasync f9 0 2026-03-10T06:22:06.584 INFO:tasks.workunit.client.1.vm06.stdout:0/35: creat d0/dd/f11 x:0 0 0 2026-03-10T06:22:06.589 INFO:tasks.workunit.client.1.vm06.stdout:9/39: symlink la 0 2026-03-10T06:22:06.594 INFO:tasks.workunit.client.1.vm06.stdout:3/53: link d6/d8/f15 d6/d8/f16 0 2026-03-10T06:22:06.603 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='client.14640 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: pgmap v11: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 878 KiB/s rd, 2.5 MiB/s wr, 203 op/s 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/958638431' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd='[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]': finished 2026-03-10T06:22:06.603 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:06 vm06.local ceph-mon[58974]: mgrmap e26: vm04.exdvdb(active, starting, since 0.00546442s) 2026-03-10T06:22:06.610 INFO:tasks.workunit.client.1.vm06.stdout:1/44: truncate f7 1724343 0 2026-03-10T06:22:06.619 INFO:tasks.workunit.client.1.vm06.stdout:6/50: truncate f3 1175926 0 2026-03-10T06:22:06.625 INFO:tasks.workunit.client.1.vm06.stdout:6/51: dwrite f1 [4194304,4194304] 0 2026-03-10T06:22:06.626 INFO:tasks.workunit.client.1.vm06.stdout:6/52: chown d6/l9 29002 1 2026-03-10T06:22:06.626 INFO:tasks.workunit.client.1.vm06.stdout:0/36: creat d0/db/f12 x:0 0 0 2026-03-10T06:22:06.626 INFO:tasks.workunit.client.1.vm06.stdout:9/40: rename f4 to fb 0 2026-03-10T06:22:06.628 INFO:tasks.workunit.client.1.vm06.stdout:0/37: stat d0/db/ce 0 2026-03-10T06:22:06.628 INFO:tasks.workunit.client.1.vm06.stdout:9/41: chown f9 389 1 2026-03-10T06:22:06.632 INFO:tasks.workunit.client.1.vm06.stdout:3/54: rename d6/d8/fd to d6/dc/d13/f17 0 2026-03-10T06:22:06.632 INFO:tasks.workunit.client.1.vm06.stdout:3/55: dread - d6/d8/f15 zero size 
2026-03-10T06:22:06.636 INFO:tasks.workunit.client.1.vm06.stdout:3/56: dwrite d6/d8/fb [0,4194304] 0 2026-03-10T06:22:06.639 INFO:tasks.workunit.client.1.vm06.stdout:1/45: fdatasync f5 0 2026-03-10T06:22:06.650 INFO:tasks.workunit.client.1.vm06.stdout:8/25: rmdir d1 39 2026-03-10T06:22:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='client.14640 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: pgmap v11: 65 pgs: 65 active+clean; 190 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 878 KiB/s rd, 2.5 MiB/s wr, 203 op/s 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='client.? 
192.168.123.104:0/958638431' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 192.168.123.106:0/16315041' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 ' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: from='mgr.24377 ' 
entity='mgr.vm06.wwotdr' cmd='[{"prefix": "mgr fail", "who": "vm06.wwotdr"}]': finished 2026-03-10T06:22:06.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:06 vm04.local ceph-mon[51058]: mgrmap e26: vm04.exdvdb(active, starting, since 0.00546442s) 2026-03-10T06:22:06.688 INFO:tasks.workunit.client.1.vm06.stdout:7/58: getdents . 0 2026-03-10T06:22:06.688 INFO:tasks.workunit.client.1.vm06.stdout:7/59: dread - f13 zero size 2026-03-10T06:22:06.704 INFO:tasks.workunit.client.1.vm06.stdout:2/36: chown da/lb 1 1 2026-03-10T06:22:06.709 INFO:tasks.workunit.client.1.vm06.stdout:5/35: write d8/db/fc [181578,123743] 0 2026-03-10T06:22:06.739 INFO:tasks.workunit.client.1.vm06.stdout:4/44: write f3 [3799820,97696] 0 2026-03-10T06:22:06.823 INFO:tasks.workunit.client.1.vm06.stdout:6/53: mkdir d6/df 0 2026-03-10T06:22:06.825 INFO:tasks.workunit.client.1.vm06.stdout:0/38: mknod d0/dd/c13 0 2026-03-10T06:22:06.826 INFO:tasks.workunit.client.1.vm06.stdout:0/39: write d0/dd/f10 [597219,113723] 0 2026-03-10T06:22:06.836 INFO:tasks.workunit.client.1.vm06.stdout:3/57: creat d6/dc/f18 x:0 0 0 2026-03-10T06:22:06.851 INFO:tasks.workunit.client.1.vm06.stdout:8/26: fsync d1/f5 0 2026-03-10T06:22:06.853 INFO:tasks.workunit.client.1.vm06.stdout:7/60: creat f14 x:0 0 0 2026-03-10T06:22:06.853 INFO:tasks.workunit.client.1.vm06.stdout:2/37: symlink da/lc 0 2026-03-10T06:22:06.855 INFO:tasks.workunit.client.1.vm06.stdout:5/36: symlink d8/ld 0 2026-03-10T06:22:06.856 INFO:tasks.workunit.client.1.vm06.stdout:5/37: fsync f5 0 2026-03-10T06:22:06.857 INFO:tasks.workunit.client.1.vm06.stdout:5/38: readlink d8/d9/la 0 2026-03-10T06:22:06.858 INFO:tasks.workunit.client.1.vm06.stdout:7/61: dread f4 [0,4194304] 0 2026-03-10T06:22:06.858 INFO:tasks.workunit.client.1.vm06.stdout:7/62: dread - f14 zero size 2026-03-10T06:22:06.858 INFO:tasks.workunit.client.1.vm06.stdout:2/38: dwrite f5 [0,4194304] 0 2026-03-10T06:22:06.860 INFO:tasks.workunit.client.1.vm06.stdout:0/40: mkdir d0/dd/d14 0 
2026-03-10T06:22:06.860 INFO:tasks.workunit.client.1.vm06.stdout:3/58: symlink d6/dc/d13/l19 0 2026-03-10T06:22:06.861 INFO:tasks.workunit.client.1.vm06.stdout:3/59: chown d6/dc/d13 15271812 1 2026-03-10T06:22:06.862 INFO:tasks.workunit.client.1.vm06.stdout:0/41: write d0/db/f12 [781119,41400] 0 2026-03-10T06:22:06.862 INFO:tasks.workunit.client.1.vm06.stdout:5/39: dread f5 [0,4194304] 0 2026-03-10T06:22:06.863 INFO:tasks.workunit.client.1.vm06.stdout:0/42: truncate d0/f5 881522 0 2026-03-10T06:22:06.875 INFO:tasks.workunit.client.1.vm06.stdout:8/27: symlink d1/l8 0 2026-03-10T06:22:06.878 INFO:tasks.workunit.client.1.vm06.stdout:7/63: creat f15 x:0 0 0 2026-03-10T06:22:06.878 INFO:tasks.workunit.client.1.vm06.stdout:7/64: chown f15 151379495 1 2026-03-10T06:22:06.881 INFO:tasks.workunit.client.1.vm06.stdout:6/54: symlink d6/df/l10 0 2026-03-10T06:22:06.887 INFO:tasks.workunit.client.1.vm06.stdout:9/42: dread f9 [0,4194304] 0 2026-03-10T06:22:06.890 INFO:tasks.workunit.client.1.vm06.stdout:9/43: dwrite f6 [0,4194304] 0 2026-03-10T06:22:06.891 INFO:tasks.workunit.client.1.vm06.stdout:3/60: mkdir d6/d1a 0 2026-03-10T06:22:06.893 INFO:tasks.workunit.client.1.vm06.stdout:1/46: chown f7 694 1 2026-03-10T06:22:06.894 INFO:tasks.workunit.client.1.vm06.stdout:5/40: creat d8/fe x:0 0 0 2026-03-10T06:22:06.894 INFO:tasks.workunit.client.1.vm06.stdout:0/43: mknod d0/c15 0 2026-03-10T06:22:06.896 INFO:tasks.workunit.client.1.vm06.stdout:7/65: unlink fd 0 2026-03-10T06:22:06.899 INFO:tasks.workunit.client.1.vm06.stdout:4/45: dwrite f2 [0,4194304] 0 2026-03-10T06:22:06.900 INFO:tasks.workunit.client.1.vm06.stdout:3/61: dread f0 [0,4194304] 0 2026-03-10T06:22:06.901 INFO:tasks.workunit.client.1.vm06.stdout:1/47: dread f5 [0,4194304] 0 2026-03-10T06:22:06.908 INFO:tasks.workunit.client.1.vm06.stdout:0/44: mkdir d0/d16 0 2026-03-10T06:22:06.909 INFO:tasks.workunit.client.1.vm06.stdout:7/66: dwrite f10 [0,4194304] 0 2026-03-10T06:22:06.910 
INFO:tasks.workunit.client.1.vm06.stdout:0/45: dread - d0/dd/f11 zero size 2026-03-10T06:22:06.911 INFO:tasks.workunit.client.1.vm06.stdout:8/28: creat d1/d7/f9 x:0 0 0 2026-03-10T06:22:06.912 INFO:tasks.workunit.client.1.vm06.stdout:8/29: dread - d1/d7/f9 zero size 2026-03-10T06:22:06.913 INFO:tasks.workunit.client.1.vm06.stdout:8/30: write d1/f2 [1007849,89860] 0 2026-03-10T06:22:06.913 INFO:tasks.workunit.client.1.vm06.stdout:8/31: readlink d1/l6 0 2026-03-10T06:22:06.913 INFO:tasks.workunit.client.1.vm06.stdout:4/46: dread f3 [0,4194304] 0 2026-03-10T06:22:06.914 INFO:tasks.workunit.client.1.vm06.stdout:8/32: fdatasync d1/d7/f9 0 2026-03-10T06:22:06.914 INFO:tasks.workunit.client.1.vm06.stdout:4/47: chown f3 19 1 2026-03-10T06:22:06.916 INFO:tasks.workunit.client.1.vm06.stdout:3/62: unlink d6/dc/f12 0 2026-03-10T06:22:06.917 INFO:tasks.workunit.client.1.vm06.stdout:1/48: unlink f8 0 2026-03-10T06:22:06.917 INFO:tasks.workunit.client.1.vm06.stdout:1/49: readlink l4 0 2026-03-10T06:22:06.917 INFO:tasks.workunit.client.1.vm06.stdout:3/63: chown d6/d8/l14 10 1 2026-03-10T06:22:06.918 INFO:tasks.workunit.client.1.vm06.stdout:1/50: write f5 [1516873,128446] 0 2026-03-10T06:22:06.918 INFO:tasks.workunit.client.1.vm06.stdout:2/39: link l3 da/ld 0 2026-03-10T06:22:06.930 INFO:tasks.workunit.client.1.vm06.stdout:7/67: creat f16 x:0 0 0 2026-03-10T06:22:06.935 INFO:tasks.workunit.client.1.vm06.stdout:3/64: unlink d6/dc/f18 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:2/40: creat da/fe x:0 0 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:5/41: truncate f5 122832 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:7/68: symlink l17 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:7/69: dread - f9 zero size 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:7/70: dread - f13 zero size 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:0/46: symlink d0/dd/d14/l17 0 
2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:5/42: unlink f6 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:7/71: symlink l18 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:0/47: truncate d0/f9 1899289 0 2026-03-10T06:22:06.946 INFO:tasks.workunit.client.1.vm06.stdout:0/48: chown d0 14 1 2026-03-10T06:22:06.947 INFO:tasks.workunit.client.1.vm06.stdout:0/49: dread d0/f5 [0,4194304] 0 2026-03-10T06:22:06.947 INFO:tasks.workunit.client.1.vm06.stdout:0/50: chown d0/dd/d14/l17 426 1 2026-03-10T06:22:06.947 INFO:tasks.workunit.client.1.vm06.stdout:5/43: creat d8/ff x:0 0 0 2026-03-10T06:22:06.947 INFO:tasks.workunit.client.1.vm06.stdout:4/48: getdents . 0 2026-03-10T06:22:06.947 INFO:tasks.workunit.client.1.vm06.stdout:0/51: mkdir d0/dd/d14/d18 0 2026-03-10T06:22:06.949 INFO:tasks.workunit.client.1.vm06.stdout:5/44: rename d8/d9/la to d8/l10 0 2026-03-10T06:22:06.949 INFO:tasks.workunit.client.1.vm06.stdout:4/49: mkdir dd 0 2026-03-10T06:22:06.950 INFO:tasks.workunit.client.1.vm06.stdout:7/72: dread f4 [0,4194304] 0 2026-03-10T06:22:06.950 INFO:tasks.workunit.client.1.vm06.stdout:7/73: fsync f4 0 2026-03-10T06:22:06.953 INFO:tasks.workunit.client.1.vm06.stdout:3/65: dwrite d6/dc/d13/f17 [0,4194304] 0 2026-03-10T06:22:06.953 INFO:tasks.workunit.client.1.vm06.stdout:5/45: truncate d8/ff 172950 0 2026-03-10T06:22:06.957 INFO:tasks.workunit.client.1.vm06.stdout:0/52: creat d0/f19 x:0 0 0 2026-03-10T06:22:06.958 INFO:tasks.workunit.client.1.vm06.stdout:7/74: mkdir d19 0 2026-03-10T06:22:06.959 INFO:tasks.workunit.client.1.vm06.stdout:5/46: creat d8/d9/f11 x:0 0 0 2026-03-10T06:22:06.959 INFO:tasks.workunit.client.1.vm06.stdout:4/50: getdents dd 0 2026-03-10T06:22:06.959 INFO:tasks.workunit.client.1.vm06.stdout:5/47: write d8/ff [1054721,105343] 0 2026-03-10T06:22:06.960 INFO:tasks.workunit.client.1.vm06.stdout:0/53: rename d0/f8 to d0/d16/f1a 0 2026-03-10T06:22:06.963 INFO:tasks.workunit.client.1.vm06.stdout:7/75: 
dwrite f4 [0,4194304] 0 2026-03-10T06:22:06.964 INFO:tasks.workunit.client.1.vm06.stdout:7/76: fsync f14 0 2026-03-10T06:22:06.965 INFO:tasks.workunit.client.1.vm06.stdout:2/41: sync 2026-03-10T06:22:06.965 INFO:tasks.workunit.client.1.vm06.stdout:3/66: sync 2026-03-10T06:22:06.968 INFO:tasks.workunit.client.1.vm06.stdout:2/42: creat da/ff x:0 0 0 2026-03-10T06:22:06.969 INFO:tasks.workunit.client.1.vm06.stdout:2/43: read f5 [893672,55782] 0 2026-03-10T06:22:06.971 INFO:tasks.workunit.client.1.vm06.stdout:7/77: creat d19/f1a x:0 0 0 2026-03-10T06:22:06.976 INFO:tasks.workunit.client.1.vm06.stdout:4/51: dwrite fc [0,4194304] 0 2026-03-10T06:22:06.976 INFO:tasks.workunit.client.1.vm06.stdout:7/78: chown cc 186726 1 2026-03-10T06:22:06.978 INFO:tasks.workunit.client.1.vm06.stdout:7/79: symlink d19/l1b 0 2026-03-10T06:22:06.979 INFO:tasks.workunit.client.1.vm06.stdout:4/52: creat dd/fe x:0 0 0 2026-03-10T06:22:06.982 INFO:tasks.workunit.client.1.vm06.stdout:4/53: creat dd/ff x:0 0 0 2026-03-10T06:22:06.984 INFO:tasks.workunit.client.1.vm06.stdout:4/54: dread - dd/fe zero size 2026-03-10T06:22:06.988 INFO:tasks.workunit.client.1.vm06.stdout:4/55: dwrite dd/ff [0,4194304] 0 2026-03-10T06:22:06.988 INFO:tasks.workunit.client.1.vm06.stdout:4/56: truncate fa 864976 0 2026-03-10T06:22:06.994 INFO:tasks.workunit.client.1.vm06.stdout:4/57: link c7 dd/c10 0 2026-03-10T06:22:07.000 INFO:tasks.workunit.client.1.vm06.stdout:4/58: dwrite fa [0,4194304] 0 2026-03-10T06:22:07.008 INFO:tasks.workunit.client.1.vm06.stdout:4/59: fsync f8 0 2026-03-10T06:22:07.013 INFO:tasks.workunit.client.1.vm06.stdout:4/60: dwrite f2 [0,4194304] 0 2026-03-10T06:22:07.017 INFO:tasks.workunit.client.1.vm06.stdout:4/61: dread fc [0,4194304] 0 2026-03-10T06:22:07.020 INFO:tasks.workunit.client.1.vm06.stdout:4/62: creat dd/f11 x:0 0 0 2026-03-10T06:22:07.022 INFO:tasks.workunit.client.1.vm06.stdout:4/63: rename f9 to dd/f12 0 2026-03-10T06:22:07.030 INFO:tasks.workunit.client.1.vm06.stdout:4/64: link 
dd/c10 dd/c13 0 2026-03-10T06:22:07.034 INFO:tasks.workunit.client.1.vm06.stdout:4/65: rename f3 to dd/f14 0 2026-03-10T06:22:07.035 INFO:tasks.workunit.client.1.vm06.stdout:4/66: creat dd/f15 x:0 0 0 2026-03-10T06:22:07.035 INFO:tasks.workunit.client.1.vm06.stdout:4/67: write dd/f15 [139298,14748] 0 2026-03-10T06:22:07.036 INFO:tasks.workunit.client.1.vm06.stdout:4/68: truncate dd/f12 4694088 0 2026-03-10T06:22:07.044 INFO:tasks.workunit.client.1.vm06.stdout:4/69: chown c7 77 1 2026-03-10T06:22:07.078 INFO:tasks.workunit.client.1.vm06.stdout:8/33: fsync d1/f2 0 2026-03-10T06:22:07.080 INFO:tasks.workunit.client.1.vm06.stdout:8/34: creat d1/fa x:0 0 0 2026-03-10T06:22:07.084 INFO:tasks.workunit.client.1.vm06.stdout:8/35: dread d1/f4 [0,4194304] 0 2026-03-10T06:22:07.085 INFO:tasks.workunit.client.1.vm06.stdout:9/44: dwrite f9 [0,4194304] 0 2026-03-10T06:22:07.098 INFO:tasks.workunit.client.1.vm06.stdout:6/55: truncate f1 6234985 0 2026-03-10T06:22:07.098 INFO:tasks.workunit.client.1.vm06.stdout:6/56: chown d6/d7/fb 3338558 1 2026-03-10T06:22:07.099 INFO:tasks.workunit.client.1.vm06.stdout:6/57: fdatasync d6/d7/fb 0 2026-03-10T06:22:07.099 INFO:tasks.workunit.client.1.vm06.stdout:8/36: mkdir d1/d7/db 0 2026-03-10T06:22:07.107 INFO:tasks.workunit.client.1.vm06.stdout:1/51: dwrite f7 [0,4194304] 0 2026-03-10T06:22:07.114 INFO:tasks.workunit.client.1.vm06.stdout:1/52: chown d9 53 1 2026-03-10T06:22:07.114 INFO:tasks.workunit.client.1.vm06.stdout:1/53: rename f7 to d9/fb 0 2026-03-10T06:22:07.114 INFO:tasks.workunit.client.1.vm06.stdout:1/54: write f5 [1227464,11126] 0 2026-03-10T06:22:07.119 INFO:tasks.workunit.client.1.vm06.stdout:0/54: unlink d0/d16/f1a 0 2026-03-10T06:22:07.119 INFO:tasks.workunit.client.1.vm06.stdout:0/55: dread - d0/ff zero size 2026-03-10T06:22:07.120 INFO:tasks.workunit.client.1.vm06.stdout:2/44: getdents da 0 2026-03-10T06:22:07.121 INFO:tasks.workunit.client.1.vm06.stdout:0/56: truncate d0/dd/f10 1292972 0 2026-03-10T06:22:07.121 
INFO:tasks.workunit.client.1.vm06.stdout:7/80: fsync d19/f1a 0 2026-03-10T06:22:07.122 INFO:tasks.workunit.client.1.vm06.stdout:7/81: write fa [1673896,83347] 0 2026-03-10T06:22:07.124 INFO:tasks.workunit.client.1.vm06.stdout:7/82: write f10 [2708668,83337] 0 2026-03-10T06:22:07.124 INFO:tasks.workunit.client.1.vm06.stdout:3/67: dwrite d6/d8/fe [0,4194304] 0 2026-03-10T06:22:07.125 INFO:tasks.workunit.client.1.vm06.stdout:8/37: sync 2026-03-10T06:22:07.125 INFO:tasks.workunit.client.1.vm06.stdout:8/38: chown d1 782560836 1 2026-03-10T06:22:07.126 INFO:tasks.workunit.client.1.vm06.stdout:1/55: sync 2026-03-10T06:22:07.126 INFO:tasks.workunit.client.1.vm06.stdout:2/45: sync 2026-03-10T06:22:07.128 INFO:tasks.workunit.client.1.vm06.stdout:2/46: chown da/lc 4090 1 2026-03-10T06:22:07.134 INFO:tasks.workunit.client.1.vm06.stdout:0/57: mkdir d0/dd/d1b 0 2026-03-10T06:22:07.137 INFO:tasks.workunit.client.1.vm06.stdout:7/83: rename l0 to d19/l1c 0 2026-03-10T06:22:07.138 INFO:tasks.workunit.client.1.vm06.stdout:7/84: write f12 [305959,13714] 0 2026-03-10T06:22:07.140 INFO:tasks.workunit.client.1.vm06.stdout:3/68: creat d6/f1b x:0 0 0 2026-03-10T06:22:07.141 INFO:tasks.workunit.client.1.vm06.stdout:8/39: symlink d1/d7/lc 0 2026-03-10T06:22:07.142 INFO:tasks.workunit.client.1.vm06.stdout:8/40: chown d1/fa 27 1 2026-03-10T06:22:07.142 INFO:tasks.workunit.client.1.vm06.stdout:8/41: write d1/d7/f9 [302369,67242] 0 2026-03-10T06:22:07.146 INFO:tasks.workunit.client.1.vm06.stdout:2/47: unlink da/lc 0 2026-03-10T06:22:07.149 INFO:tasks.workunit.client.1.vm06.stdout:0/58: rmdir d0/db 39 2026-03-10T06:22:07.153 INFO:tasks.workunit.client.1.vm06.stdout:0/59: dwrite d0/f19 [0,4194304] 0 2026-03-10T06:22:07.154 INFO:tasks.workunit.client.1.vm06.stdout:0/60: chown d0/dd/d14 1 1 2026-03-10T06:22:07.174 INFO:tasks.workunit.client.1.vm06.stdout:7/85: creat d19/f1d x:0 0 0 2026-03-10T06:22:07.174 INFO:tasks.workunit.client.1.vm06.stdout:7/86: stat f10 0 2026-03-10T06:22:07.177 
INFO:tasks.workunit.client.1.vm06.stdout:3/69: creat d6/f1c x:0 0 0 2026-03-10T06:22:07.177 INFO:tasks.workunit.client.1.vm06.stdout:3/70: write d6/d8/fe [1710456,22898] 0 2026-03-10T06:22:07.177 INFO:tasks.workunit.client.1.vm06.stdout:3/71: stat d6/d1a 0 2026-03-10T06:22:07.181 INFO:tasks.workunit.client.1.vm06.stdout:3/72: dwrite d6/f1c [0,4194304] 0 2026-03-10T06:22:07.183 INFO:tasks.workunit.client.1.vm06.stdout:8/42: creat d1/d7/fd x:0 0 0 2026-03-10T06:22:07.187 INFO:tasks.workunit.client.1.vm06.stdout:2/48: symlink da/l10 0 2026-03-10T06:22:07.193 INFO:tasks.workunit.client.1.vm06.stdout:2/49: dwrite f8 [0,4194304] 0 2026-03-10T06:22:07.198 INFO:tasks.workunit.client.1.vm06.stdout:0/61: mkdir d0/dd/d1c 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:0/62: dread d0/f19 [0,4194304] 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:6/58: dread f1 [4194304,4194304] 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:6/59: readlink d6/df/l10 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:4/70: getdents dd 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:5/48: write f5 [988371,6389] 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:7/87: rename l5 to d19/l1e 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:7/88: dread - d19/f1a zero size 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:9/45: dwrite fb [0,4194304] 0 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:7/89: chown cf 124 1 2026-03-10T06:22:07.207 INFO:tasks.workunit.client.1.vm06.stdout:9/46: rmdir - no directory 2026-03-10T06:22:07.213 INFO:tasks.workunit.client.1.vm06.stdout:9/47: dwrite fb [0,4194304] 0 2026-03-10T06:22:07.217 INFO:tasks.workunit.client.1.vm06.stdout:3/73: creat d6/dc/f1d x:0 0 0 2026-03-10T06:22:07.221 INFO:tasks.workunit.client.1.vm06.stdout:1/56: truncate d9/fb 3034033 0 2026-03-10T06:22:07.241 
INFO:tasks.workunit.client.1.vm06.stdout:6/60: mkdir d6/d11 0 2026-03-10T06:22:07.242 INFO:tasks.workunit.client.1.vm06.stdout:6/61: dread d6/d7/fb [0,4194304] 0 2026-03-10T06:22:07.243 INFO:tasks.workunit.client.1.vm06.stdout:6/62: chown d6/l9 19755 1 2026-03-10T06:22:07.243 INFO:tasks.workunit.client.1.vm06.stdout:6/63: write d6/d7/fb [876944,6959] 0 2026-03-10T06:22:07.243 INFO:tasks.workunit.client.1.vm06.stdout:4/71: symlink dd/l16 0 2026-03-10T06:22:07.244 INFO:tasks.workunit.client.1.vm06.stdout:4/72: write dd/f15 [28503,88287] 0 2026-03-10T06:22:07.245 INFO:tasks.workunit.client.1.vm06.stdout:4/73: write dd/ff [1737408,69273] 0 2026-03-10T06:22:07.245 INFO:tasks.workunit.client.1.vm06.stdout:4/74: read - dd/f11 zero size 2026-03-10T06:22:07.246 INFO:tasks.workunit.client.1.vm06.stdout:5/49: mkdir d8/d12 0 2026-03-10T06:22:07.246 INFO:tasks.workunit.client.1.vm06.stdout:5/50: stat d8/ld 0 2026-03-10T06:22:07.247 INFO:tasks.workunit.client.1.vm06.stdout:6/64: dread f1 [0,4194304] 0 2026-03-10T06:22:07.251 INFO:tasks.workunit.client.1.vm06.stdout:7/90: rename l6 to d19/l1f 0 2026-03-10T06:22:07.252 INFO:tasks.workunit.client.1.vm06.stdout:8/43: creat d1/d7/db/fe x:0 0 0 2026-03-10T06:22:07.252 INFO:tasks.workunit.client.1.vm06.stdout:9/48: creat fc x:0 0 0 2026-03-10T06:22:07.253 INFO:tasks.workunit.client.1.vm06.stdout:1/57: mknod d9/cc 0 2026-03-10T06:22:07.254 INFO:tasks.workunit.client.1.vm06.stdout:6/65: dread d6/fc [0,4194304] 0 2026-03-10T06:22:07.260 INFO:tasks.workunit.client.1.vm06.stdout:0/63: chown d0/db/ce 2 1 2026-03-10T06:22:07.260 INFO:tasks.workunit.client.1.vm06.stdout:9/49: dread f9 [0,4194304] 0 2026-03-10T06:22:07.262 INFO:tasks.workunit.client.1.vm06.stdout:1/58: dwrite f5 [0,4194304] 0 2026-03-10T06:22:07.267 INFO:tasks.workunit.client.1.vm06.stdout:9/50: dread f9 [0,4194304] 0 2026-03-10T06:22:07.278 INFO:tasks.workunit.client.1.vm06.stdout:5/51: symlink d8/d9/l13 0 2026-03-10T06:22:07.278 INFO:tasks.workunit.client.1.vm06.stdout:5/52: 
write d8/fe [968149,68142] 0 2026-03-10T06:22:07.280 INFO:tasks.workunit.client.1.vm06.stdout:4/75: dwrite dd/f14 [0,4194304] 0 2026-03-10T06:22:07.285 INFO:tasks.workunit.client.1.vm06.stdout:3/74: rename d6/d8/f15 to d6/dc/d13/f1e 0 2026-03-10T06:22:07.293 INFO:tasks.workunit.client.1.vm06.stdout:8/44: mkdir d1/df 0 2026-03-10T06:22:07.299 INFO:tasks.workunit.client.1.vm06.stdout:6/66: mknod d6/dd/c12 0 2026-03-10T06:22:07.300 INFO:tasks.workunit.client.1.vm06.stdout:6/67: write d6/d7/fb [407838,31096] 0 2026-03-10T06:22:07.302 INFO:tasks.workunit.client.1.vm06.stdout:2/50: link f5 da/f11 0 2026-03-10T06:22:07.302 INFO:tasks.workunit.client.1.vm06.stdout:2/51: chown da/fe 2928 1 2026-03-10T06:22:07.306 INFO:tasks.workunit.client.1.vm06.stdout:9/51: creat fd x:0 0 0 2026-03-10T06:22:07.306 INFO:tasks.workunit.client.1.vm06.stdout:9/52: dread - fc zero size 2026-03-10T06:22:07.310 INFO:tasks.workunit.client.1.vm06.stdout:4/76: mknod dd/c17 0 2026-03-10T06:22:07.316 INFO:tasks.workunit.client.1.vm06.stdout:5/53: rename f2 to d8/d9/f14 0 2026-03-10T06:22:07.316 INFO:tasks.workunit.client.1.vm06.stdout:1/59: write d9/fb [1634070,97693] 0 2026-03-10T06:22:07.317 INFO:tasks.workunit.client.1.vm06.stdout:5/54: dread - d8/d9/f11 zero size 2026-03-10T06:22:07.317 INFO:tasks.workunit.client.1.vm06.stdout:1/60: fdatasync f5 0 2026-03-10T06:22:07.319 INFO:tasks.workunit.client.1.vm06.stdout:7/91: rmdir d19 39 2026-03-10T06:22:07.322 INFO:tasks.workunit.client.1.vm06.stdout:7/92: dwrite fa [0,4194304] 0 2026-03-10T06:22:07.324 INFO:tasks.workunit.client.1.vm06.stdout:7/93: write fa [3765043,16065] 0 2026-03-10T06:22:07.340 INFO:tasks.workunit.client.1.vm06.stdout:2/52: mknod da/c12 0 2026-03-10T06:22:07.343 INFO:tasks.workunit.client.1.vm06.stdout:4/77: unlink dd/f15 0 2026-03-10T06:22:07.347 INFO:tasks.workunit.client.1.vm06.stdout:3/75: rename d6/d8/fe to d6/d1a/f1f 0 2026-03-10T06:22:07.355 INFO:tasks.workunit.client.1.vm06.stdout:8/45: dwrite d1/f4 [0,4194304] 0 
2026-03-10T06:22:07.356 INFO:tasks.workunit.client.1.vm06.stdout:8/46: readlink d1/l6 0 2026-03-10T06:22:07.356 INFO:tasks.workunit.client.1.vm06.stdout:5/55: rmdir d8/db 39 2026-03-10T06:22:07.360 INFO:tasks.workunit.client.1.vm06.stdout:5/56: dread d8/ff [0,4194304] 0 2026-03-10T06:22:07.360 INFO:tasks.workunit.client.1.vm06.stdout:5/57: write f7 [641028,55746] 0 2026-03-10T06:22:07.364 INFO:tasks.workunit.client.1.vm06.stdout:1/61: unlink l4 0 2026-03-10T06:22:07.364 INFO:tasks.workunit.client.1.vm06.stdout:1/62: stat l2 0 2026-03-10T06:22:07.365 INFO:tasks.workunit.client.1.vm06.stdout:1/63: chown f5 12 1 2026-03-10T06:22:07.366 INFO:tasks.workunit.client.1.vm06.stdout:1/64: dread f5 [0,4194304] 0 2026-03-10T06:22:07.378 INFO:tasks.workunit.client.1.vm06.stdout:8/47: sync 2026-03-10T06:22:07.380 INFO:tasks.workunit.client.1.vm06.stdout:2/53: mkdir da/d13 0 2026-03-10T06:22:07.382 INFO:tasks.workunit.client.1.vm06.stdout:9/53: link fc fe 0 2026-03-10T06:22:07.385 INFO:tasks.workunit.client.1.vm06.stdout:4/78: mkdir dd/d18 0 2026-03-10T06:22:07.386 INFO:tasks.workunit.client.1.vm06.stdout:4/79: dread - dd/fe zero size 2026-03-10T06:22:07.406 INFO:tasks.workunit.client.1.vm06.stdout:7/94: rename f12 to d19/f20 0 2026-03-10T06:22:07.412 INFO:tasks.workunit.client.1.vm06.stdout:6/68: link d6/d7/l8 d6/d11/l13 0 2026-03-10T06:22:07.424 INFO:tasks.workunit.client.1.vm06.stdout:7/95: symlink d19/l21 0 2026-03-10T06:22:07.424 INFO:tasks.workunit.client.1.vm06.stdout:7/96: readlink le 0 2026-03-10T06:22:07.425 INFO:tasks.workunit.client.1.vm06.stdout:7/97: write f4 [3761756,112979] 0 2026-03-10T06:22:07.427 INFO:tasks.workunit.client.1.vm06.stdout:8/48: getdents d1/df 0 2026-03-10T06:22:07.430 INFO:tasks.workunit.client.1.vm06.stdout:2/54: symlink da/d13/l14 0 2026-03-10T06:22:07.430 INFO:tasks.workunit.client.1.vm06.stdout:6/69: symlink d6/dd/l14 0 2026-03-10T06:22:07.431 INFO:tasks.workunit.client.1.vm06.stdout:6/70: read d6/d7/fb [881746,12021] 0 
2026-03-10T06:22:07.435 INFO:tasks.workunit.client.1.vm06.stdout:9/54: link fb ff 0 2026-03-10T06:22:07.435 INFO:tasks.workunit.client.1.vm06.stdout:9/55: rmdir - no directory 2026-03-10T06:22:07.435 INFO:tasks.workunit.client.1.vm06.stdout:4/80: mknod dd/d18/c19 0 2026-03-10T06:22:07.436 INFO:tasks.workunit.client.1.vm06.stdout:5/58: link d8/d9/f11 d8/db/f15 0 2026-03-10T06:22:07.439 INFO:tasks.workunit.client.1.vm06.stdout:1/65: truncate f5 413243 0 2026-03-10T06:22:07.440 INFO:tasks.workunit.client.1.vm06.stdout:0/64: truncate d0/fa 1895683 0 2026-03-10T06:22:07.445 INFO:tasks.workunit.client.1.vm06.stdout:7/98: chown d19/l1c 347 1 2026-03-10T06:22:07.446 INFO:tasks.workunit.client.1.vm06.stdout:8/49: mknod d1/d7/db/c10 0 2026-03-10T06:22:07.448 INFO:tasks.workunit.client.1.vm06.stdout:7/99: sync 2026-03-10T06:22:07.448 INFO:tasks.workunit.client.1.vm06.stdout:7/100: fsync d19/f1a 0 2026-03-10T06:22:07.450 INFO:tasks.workunit.client.1.vm06.stdout:8/50: dwrite d1/f2 [0,4194304] 0 2026-03-10T06:22:07.458 INFO:tasks.workunit.client.1.vm06.stdout:2/55: dread da/f11 [0,4194304] 0 2026-03-10T06:22:07.458 INFO:tasks.workunit.client.1.vm06.stdout:6/71: rmdir d6/dd 39 2026-03-10T06:22:07.468 INFO:tasks.workunit.client.1.vm06.stdout:7/101: dwrite f13 [0,4194304] 0 2026-03-10T06:22:07.470 INFO:tasks.workunit.client.1.vm06.stdout:3/76: getdents d6/dc/d13 0 2026-03-10T06:22:07.470 INFO:tasks.workunit.client.1.vm06.stdout:3/77: write d6/f1c [2373985,47635] 0 2026-03-10T06:22:07.473 INFO:tasks.workunit.client.1.vm06.stdout:9/56: creat f10 x:0 0 0 2026-03-10T06:22:07.473 INFO:tasks.workunit.client.1.vm06.stdout:8/51: dwrite d1/f5 [0,4194304] 0 2026-03-10T06:22:07.488 INFO:tasks.workunit.client.1.vm06.stdout:8/52: chown d1 3011110 1 2026-03-10T06:22:07.488 INFO:tasks.workunit.client.1.vm06.stdout:5/59: creat d8/db/f16 x:0 0 0 2026-03-10T06:22:07.490 INFO:tasks.workunit.client.1.vm06.stdout:0/65: read d0/f5 [852535,112563] 0 2026-03-10T06:22:07.490 
INFO:tasks.workunit.client.1.vm06.stdout:6/72: mknod d6/df/c15 0 2026-03-10T06:22:07.495 INFO:tasks.workunit.client.1.vm06.stdout:9/57: creat f11 x:0 0 0 2026-03-10T06:22:07.496 INFO:tasks.workunit.client.1.vm06.stdout:9/58: truncate fd 812624 0 2026-03-10T06:22:07.499 INFO:tasks.workunit.client.1.vm06.stdout:2/56: rename da/lb to da/l15 0 2026-03-10T06:22:07.500 INFO:tasks.workunit.client.1.vm06.stdout:4/81: dwrite dd/f12 [0,4194304] 0 2026-03-10T06:22:07.504 INFO:tasks.workunit.client.1.vm06.stdout:9/59: creat f12 x:0 0 0 2026-03-10T06:22:07.509 INFO:tasks.workunit.client.1.vm06.stdout:9/60: readlink la 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:4/82: dwrite dd/fe [0,4194304] 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:0/66: chown d0/dd/d1b 481541184 1 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:9/61: truncate f6 4678349 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:5/60: creat d8/d12/f17 x:0 0 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:9/62: write f11 [1027556,100111] 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:2/57: creat da/d13/f16 x:0 0 0 2026-03-10T06:22:07.514 INFO:tasks.workunit.client.1.vm06.stdout:8/53: dwrite d1/fa [0,4194304] 0 2026-03-10T06:22:07.520 INFO:tasks.workunit.client.1.vm06.stdout:0/67: chown d0/d16 6460624 1 2026-03-10T06:22:07.521 INFO:tasks.workunit.client.1.vm06.stdout:0/68: chown d0/dd/d14 1754 1 2026-03-10T06:22:07.525 INFO:tasks.workunit.client.1.vm06.stdout:0/69: dwrite d0/db/f12 [0,4194304] 0 2026-03-10T06:22:07.527 INFO:tasks.workunit.client.1.vm06.stdout:2/58: creat da/f17 x:0 0 0 2026-03-10T06:22:07.529 INFO:tasks.workunit.client.1.vm06.stdout:4/83: sync 2026-03-10T06:22:07.529 INFO:tasks.workunit.client.1.vm06.stdout:3/78: link d6/d8/c11 d6/dc/d13/c20 0 2026-03-10T06:22:07.530 INFO:tasks.workunit.client.1.vm06.stdout:3/79: write d6/f1c [727373,47341] 0 2026-03-10T06:22:07.535 
INFO:tasks.workunit.client.1.vm06.stdout:2/59: dread f5 [0,4194304] 0 2026-03-10T06:22:07.540 INFO:tasks.workunit.client.1.vm06.stdout:5/61: rename d8/db/f16 to d8/db/f18 0 2026-03-10T06:22:07.540 INFO:tasks.workunit.client.1.vm06.stdout:3/80: mkdir d6/d21 0 2026-03-10T06:22:07.540 INFO:tasks.workunit.client.1.vm06.stdout:4/84: dwrite dd/fe [0,4194304] 0 2026-03-10T06:22:07.540 INFO:tasks.workunit.client.1.vm06.stdout:2/60: truncate da/f11 2345338 0 2026-03-10T06:22:07.548 INFO:tasks.workunit.client.1.vm06.stdout:8/54: mkdir d1/df/d11 0 2026-03-10T06:22:07.549 INFO:tasks.workunit.client.1.vm06.stdout:9/63: link c8 c13 0 2026-03-10T06:22:07.552 INFO:tasks.workunit.client.1.vm06.stdout:5/62: unlink d8/fe 0 2026-03-10T06:22:07.560 INFO:tasks.workunit.client.1.vm06.stdout:9/64: rename f6 to f14 0 2026-03-10T06:22:07.562 INFO:tasks.workunit.client.1.vm06.stdout:5/63: mknod d8/db/c19 0 2026-03-10T06:22:07.564 INFO:tasks.workunit.client.1.vm06.stdout:9/65: dread fb [0,4194304] 0 2026-03-10T06:22:07.564 INFO:tasks.workunit.client.1.vm06.stdout:8/55: creat d1/df/d11/f12 x:0 0 0 2026-03-10T06:22:07.566 INFO:tasks.workunit.client.1.vm06.stdout:9/66: dread ff [0,4194304] 0 2026-03-10T06:22:07.567 INFO:tasks.workunit.client.1.vm06.stdout:8/56: creat d1/f13 x:0 0 0 2026-03-10T06:22:07.567 INFO:tasks.workunit.client.1.vm06.stdout:8/57: chown d1/l6 5 1 2026-03-10T06:22:07.568 INFO:tasks.workunit.client.1.vm06.stdout:5/64: creat d8/f1a x:0 0 0 2026-03-10T06:22:07.569 INFO:tasks.workunit.client.1.vm06.stdout:5/65: read - d8/db/f15 zero size 2026-03-10T06:22:07.577 INFO:tasks.workunit.client.1.vm06.stdout:8/58: dwrite d1/f4 [0,4194304] 0 2026-03-10T06:22:07.578 INFO:tasks.workunit.client.1.vm06.stdout:5/66: symlink d8/l1b 0 2026-03-10T06:22:07.579 INFO:tasks.workunit.client.1.vm06.stdout:8/59: rename d1/d7/db to d1/d7/db/d14 22 2026-03-10T06:22:07.579 INFO:tasks.workunit.client.1.vm06.stdout:8/60: stat d1/l3 0 2026-03-10T06:22:07.585 INFO:tasks.workunit.client.1.vm06.stdout:5/67: 
symlink d8/d12/l1c 0 2026-03-10T06:22:07.597 INFO:tasks.workunit.client.1.vm06.stdout:5/68: symlink d8/l1d 0 2026-03-10T06:22:07.597 INFO:tasks.workunit.client.1.vm06.stdout:5/69: dwrite d8/d9/f11 [0,4194304] 0 2026-03-10T06:22:07.597 INFO:tasks.workunit.client.1.vm06.stdout:5/70: fdatasync f5 0 2026-03-10T06:22:07.647 INFO:tasks.workunit.client.1.vm06.stdout:3/81: fdatasync d6/f1c 0 2026-03-10T06:22:07.648 INFO:tasks.workunit.client.1.vm06.stdout:3/82: chown d6/d21 886306 1 2026-03-10T06:22:07.649 INFO:tasks.workunit.client.1.vm06.stdout:4/85: fdatasync dd/fe 0 2026-03-10T06:22:07.651 INFO:tasks.workunit.client.1.vm06.stdout:3/83: dwrite d6/f1c [0,4194304] 0 2026-03-10T06:22:07.653 INFO:tasks.workunit.client.1.vm06.stdout:4/86: dread dd/f12 [0,4194304] 0 2026-03-10T06:22:07.653 INFO:tasks.workunit.client.1.vm06.stdout:4/87: write f6 [1297680,50801] 0 2026-03-10T06:22:07.661 INFO:tasks.workunit.client.1.vm06.stdout:7/102: getdents d19 0 2026-03-10T06:22:07.665 INFO:tasks.workunit.client.1.vm06.stdout:1/66: dread f5 [0,4194304] 0 2026-03-10T06:22:07.666 INFO:tasks.workunit.client.1.vm06.stdout:1/67: read f5 [206421,51178] 0 2026-03-10T06:22:07.688 INFO:tasks.workunit.client.1.vm06.stdout:6/73: dwrite f3 [0,4194304] 0 2026-03-10T06:22:07.690 INFO:tasks.workunit.client.1.vm06.stdout:0/70: write d0/f5 [1422837,40349] 0 2026-03-10T06:22:07.702 INFO:tasks.workunit.client.1.vm06.stdout:9/67: getdents . 
0 2026-03-10T06:22:07.702 INFO:tasks.workunit.client.1.vm06.stdout:9/68: write f12 [573319,33405] 0 2026-03-10T06:22:07.703 INFO:tasks.workunit.client.1.vm06.stdout:9/69: chown fc 9545 1 2026-03-10T06:22:07.704 INFO:tasks.workunit.client.1.vm06.stdout:8/61: getdents d1 0 2026-03-10T06:22:07.705 INFO:tasks.workunit.client.1.vm06.stdout:5/71: rename d8/d12 to d8/d9/d1e 0 2026-03-10T06:22:07.709 INFO:tasks.workunit.client.1.vm06.stdout:5/72: dwrite f7 [0,4194304] 0 2026-03-10T06:22:07.715 INFO:tasks.workunit.client.1.vm06.stdout:5/73: dwrite d8/d9/f11 [4194304,4194304] 0 2026-03-10T06:22:07.758 INFO:tasks.workunit.client.1.vm06.stdout:2/61: write f5 [2797415,93455] 0 2026-03-10T06:22:07.758 INFO:tasks.workunit.client.1.vm06.stdout:2/62: chown da 850 1 2026-03-10T06:22:07.764 INFO:tasks.workunit.client.1.vm06.stdout:3/84: creat d6/d8/f22 x:0 0 0 2026-03-10T06:22:07.767 INFO:tasks.workunit.client.1.vm06.stdout:4/88: symlink dd/d18/l1a 0 2026-03-10T06:22:07.770 INFO:tasks.workunit.client.1.vm06.stdout:7/103: creat d19/f22 x:0 0 0 2026-03-10T06:22:07.773 INFO:tasks.workunit.client.1.vm06.stdout:1/68: write f5 [676079,94130] 0 2026-03-10T06:22:07.776 INFO:tasks.workunit.client.1.vm06.stdout:1/69: dread d9/fb [0,4194304] 0 2026-03-10T06:22:07.777 INFO:tasks.workunit.client.1.vm06.stdout:6/74: creat d6/d7/f16 x:0 0 0 2026-03-10T06:22:07.778 INFO:tasks.workunit.client.1.vm06.stdout:6/75: chown d6/ca 2 1 2026-03-10T06:22:07.779 INFO:tasks.workunit.client.1.vm06.stdout:9/70: symlink l15 0 2026-03-10T06:22:07.779 INFO:tasks.workunit.client.1.vm06.stdout:8/62: rmdir d1/df 39 2026-03-10T06:22:07.782 INFO:tasks.workunit.client.1.vm06.stdout:7/104: sync 2026-03-10T06:22:07.784 INFO:tasks.workunit.client.1.vm06.stdout:0/71: rename d0/d16 to d0/dd/d14/d1d 0 2026-03-10T06:22:07.794 INFO:tasks.workunit.client.1.vm06.stdout:5/74: rename d8/db/f15 to d8/db/f1f 0 2026-03-10T06:22:07.797 INFO:tasks.workunit.client.1.vm06.stdout:2/63: mknod da/d13/c18 0 2026-03-10T06:22:07.800 
INFO:tasks.workunit.client.1.vm06.stdout:3/85: write d6/d1a/f1f [94518,16954] 0 2026-03-10T06:22:07.801 INFO:tasks.workunit.client.1.vm06.stdout:3/86: stat d6/d8/f22 0 2026-03-10T06:22:07.803 INFO:tasks.workunit.client.1.vm06.stdout:4/89: symlink dd/l1b 0 2026-03-10T06:22:07.808 INFO:tasks.workunit.client.1.vm06.stdout:1/70: symlink d9/ld 0 2026-03-10T06:22:07.819 INFO:tasks.workunit.client.1.vm06.stdout:6/76: dwrite d6/d7/fb [0,4194304] 0 2026-03-10T06:22:07.820 INFO:tasks.workunit.client.1.vm06.stdout:6/77: readlink d6/df/l10 0 2026-03-10T06:22:07.820 INFO:tasks.workunit.client.1.vm06.stdout:6/78: dread - d6/d7/f16 zero size 2026-03-10T06:22:07.820 INFO:tasks.workunit.client.1.vm06.stdout:7/105: creat d19/f23 x:0 0 0 2026-03-10T06:22:07.823 INFO:tasks.workunit.client.1.vm06.stdout:7/106: dread f13 [0,4194304] 0 2026-03-10T06:22:07.828 INFO:tasks.workunit.client.1.vm06.stdout:5/75: write d8/d9/f11 [1287035,41435] 0 2026-03-10T06:22:07.831 INFO:tasks.workunit.client.1.vm06.stdout:2/64: creat da/f19 x:0 0 0 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:2/65: dread - f9 zero size 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:3/87: symlink d6/d8/l23 0 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:2/66: write da/f19 [563501,114767] 0 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:1/71: dread d9/fb [0,4194304] 0 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:9/71: unlink c8 0 2026-03-10T06:22:07.842 INFO:tasks.workunit.client.1.vm06.stdout:9/72: rmdir - no directory 2026-03-10T06:22:07.845 INFO:tasks.workunit.client.1.vm06.stdout:3/88: sync 2026-03-10T06:22:07.845 INFO:tasks.workunit.client.1.vm06.stdout:7/107: dwrite d19/f20 [0,4194304] 0 2026-03-10T06:22:07.846 INFO:tasks.workunit.client.1.vm06.stdout:3/89: write d6/f1b [858484,92545] 0 2026-03-10T06:22:07.847 INFO:tasks.workunit.client.1.vm06.stdout:3/90: write d6/d1a/f1f [1877862,53716] 0 2026-03-10T06:22:07.849 
INFO:tasks.workunit.client.1.vm06.stdout:3/91: sync 2026-03-10T06:22:07.849 INFO:tasks.workunit.client.1.vm06.stdout:2/67: mkdir da/d13/d1a 0 2026-03-10T06:22:07.849 INFO:tasks.workunit.client.1.vm06.stdout:2/68: fsync f5 0 2026-03-10T06:22:07.849 INFO:tasks.workunit.client.1.vm06.stdout:3/92: write d6/f1c [1726469,109504] 0 2026-03-10T06:22:07.854 INFO:tasks.workunit.client.1.vm06.stdout:9/73: mknod c16 0 2026-03-10T06:22:07.858 INFO:tasks.workunit.client.1.vm06.stdout:6/79: mknod d6/c17 0 2026-03-10T06:22:07.859 INFO:tasks.workunit.client.1.vm06.stdout:1/72: dwrite d9/fb [0,4194304] 0 2026-03-10T06:22:07.859 INFO:tasks.workunit.client.1.vm06.stdout:1/73: chown d9/ld 8168669 1 2026-03-10T06:22:07.860 INFO:tasks.workunit.client.1.vm06.stdout:5/76: mkdir d8/d20 0 2026-03-10T06:22:07.862 INFO:tasks.workunit.client.1.vm06.stdout:7/108: creat d19/f24 x:0 0 0 2026-03-10T06:22:07.862 INFO:tasks.workunit.client.1.vm06.stdout:7/109: readlink l8 0 2026-03-10T06:22:07.867 INFO:tasks.workunit.client.1.vm06.stdout:5/77: dread d8/d9/f14 [0,4194304] 0 2026-03-10T06:22:07.869 INFO:tasks.workunit.client.1.vm06.stdout:2/69: mknod da/d13/c1b 0 2026-03-10T06:22:07.869 INFO:tasks.workunit.client.1.vm06.stdout:7/110: dwrite d19/f1a [0,4194304] 0 2026-03-10T06:22:07.870 INFO:tasks.workunit.client.1.vm06.stdout:2/70: read - da/ff zero size 2026-03-10T06:22:07.870 INFO:tasks.workunit.client.1.vm06.stdout:2/71: write f9 [408496,104168] 0 2026-03-10T06:22:07.870 INFO:tasks.workunit.client.1.vm06.stdout:7/111: stat d19/f20 0 2026-03-10T06:22:07.874 INFO:tasks.workunit.client.1.vm06.stdout:6/80: rmdir d6/d11 39 2026-03-10T06:22:07.875 INFO:tasks.workunit.client.1.vm06.stdout:6/81: truncate f3 4563319 0 2026-03-10T06:22:07.886 INFO:tasks.workunit.client.1.vm06.stdout:9/74: dread f12 [0,4194304] 0 2026-03-10T06:22:07.886 INFO:tasks.workunit.client.1.vm06.stdout:9/75: fdatasync f10 0 2026-03-10T06:22:07.887 INFO:tasks.workunit.client.1.vm06.stdout:2/72: mkdir da/d13/d1c 0 2026-03-10T06:22:07.888 
INFO:tasks.workunit.client.1.vm06.stdout:2/73: write f9 [861304,107180] 0 2026-03-10T06:22:07.892 INFO:tasks.workunit.client.1.vm06.stdout:4/90: getdents dd 0 2026-03-10T06:22:07.899 INFO:tasks.workunit.client.1.vm06.stdout:1/74: link f5 d9/fe 0 2026-03-10T06:22:07.910 INFO:tasks.workunit.client.1.vm06.stdout:9/76: creat f17 x:0 0 0 2026-03-10T06:22:07.910 INFO:tasks.workunit.client.1.vm06.stdout:9/77: chown fd 1016311 1 2026-03-10T06:22:07.910 INFO:tasks.workunit.client.1.vm06.stdout:9/78: truncate f11 2029739 0 2026-03-10T06:22:07.911 INFO:tasks.workunit.client.1.vm06.stdout:7/112: link d19/f20 d19/f25 0 2026-03-10T06:22:07.911 INFO:tasks.workunit.client.1.vm06.stdout:7/113: chown d19/f22 7 1 2026-03-10T06:22:07.911 INFO:tasks.workunit.client.1.vm06.stdout:7/114: readlink d19/l1f 0 2026-03-10T06:22:07.912 INFO:tasks.workunit.client.1.vm06.stdout:4/91: rename dd/l16 to dd/d18/l1c 0 2026-03-10T06:22:07.912 INFO:tasks.workunit.client.1.vm06.stdout:4/92: fsync dd/f11 0 2026-03-10T06:22:07.917 INFO:tasks.workunit.client.1.vm06.stdout:1/75: dwrite d9/fe [0,4194304] 0 2026-03-10T06:22:07.917 INFO:tasks.workunit.client.1.vm06.stdout:1/76: chown l3 23797 1 2026-03-10T06:22:07.919 INFO:tasks.workunit.client.1.vm06.stdout:1/77: dread d9/fe [0,4194304] 0 2026-03-10T06:22:07.919 INFO:tasks.workunit.client.1.vm06.stdout:3/93: getdents d6/d1a 0 2026-03-10T06:22:07.919 INFO:tasks.workunit.client.1.vm06.stdout:3/94: chown d6 853391528 1 2026-03-10T06:22:07.924 INFO:tasks.workunit.client.1.vm06.stdout:2/74: mkdir da/d13/d1c/d1d 0 2026-03-10T06:22:07.925 INFO:tasks.workunit.client.1.vm06.stdout:2/75: write da/f17 [721194,60533] 0 2026-03-10T06:22:07.930 INFO:tasks.workunit.client.1.vm06.stdout:9/79: dwrite f14 [0,4194304] 0 2026-03-10T06:22:07.930 INFO:tasks.workunit.client.1.vm06.stdout:2/76: write da/f17 [1620368,130183] 0 2026-03-10T06:22:07.938 INFO:tasks.workunit.client.1.vm06.stdout:8/63: truncate d1/f5 2034902 0 2026-03-10T06:22:07.944 
INFO:tasks.workunit.client.1.vm06.stdout:0/72: dwrite d0/f19 [0,4194304] 0 2026-03-10T06:22:07.948 INFO:tasks.workunit.client.1.vm06.stdout:6/82: dwrite d6/fc [0,4194304] 0 2026-03-10T06:22:07.951 INFO:tasks.workunit.client.1.vm06.stdout:1/78: mkdir d9/df 0 2026-03-10T06:22:07.955 INFO:tasks.workunit.client.1.vm06.stdout:7/115: link d19/f23 d19/f26 0 2026-03-10T06:22:07.959 INFO:tasks.workunit.client.1.vm06.stdout:1/79: dwrite d9/fb [0,4194304] 0 2026-03-10T06:22:07.959 INFO:tasks.workunit.client.1.vm06.stdout:0/73: dread - d0/ff zero size 2026-03-10T06:22:07.964 INFO:tasks.workunit.client.1.vm06.stdout:7/116: unlink d19/f22 0 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:1/80: rmdir d9 39 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:8/64: fdatasync d1/f5 0 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:6/83: creat d6/dd/f18 x:0 0 0 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:6/84: chown d6/d7/fb 0 1 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:7/117: creat d19/f27 x:0 0 0 2026-03-10T06:22:07.968 INFO:tasks.workunit.client.1.vm06.stdout:6/85: stat d6/df/c15 0 2026-03-10T06:22:07.969 INFO:tasks.workunit.client.1.vm06.stdout:7/118: creat d19/f28 x:0 0 0 2026-03-10T06:22:07.969 INFO:tasks.workunit.client.1.vm06.stdout:0/74: dread d0/db/f12 [0,4194304] 0 2026-03-10T06:22:07.970 INFO:tasks.workunit.client.1.vm06.stdout:7/119: dread - d19/f1d zero size 2026-03-10T06:22:07.974 INFO:tasks.workunit.client.1.vm06.stdout:1/81: creat d9/df/f10 x:0 0 0 2026-03-10T06:22:07.975 INFO:tasks.workunit.client.1.vm06.stdout:7/120: unlink f14 0 2026-03-10T06:22:07.981 INFO:tasks.workunit.client.1.vm06.stdout:7/121: dread - d19/f26 zero size 2026-03-10T06:22:07.981 INFO:tasks.workunit.client.1.vm06.stdout:6/86: dwrite f1 [4194304,4194304] 0 2026-03-10T06:22:07.981 INFO:tasks.workunit.client.1.vm06.stdout:6/87: fdatasync d6/d7/fb 0 2026-03-10T06:22:07.981 
INFO:tasks.workunit.client.1.vm06.stdout:8/65: link d1/l6 d1/d7/l15 0 2026-03-10T06:22:07.981 INFO:tasks.workunit.client.1.vm06.stdout:7/122: chown d19/f20 135 1 2026-03-10T06:22:07.982 INFO:tasks.workunit.client.1.vm06.stdout:8/66: mknod d1/d7/db/c16 0 2026-03-10T06:22:07.982 INFO:tasks.workunit.client.1.vm06.stdout:0/75: getdents d0/dd/d1b 0 2026-03-10T06:22:07.984 INFO:tasks.workunit.client.1.vm06.stdout:7/123: dwrite d19/f28 [0,4194304] 0 2026-03-10T06:22:07.987 INFO:tasks.workunit.client.1.vm06.stdout:7/124: dwrite d19/f28 [0,4194304] 0 2026-03-10T06:22:07.993 INFO:tasks.workunit.client.1.vm06.stdout:6/88: rename c2 to d6/c19 0 2026-03-10T06:22:07.993 INFO:tasks.workunit.client.1.vm06.stdout:5/78: truncate d8/d9/f11 4419328 0 2026-03-10T06:22:07.994 INFO:tasks.workunit.client.1.vm06.stdout:6/89: dread - d6/d7/f16 zero size 2026-03-10T06:22:07.996 INFO:tasks.workunit.client.1.vm06.stdout:8/67: symlink d1/df/d11/l17 0 2026-03-10T06:22:07.996 INFO:tasks.workunit.client.1.vm06.stdout:6/90: write f1 [552842,35875] 0 2026-03-10T06:22:08.005 INFO:tasks.workunit.client.1.vm06.stdout:5/79: mknod d8/d9/c21 0 2026-03-10T06:22:08.018 INFO:tasks.workunit.client.1.vm06.stdout:5/80: mkdir d8/d20/d22 0 2026-03-10T06:22:08.018 INFO:tasks.workunit.client.1.vm06.stdout:8/68: dwrite d1/fa [4194304,4194304] 0 2026-03-10T06:22:08.020 INFO:tasks.workunit.client.1.vm06.stdout:6/91: dwrite f1 [8388608,4194304] 0 2026-03-10T06:22:08.022 INFO:tasks.workunit.client.1.vm06.stdout:6/92: dread - d6/dd/f18 zero size 2026-03-10T06:22:08.024 INFO:tasks.workunit.client.1.vm06.stdout:3/95: dread d6/f1b [0,4194304] 0 2026-03-10T06:22:08.026 INFO:tasks.workunit.client.1.vm06.stdout:8/69: dwrite d1/df/d11/f12 [0,4194304] 0 2026-03-10T06:22:08.035 INFO:tasks.workunit.client.1.vm06.stdout:6/93: creat d6/d7/f1a x:0 0 0 2026-03-10T06:22:08.043 INFO:tasks.workunit.client.1.vm06.stdout:6/94: dwrite f1 [4194304,4194304] 0 2026-03-10T06:22:08.043 INFO:tasks.workunit.client.1.vm06.stdout:3/96: symlink 
d6/d21/l24 0 2026-03-10T06:22:08.044 INFO:tasks.workunit.client.1.vm06.stdout:8/70: creat d1/f18 x:0 0 0 2026-03-10T06:22:08.047 INFO:tasks.workunit.client.1.vm06.stdout:3/97: rmdir d6/dc/d13 39 2026-03-10T06:22:08.048 INFO:tasks.workunit.client.1.vm06.stdout:8/71: dwrite d1/d7/db/fe [0,4194304] 0 2026-03-10T06:22:08.055 INFO:tasks.workunit.client.1.vm06.stdout:6/95: creat d6/f1b x:0 0 0 2026-03-10T06:22:08.056 INFO:tasks.workunit.client.1.vm06.stdout:3/98: creat d6/f25 x:0 0 0 2026-03-10T06:22:08.057 INFO:tasks.workunit.client.1.vm06.stdout:8/72: creat d1/d7/db/f19 x:0 0 0 2026-03-10T06:22:08.060 INFO:tasks.workunit.client.1.vm06.stdout:3/99: dwrite d6/d8/f22 [0,4194304] 0 2026-03-10T06:22:08.071 INFO:tasks.workunit.client.1.vm06.stdout:8/73: symlink d1/d7/l1a 0 2026-03-10T06:22:08.072 INFO:tasks.workunit.client.1.vm06.stdout:3/100: creat d6/d21/f26 x:0 0 0 2026-03-10T06:22:08.072 INFO:tasks.workunit.client.1.vm06.stdout:8/74: write d1/d7/f9 [716947,97911] 0 2026-03-10T06:22:08.074 INFO:tasks.workunit.client.1.vm06.stdout:8/75: unlink d1/l3 0 2026-03-10T06:22:08.075 INFO:tasks.workunit.client.1.vm06.stdout:3/101: mknod d6/dc/d13/c27 0 2026-03-10T06:22:08.076 INFO:tasks.workunit.client.1.vm06.stdout:3/102: symlink d6/l28 0 2026-03-10T06:22:08.079 INFO:tasks.workunit.client.1.vm06.stdout:8/76: creat d1/f1b x:0 0 0 2026-03-10T06:22:08.079 INFO:tasks.workunit.client.1.vm06.stdout:8/77: fsync d1/f13 0 2026-03-10T06:22:08.082 INFO:tasks.workunit.client.1.vm06.stdout:8/78: creat d1/f1c x:0 0 0 2026-03-10T06:22:08.087 INFO:tasks.workunit.client.1.vm06.stdout:8/79: write d1/d7/db/f19 [289046,19021] 0 2026-03-10T06:22:08.087 INFO:tasks.workunit.client.1.vm06.stdout:8/80: getdents d1/df 0 2026-03-10T06:22:08.102 INFO:tasks.workunit.client.1.vm06.stdout:2/77: fdatasync da/f17 0 2026-03-10T06:22:08.102 INFO:tasks.workunit.client.1.vm06.stdout:2/78: fsync f7 0 2026-03-10T06:22:08.102 INFO:tasks.workunit.client.1.vm06.stdout:2/79: chown da/d13 1386 1 2026-03-10T06:22:08.174 
INFO:tasks.workunit.client.1.vm06.stdout:4/93: write fc [2351572,104246] 0 2026-03-10T06:22:08.174 INFO:tasks.workunit.client.1.vm06.stdout:4/94: dread - dd/f11 zero size 2026-03-10T06:22:08.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: Activating manager daemon vm04.exdvdb 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/crt"}]: dispatch 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/key"}]: dispatch 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: from='mgr.? 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T06:22:08.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:07 vm04.local ceph-mon[51058]: mgrmap e27: vm04.exdvdb(active, starting, since 0.0373559s) 2026-03-10T06:22:08.178 INFO:tasks.workunit.client.1.vm06.stdout:4/95: dwrite f0 [0,4194304] 0 2026-03-10T06:22:08.187 INFO:tasks.workunit.client.1.vm06.stdout:4/96: dwrite f8 [8388608,4194304] 0 2026-03-10T06:22:08.189 INFO:tasks.workunit.client.1.vm06.stdout:4/97: creat dd/d18/f1d x:0 0 0 2026-03-10T06:22:08.194 INFO:tasks.workunit.client.1.vm06.stdout:4/98: dwrite dd/ff [0,4194304] 0 2026-03-10T06:22:08.194 INFO:tasks.workunit.client.1.vm06.stdout:4/99: write dd/fe [1400902,130280] 0 2026-03-10T06:22:08.197 INFO:tasks.workunit.client.1.vm06.stdout:4/100: creat dd/d18/f1e x:0 0 0 2026-03-10T06:22:08.198 INFO:tasks.workunit.client.1.vm06.stdout:8/81: dread d1/d7/f9 [0,4194304] 0 2026-03-10T06:22:08.198 INFO:tasks.workunit.client.1.vm06.stdout:8/82: chown d1/d7/f9 10964829 1 2026-03-10T06:22:08.199 INFO:tasks.workunit.client.1.vm06.stdout:8/83: chown d1/df/d11/l17 5987566 1 2026-03-10T06:22:08.199 INFO:tasks.workunit.client.1.vm06.stdout:8/84: write d1/f4 [3987101,62771] 0 2026-03-10T06:22:08.201 INFO:tasks.workunit.client.1.vm06.stdout:8/85: creat d1/df/d11/f1d x:0 0 0 2026-03-10T06:22:08.202 INFO:tasks.workunit.client.1.vm06.stdout:8/86: dread d1/d7/f9 [0,4194304] 0 2026-03-10T06:22:08.207 INFO:tasks.workunit.client.1.vm06.stdout:4/101: dread fc [0,4194304] 0 2026-03-10T06:22:08.214 INFO:tasks.workunit.client.1.vm06.stdout:4/102: chown c7 929 1 2026-03-10T06:22:08.215 INFO:tasks.workunit.client.1.vm06.stdout:4/103: readlink dd/d18/l1c 0 2026-03-10T06:22:08.216 INFO:tasks.workunit.client.1.vm06.stdout:4/104: write dd/f14 
[4877570,7570] 0 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: Activating manager daemon vm04.exdvdb 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/crt"}]: dispatch 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.exdvdb/key"}]: dispatch 2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: from='mgr.? 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: osdmap e42: 6 total, 6 up, 6 in
2026-03-10T06:22:08.277 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:07 vm06.local ceph-mon[58974]: mgrmap e27: vm04.exdvdb(active, starting, since 0.0373559s)
2026-03-10T06:22:08.312 INFO:tasks.workunit.client.1.vm06.stdout:9/80: dwrite fe [0,4194304] 0
2026-03-10T06:22:08.316 INFO:tasks.workunit.client.1.vm06.stdout:9/81: dread fe [0,4194304] 0
2026-03-10T06:22:08.361 INFO:tasks.workunit.client.1.vm06.stdout:0/76: write d0/db/f12 [4403422,80733] 0
2026-03-10T06:22:08.362 INFO:tasks.workunit.client.1.vm06.stdout:0/77: dread - d0/ff zero size
2026-03-10T06:22:08.364 INFO:tasks.workunit.client.1.vm06.stdout:0/78: creat d0/dd/d14/d1d/f1e x:0 0 0
2026-03-10T06:22:08.366 INFO:tasks.workunit.client.1.vm06.stdout:1/82: rmdir d9/df 39
2026-03-10T06:22:08.368 INFO:tasks.workunit.client.1.vm06.stdout:0/79: creat d0/dd/d14/f1f x:0 0 0
2026-03-10T06:22:08.369 INFO:tasks.workunit.client.1.vm06.stdout:0/80: chown d0/dd/d14/d1d/f1e 2734807 1
2026-03-10T06:22:08.369 INFO:tasks.workunit.client.1.vm06.stdout:0/81: write d0/f19 [947540,74127] 0
2026-03-10T06:22:08.372 INFO:tasks.workunit.client.1.vm06.stdout:1/83: creat d9/f11 x:0 0 0
2026-03-10T06:22:08.374 INFO:tasks.workunit.client.1.vm06.stdout:1/84: write d9/df/f10 [19333,127736] 0
2026-03-10T06:22:08.375 INFO:tasks.workunit.client.1.vm06.stdout:0/82: dwrite d0/f5 [0,4194304] 0
2026-03-10T06:22:08.385 INFO:tasks.workunit.client.1.vm06.stdout:0/83: dread d0/db/f12 [0,4194304] 0
2026-03-10T06:22:08.387 INFO:tasks.workunit.client.1.vm06.stdout:1/85: dwrite d9/fe [0,4194304] 0
2026-03-10T06:22:08.387 INFO:tasks.workunit.client.1.vm06.stdout:0/84: dread - d0/dd/f11 zero size
2026-03-10T06:22:08.387 INFO:tasks.workunit.client.1.vm06.stdout:0/85: chown d0/db/ce 636 1
2026-03-10T06:22:08.389 INFO:tasks.workunit.client.1.vm06.stdout:0/86: unlink d0/dd/d14/f1f 0
2026-03-10T06:22:08.390 INFO:tasks.workunit.client.1.vm06.stdout:0/87: fdatasync d0/f19 0
2026-03-10T06:22:08.396 INFO:tasks.workunit.client.1.vm06.stdout:1/86: dread d9/df/f10 [0,4194304] 0
2026-03-10T06:22:08.398 INFO:tasks.workunit.client.1.vm06.stdout:1/87: write f5 [2943400,77426] 0
2026-03-10T06:22:08.398 INFO:tasks.workunit.client.1.vm06.stdout:0/88: dwrite d0/f5 [0,4194304] 0
2026-03-10T06:22:08.401 INFO:tasks.workunit.client.1.vm06.stdout:1/88: chown d9/df/f10 10170766 1
2026-03-10T06:22:08.405 INFO:tasks.workunit.client.1.vm06.stdout:0/89: mknod d0/dd/d1c/c20 0
2026-03-10T06:22:08.409 INFO:tasks.workunit.client.1.vm06.stdout:0/90: dwrite d0/f19 [0,4194304] 0
2026-03-10T06:22:08.417 INFO:tasks.workunit.client.1.vm06.stdout:0/91: chown d0/ff 1934 1
2026-03-10T06:22:08.417 INFO:tasks.workunit.client.1.vm06.stdout:0/92: dwrite d0/f9 [0,4194304] 0
2026-03-10T06:22:08.421 INFO:tasks.workunit.client.1.vm06.stdout:0/93: fdatasync d0/db/f12 0
2026-03-10T06:22:08.423 INFO:tasks.workunit.client.1.vm06.stdout:0/94: dread d0/f9 [0,4194304] 0
2026-03-10T06:22:08.423 INFO:tasks.workunit.client.1.vm06.stdout:0/95: chown d0/dd/d14/d18 31369946 1
2026-03-10T06:22:08.476 INFO:tasks.workunit.client.1.vm06.stdout:0/96: sync
2026-03-10T06:22:08.478 INFO:tasks.workunit.client.1.vm06.stdout:0/97: mknod d0/dd/d1c/c21 0
2026-03-10T06:22:08.480 INFO:tasks.workunit.client.1.vm06.stdout:0/98: link d0/f9 d0/dd/d14/d18/f22 0
2026-03-10T06:22:08.481 INFO:tasks.workunit.client.1.vm06.stdout:0/99: symlink d0/dd/d14/d1d/l23 0
2026-03-10T06:22:08.482 INFO:tasks.workunit.client.1.vm06.stdout:0/100: creat d0/dd/f24 x:0 0 0
2026-03-10T06:22:08.484 INFO:tasks.workunit.client.1.vm06.stdout:9/82: symlink l18 0
2026-03-10T06:22:08.484 INFO:tasks.workunit.client.1.vm06.stdout:0/101: symlink d0/db/l25 0
2026-03-10T06:22:08.485 INFO:tasks.workunit.client.1.vm06.stdout:6/96: rename d6/c19 to d6/df/c1c 0
2026-03-10T06:22:08.486 INFO:tasks.workunit.client.1.vm06.stdout:6/97: chown d6/dd/l14 47210 1
2026-03-10T06:22:08.486 INFO:tasks.workunit.client.1.vm06.stdout:9/83: chown fb 123 1
2026-03-10T06:22:08.486 INFO:tasks.workunit.client.1.vm06.stdout:9/84: rmdir - no directory
2026-03-10T06:22:08.487 INFO:tasks.workunit.client.1.vm06.stdout:2/80: rename c6 to da/c1e 0
2026-03-10T06:22:08.488 INFO:tasks.workunit.client.1.vm06.stdout:0/102: symlink d0/dd/d1b/l26 0
2026-03-10T06:22:08.489 INFO:tasks.workunit.client.1.vm06.stdout:9/85: chown fd 0 1
2026-03-10T06:22:08.489 INFO:tasks.workunit.client.1.vm06.stdout:8/87: rename d1/d7/db/c10 to d1/d7/c1e 0
2026-03-10T06:22:08.493 INFO:tasks.workunit.client.1.vm06.stdout:0/103: truncate d0/dd/f11 74039 0
2026-03-10T06:22:08.493 INFO:tasks.workunit.client.1.vm06.stdout:6/98: dwrite f3 [0,4194304] 0
2026-03-10T06:22:08.493 INFO:tasks.workunit.client.1.vm06.stdout:1/89: rename l3 to d9/df/l12 0
2026-03-10T06:22:08.494 INFO:tasks.workunit.client.1.vm06.stdout:6/99: write f1 [1051566,66837] 0
2026-03-10T06:22:08.494 INFO:tasks.workunit.client.1.vm06.stdout:1/90: chown d9/cc 1 1
2026-03-10T06:22:08.496 INFO:tasks.workunit.client.1.vm06.stdout:8/88: symlink d1/l1f 0
2026-03-10T06:22:08.496 INFO:tasks.workunit.client.1.vm06.stdout:8/89: write d1/d7/db/fe [2945166,12848] 0
2026-03-10T06:22:08.498 INFO:tasks.workunit.client.1.vm06.stdout:8/90: write d1/df/d11/f1d [616658,115560] 0
2026-03-10T06:22:08.499 INFO:tasks.workunit.client.1.vm06.stdout:2/81: rename da/f17 to da/d13/f1f 0
2026-03-10T06:22:08.500 INFO:tasks.workunit.client.1.vm06.stdout:8/91: fsync d1/d7/fd 0
2026-03-10T06:22:08.501 INFO:tasks.workunit.client.1.vm06.stdout:8/92: readlink d1/df/d11/l17 0
2026-03-10T06:22:08.501 INFO:tasks.workunit.client.1.vm06.stdout:6/100: sync
2026-03-10T06:22:08.505 INFO:tasks.workunit.client.1.vm06.stdout:6/101: dwrite d6/d7/f1a [0,4194304] 0
2026-03-10T06:22:08.510 INFO:tasks.workunit.client.1.vm06.stdout:2/82: mknod da/c20 0
2026-03-10T06:22:08.511 INFO:tasks.workunit.client.1.vm06.stdout:0/104: dread d0/f19 [0,4194304] 0
2026-03-10T06:22:08.514 INFO:tasks.workunit.client.1.vm06.stdout:2/83: link da/fe da/d13/d1a/f21 0
2026-03-10T06:22:08.514 INFO:tasks.workunit.client.1.vm06.stdout:6/102: getdents d6/dd 0
2026-03-10T06:22:08.522 INFO:tasks.workunit.client.1.vm06.stdout:2/84: rmdir da/d13/d1a 39
2026-03-10T06:22:08.522 INFO:tasks.workunit.client.1.vm06.stdout:2/85: chown f8 211847 1
2026-03-10T06:22:08.522 INFO:tasks.workunit.client.1.vm06.stdout:2/86: mknod da/d13/d1c/d1d/c22 0
2026-03-10T06:22:08.522 INFO:tasks.workunit.client.1.vm06.stdout:2/87: write f5 [2043643,7380] 0
2026-03-10T06:22:08.522 INFO:tasks.workunit.client.1.vm06.stdout:2/88: dread - da/fe zero size
2026-03-10T06:22:08.524 INFO:tasks.workunit.client.1.vm06.stdout:6/103: sync
2026-03-10T06:22:08.526 INFO:tasks.workunit.client.1.vm06.stdout:2/89: dread da/f19 [0,4194304] 0
2026-03-10T06:22:08.528 INFO:tasks.workunit.client.1.vm06.stdout:2/90: fdatasync da/d13/d1a/f21 0
2026-03-10T06:22:08.531 INFO:tasks.workunit.client.1.vm06.stdout:2/91: dwrite da/fe [0,4194304] 0
2026-03-10T06:22:08.536 INFO:tasks.workunit.client.1.vm06.stdout:2/92: chown da/d13/d1c/d1d/c22 1081984 1
2026-03-10T06:22:08.537 INFO:tasks.workunit.client.1.vm06.stdout:2/93: symlink da/d13/d1c/l23 0
2026-03-10T06:22:08.537 INFO:tasks.workunit.client.1.vm06.stdout:2/94: write f8 [1511457,51924] 0
2026-03-10T06:22:08.537 INFO:tasks.workunit.client.1.vm06.stdout:6/104: sync
2026-03-10T06:22:08.537 INFO:tasks.workunit.client.1.vm06.stdout:2/95: dread da/d13/d1a/f21 [0,4194304] 0
2026-03-10T06:22:08.537 INFO:tasks.workunit.client.1.vm06.stdout:6/105: dread - d6/d7/f16 zero size
2026-03-10T06:22:08.538 INFO:tasks.workunit.client.1.vm06.stdout:6/106: rename d6/df to d6/df/d1d 22
2026-03-10T06:22:08.549 INFO:tasks.workunit.client.1.vm06.stdout:6/107: creat d6/df/f1e x:0 0 0
2026-03-10T06:22:08.581 INFO:tasks.workunit.client.1.vm06.stdout:7/125: truncate d19/f28 4163128 0
2026-03-10T06:22:08.581 INFO:tasks.workunit.client.1.vm06.stdout:7/126: dread - d19/f27 zero size
2026-03-10T06:22:08.583 INFO:tasks.workunit.client.1.vm06.stdout:7/127: rename d19/l1b to d19/l29 0
2026-03-10T06:22:08.585 INFO:tasks.workunit.client.1.vm06.stdout:7/128: mknod d19/c2a 0
2026-03-10T06:22:08.588 INFO:tasks.workunit.client.1.vm06.stdout:7/129: creat d19/f2b x:0 0 0
2026-03-10T06:22:08.589 INFO:tasks.workunit.client.1.vm06.stdout:7/130: write d19/f2b [419254,118320] 0
2026-03-10T06:22:08.590 INFO:tasks.workunit.client.1.vm06.stdout:7/131: read fa [2345282,123654] 0
2026-03-10T06:22:08.592 INFO:tasks.workunit.client.1.vm06.stdout:2/96: fdatasync da/d13/d1a/f21 0
2026-03-10T06:22:08.593 INFO:tasks.workunit.client.1.vm06.stdout:7/132: sync
2026-03-10T06:22:08.594 INFO:tasks.workunit.client.1.vm06.stdout:7/133: symlink d19/l2c 0
2026-03-10T06:22:08.594 INFO:tasks.workunit.client.1.vm06.stdout:7/134: chown f13 10974953 1
2026-03-10T06:22:08.595 INFO:tasks.workunit.client.1.vm06.stdout:7/135: rename cc to d19/c2d 0
2026-03-10T06:22:08.596 INFO:tasks.workunit.client.1.vm06.stdout:7/136: mknod d19/c2e 0
2026-03-10T06:22:08.597 INFO:tasks.workunit.client.1.vm06.stdout:7/137: fsync d19/f23 0
2026-03-10T06:22:08.597 INFO:tasks.workunit.client.1.vm06.stdout:7/138: dread - d19/f27 zero size
2026-03-10T06:22:08.598 INFO:tasks.workunit.client.1.vm06.stdout:7/139: write fa [4208191,68843] 0
2026-03-10T06:22:08.600 INFO:tasks.workunit.client.1.vm06.stdout:7/140: creat d19/f2f x:0 0 0
2026-03-10T06:22:08.603 INFO:tasks.workunit.client.1.vm06.stdout:7/141: dwrite fa [0,4194304] 0
2026-03-10T06:22:08.606 INFO:tasks.workunit.client.1.vm06.stdout:7/142: creat d19/f30 x:0 0 0
2026-03-10T06:22:08.608 INFO:tasks.workunit.client.1.vm06.stdout:7/143: symlink d19/l31 0
2026-03-10T06:22:08.608 INFO:tasks.workunit.client.1.vm06.stdout:5/81: rmdir d8 39
2026-03-10T06:22:08.610 INFO:tasks.workunit.client.1.vm06.stdout:7/144: sync
2026-03-10T06:22:08.610 INFO:tasks.workunit.client.1.vm06.stdout:7/145: dread - d19/f23 zero size
2026-03-10T06:22:08.612 INFO:tasks.workunit.client.1.vm06.stdout:7/146: dread d19/f1a [0,4194304] 0
2026-03-10T06:22:08.614 INFO:tasks.workunit.client.1.vm06.stdout:5/82: stat d8/db/f18 0
2026-03-10T06:22:08.616 INFO:tasks.workunit.client.1.vm06.stdout:7/147: symlink d19/l32 0
2026-03-10T06:22:08.617 INFO:tasks.workunit.client.1.vm06.stdout:5/83: creat d8/d9/d1e/f23 x:0 0 0
2026-03-10T06:22:08.617 INFO:tasks.workunit.client.1.vm06.stdout:5/84: dread - d8/d9/d1e/f23 zero size
2026-03-10T06:22:08.620 INFO:tasks.workunit.client.1.vm06.stdout:7/148: rename d19/f2b to d19/f33 0
2026-03-10T06:22:08.621 INFO:tasks.workunit.client.1.vm06.stdout:7/149: stat d19/l2c 0
2026-03-10T06:22:08.621 INFO:tasks.workunit.client.1.vm06.stdout:7/150: chown d19/l32 3470 1
2026-03-10T06:22:08.621 INFO:tasks.workunit.client.1.vm06.stdout:5/85: getdents d8/d20 0
2026-03-10T06:22:08.623 INFO:tasks.workunit.client.1.vm06.stdout:7/151: link l17 d19/l34 0
2026-03-10T06:22:08.623 INFO:tasks.workunit.client.1.vm06.stdout:5/86: mknod d8/c24 0
2026-03-10T06:22:08.624 INFO:tasks.workunit.client.1.vm06.stdout:5/87: creat d8/db/f25 x:0 0 0
2026-03-10T06:22:08.624 INFO:tasks.workunit.client.1.vm06.stdout:5/88: read f7 [549026,115174] 0
2026-03-10T06:22:08.625 INFO:tasks.workunit.client.1.vm06.stdout:5/89: write d8/d9/d1e/f23 [621369,79180] 0
2026-03-10T06:22:08.627 INFO:tasks.workunit.client.1.vm06.stdout:7/152: link f15 d19/f35 0
2026-03-10T06:22:08.627 INFO:tasks.workunit.client.1.vm06.stdout:7/153: chown d19/c2a 11337064 1
2026-03-10T06:22:08.628 INFO:tasks.workunit.client.1.vm06.stdout:7/154: fsync f16 0
2026-03-10T06:22:08.630 INFO:tasks.workunit.client.1.vm06.stdout:7/155: rename cb to d19/c36 0
2026-03-10T06:22:08.634 INFO:tasks.workunit.client.1.vm06.stdout:7/156: dwrite f9 [0,4194304] 0
2026-03-10T06:22:08.634 INFO:tasks.workunit.client.1.vm06.stdout:7/157: read - d19/f35 zero size
2026-03-10T06:22:08.677 INFO:tasks.workunit.client.1.vm06.stdout:3/103: truncate d6/f1b 485254 0
2026-03-10T06:22:08.679 INFO:tasks.workunit.client.1.vm06.stdout:3/104: creat d6/f29 x:0 0 0
2026-03-10T06:22:08.684 INFO:tasks.workunit.client.1.vm06.stdout:3/105: mkdir d6/dc/d2a 0
2026-03-10T06:22:08.684 INFO:tasks.workunit.client.1.vm06.stdout:3/106: fsync d6/dc/d13/f17 0
2026-03-10T06:22:08.775 INFO:tasks.workunit.client.1.vm06.stdout:4/105: rmdir dd/d18 39
2026-03-10T06:22:08.777 INFO:tasks.workunit.client.1.vm06.stdout:4/106: creat dd/d18/f1f x:0 0 0
2026-03-10T06:22:08.864 INFO:tasks.workunit.client.1.vm06.stdout:0/105: truncate d0/f9 2301429 0
2026-03-10T06:22:08.864 INFO:tasks.workunit.client.1.vm06.stdout:0/106: dread - d0/dd/d14/d1d/f1e zero size
2026-03-10T06:22:08.867 INFO:tasks.workunit.client.1.vm06.stdout:0/107: dread d0/f19 [0,4194304] 0
2026-03-10T06:22:08.868 INFO:tasks.workunit.client.1.vm06.stdout:0/108: chown d0/dd/d14/d1d/f1e 10425643 1
2026-03-10T06:22:08.874 INFO:tasks.workunit.client.1.vm06.stdout:9/86: rename f10 to f19 0
2026-03-10T06:22:08.875 INFO:tasks.workunit.client.1.vm06.stdout:0/109: read d0/fa [357419,20807] 0
2026-03-10T06:22:08.875 INFO:tasks.workunit.client.1.vm06.stdout:0/110: fsync d0/ff 0
2026-03-10T06:22:08.880 INFO:tasks.workunit.client.1.vm06.stdout:0/111: mkdir d0/dd/d27 0
2026-03-10T06:22:08.886 INFO:tasks.workunit.client.1.vm06.stdout:0/112: chown d0/c15 3 1
2026-03-10T06:22:08.886 INFO:tasks.workunit.client.1.vm06.stdout:0/113: read - d0/ff zero size
2026-03-10T06:22:08.888 INFO:tasks.workunit.client.1.vm06.stdout:9/87: rename la to l1a 0
2026-03-10T06:22:08.888 INFO:tasks.workunit.client.1.vm06.stdout:0/114: creat d0/dd/d14/f28 x:0 0 0
2026-03-10T06:22:08.888 INFO:tasks.workunit.client.1.vm06.stdout:0/115: fdatasync d0/dd/f11 0
2026-03-10T06:22:08.889 INFO:tasks.workunit.client.1.vm06.stdout:0/116: chown d0/dd/d1c/c20 13319919 1
2026-03-10T06:22:08.895 INFO:tasks.workunit.client.1.vm06.stdout:1/91: rmdir d9 39
2026-03-10T06:22:08.897 INFO:tasks.workunit.client.1.vm06.stdout:0/117: mknod d0/dd/d14/d18/c29 0
2026-03-10T06:22:08.898 INFO:tasks.workunit.client.1.vm06.stdout:9/88: creat f1b x:0 0 0
2026-03-10T06:22:08.905 INFO:tasks.workunit.client.1.vm06.stdout:9/89: symlink l1c 0
2026-03-10T06:22:08.905 INFO:tasks.workunit.client.1.vm06.stdout:1/92: dwrite d9/df/f10 [0,4194304] 0
2026-03-10T06:22:08.907 INFO:tasks.workunit.client.1.vm06.stdout:1/93: dread - d9/f11 zero size
2026-03-10T06:22:08.914 INFO:tasks.workunit.client.1.vm06.stdout:8/93: truncate d1/f2 3235121 0
2026-03-10T06:22:08.914 INFO:tasks.workunit.client.1.vm06.stdout:8/94: write d1/f18 [549028,62883] 0
2026-03-10T06:22:08.915 INFO:tasks.workunit.client.1.vm06.stdout:8/95: write d1/f18 [429196,9738] 0
2026-03-10T06:22:08.919 INFO:tasks.workunit.client.1.vm06.stdout:8/96: dwrite d1/f13 [0,4194304] 0
2026-03-10T06:22:08.921 INFO:tasks.workunit.client.1.vm06.stdout:8/97: write d1/d7/db/fe [2659562,124016] 0
2026-03-10T06:22:08.924 INFO:tasks.workunit.client.1.vm06.stdout:2/97: dwrite da/fe [4194304,4194304] 0
2026-03-10T06:22:08.925 INFO:tasks.workunit.client.1.vm06.stdout:1/94: unlink d9/df/l12 0
2026-03-10T06:22:08.930 INFO:tasks.workunit.client.1.vm06.stdout:6/108: rmdir d6/df 39
2026-03-10T06:22:08.931 INFO:tasks.workunit.client.1.vm06.stdout:6/109: rename d6 to d6/dd/d1f 22
2026-03-10T06:22:08.934 INFO:tasks.workunit.client.1.vm06.stdout:2/98: dwrite da/d13/f16 [0,4194304] 0
2026-03-10T06:22:08.936 INFO:tasks.workunit.client.1.vm06.stdout:1/95: dread d9/fe [0,4194304] 0
2026-03-10T06:22:08.948 INFO:tasks.workunit.client.1.vm06.stdout:6/110: creat d6/d7/f20 x:0 0 0
2026-03-10T06:22:08.951 INFO:tasks.workunit.client.1.vm06.stdout:2/99: fsync da/f19 0
2026-03-10T06:22:08.960 INFO:tasks.workunit.client.1.vm06.stdout:9/90: rename c1 to c1d 0
2026-03-10T06:22:08.964 INFO:tasks.workunit.client.1.vm06.stdout:5/90: getdents d8/d9/d1e 0
2026-03-10T06:22:08.965 INFO:tasks.workunit.client.1.vm06.stdout:5/91: write d8/d9/d1e/f23 [1224073,96131] 0
2026-03-10T06:22:08.968 INFO:tasks.workunit.client.1.vm06.stdout:7/158: link d19/c2d d19/c37 0
2026-03-10T06:22:08.969 INFO:tasks.workunit.client.1.vm06.stdout:5/92: mknod d8/d9/d1e/c26 0
2026-03-10T06:22:08.970 INFO:tasks.workunit.client.1.vm06.stdout:9/91: symlink l1e 0
2026-03-10T06:22:08.974 INFO:tasks.workunit.client.1.vm06.stdout:2/100: getdents da/d13/d1c 0
2026-03-10T06:22:08.977 INFO:tasks.workunit.client.1.vm06.stdout:9/92: symlink l1f 0
2026-03-10T06:22:08.977 INFO:tasks.workunit.client.1.vm06.stdout:2/101: symlink da/d13/d1c/l24 0
2026-03-10T06:22:08.980 INFO:tasks.workunit.client.1.vm06.stdout:2/102: dread f8 [0,4194304] 0
2026-03-10T06:22:08.980 INFO:tasks.workunit.client.1.vm06.stdout:9/93: creat f20 x:0 0 0
2026-03-10T06:22:08.981 INFO:tasks.workunit.client.1.vm06.stdout:2/103: read f8 [1480444,77106] 0
2026-03-10T06:22:08.981 INFO:tasks.workunit.client.1.vm06.stdout:2/104: chown da/d13/d1c/d1d/c22 1539 1
2026-03-10T06:22:08.982 INFO:tasks.workunit.client.1.vm06.stdout:2/105: fsync f9 0
2026-03-10T06:22:08.982 INFO:tasks.workunit.client.1.vm06.stdout:9/94: mkdir d21 0
2026-03-10T06:22:08.984 INFO:tasks.workunit.client.1.vm06.stdout:9/95: write f17 [1011637,129974] 0
2026-03-10T06:22:08.984 INFO:tasks.workunit.client.1.vm06.stdout:9/96: write fd [1408387,71757] 0
2026-03-10T06:22:08.985 INFO:tasks.workunit.client.1.vm06.stdout:9/97: write f17 [1312127,89716] 0
2026-03-10T06:22:08.987 INFO:tasks.workunit.client.1.vm06.stdout:2/106: symlink da/d13/d1c/d1d/l25 0
2026-03-10T06:22:08.990 INFO:tasks.workunit.client.1.vm06.stdout:8/98: sync
2026-03-10T06:22:08.991 INFO:tasks.workunit.client.1.vm06.stdout:1/96: sync
2026-03-10T06:22:08.992 INFO:tasks.workunit.client.1.vm06.stdout:2/107: creat da/d13/d1c/d1d/f26 x:0 0 0
2026-03-10T06:22:08.996 INFO:tasks.workunit.client.1.vm06.stdout:8/99: rename d1/d7/db to d1/df/d20 0
2026-03-10T06:22:09.000 INFO:tasks.workunit.client.1.vm06.stdout:8/100: dwrite d1/f1b [0,4194304] 0
2026-03-10T06:22:09.005 INFO:tasks.workunit.client.1.vm06.stdout:1/97: write f5 [1341330,37655] 0
2026-03-10T06:22:09.011 INFO:tasks.workunit.client.1.vm06.stdout:4/107: truncate f6 845961 0
2026-03-10T06:22:09.013 INFO:tasks.workunit.client.1.vm06.stdout:8/101: dwrite d1/f1c [0,4194304] 0
2026-03-10T06:22:09.015 INFO:tasks.workunit.client.1.vm06.stdout:1/98: dread d9/df/f10 [0,4194304] 0
2026-03-10T06:22:09.016 INFO:tasks.workunit.client.1.vm06.stdout:8/102: mkdir d1/df/d20/d21 0
2026-03-10T06:22:09.017 INFO:tasks.workunit.client.1.vm06.stdout:4/108: mkdir dd/d20 0
2026-03-10T06:22:09.018 INFO:tasks.workunit.client.1.vm06.stdout:4/109: write dd/d18/f1d [732664,78923] 0
2026-03-10T06:22:09.019 INFO:tasks.workunit.client.1.vm06.stdout:2/108: getdents da/d13/d1a 0
2026-03-10T06:22:09.020 INFO:tasks.workunit.client.1.vm06.stdout:8/103: rename d1/df/d20/c16 to d1/df/d11/c22 0
2026-03-10T06:22:09.022 INFO:tasks.workunit.client.1.vm06.stdout:1/99: link d9/cc d9/df/c13 0
2026-03-10T06:22:09.028 INFO:tasks.workunit.client.1.vm06.stdout:1/100: creat d9/df/f14 x:0 0 0
2026-03-10T06:22:09.045 INFO:tasks.workunit.client.1.vm06.stdout:5/93: dread d8/d9/d1e/f23 [0,4194304] 0
2026-03-10T06:22:09.048 INFO:tasks.workunit.client.1.vm06.stdout:5/94: write d8/db/f18 [291779,3161] 0
2026-03-10T06:22:09.050 INFO:tasks.workunit.client.1.vm06.stdout:5/95: rename c4 to d8/d20/c27 0
2026-03-10T06:22:09.051 INFO:tasks.workunit.client.1.vm06.stdout:5/96: dread - d8/d9/d1e/f17 zero size
2026-03-10T06:22:09.062 INFO:tasks.workunit.client.1.vm06.stdout:0/118: truncate d0/dd/f10 781266 0
2026-03-10T06:22:09.062 INFO:tasks.workunit.client.1.vm06.stdout:0/119: read d0/dd/f11 [46219,39888] 0
2026-03-10T06:22:09.067 INFO:tasks.workunit.client.1.vm06.stdout:0/120: rename d0/c6 to d0/dd/d1c/c2a 0
2026-03-10T06:22:09.068 INFO:tasks.workunit.client.1.vm06.stdout:0/121: mknod d0/c2b 0
2026-03-10T06:22:09.068 INFO:tasks.workunit.client.1.vm06.stdout:1/101: write d9/df/f10 [4101352,112938] 0
2026-03-10T06:22:09.069 INFO:tasks.workunit.client.1.vm06.stdout:0/122: dread - d0/dd/d14/f28 zero size
2026-03-10T06:22:09.069 INFO:tasks.workunit.client.1.vm06.stdout:1/102: write d9/df/f14 [1017719,26491] 0
2026-03-10T06:22:09.070 INFO:tasks.workunit.client.1.vm06.stdout:1/103: write d9/fb [3486509,95139] 0
2026-03-10T06:22:09.071 INFO:tasks.workunit.client.1.vm06.stdout:1/104: fsync d9/fe 0
2026-03-10T06:22:09.071 INFO:tasks.workunit.client.1.vm06.stdout:0/123: creat d0/dd/d14/d18/f2c x:0 0 0
2026-03-10T06:22:09.074 INFO:tasks.workunit.client.1.vm06.stdout:1/105: creat d9/df/f15 x:0 0 0
2026-03-10T06:22:09.075 INFO:tasks.workunit.client.1.vm06.stdout:1/106: chown d9/cc 838567 1
2026-03-10T06:22:09.076 INFO:tasks.workunit.client.1.vm06.stdout:1/107: rename d9/df to d9/df/d16 22
2026-03-10T06:22:09.080 INFO:tasks.workunit.client.1.vm06.stdout:1/108: creat d9/f17 x:0 0 0
2026-03-10T06:22:09.082 INFO:tasks.workunit.client.1.vm06.stdout:6/111: rmdir d6/d7 39
2026-03-10T06:22:09.085 INFO:tasks.workunit.client.1.vm06.stdout:1/109: dwrite d9/df/f15 [0,4194304] 0
2026-03-10T06:22:09.087 INFO:tasks.workunit.client.1.vm06.stdout:7/159: write d19/f20 [4439206,19025] 0
2026-03-10T06:22:09.089 INFO:tasks.workunit.client.1.vm06.stdout:3/107: write d6/f1b [323710,107787] 0
2026-03-10T06:22:09.090 INFO:tasks.workunit.client.1.vm06.stdout:6/112: dwrite d6/dd/f18 [0,4194304] 0
2026-03-10T06:22:09.098 INFO:tasks.workunit.client.1.vm06.stdout:7/160: truncate d19/f1a 1283943 0
2026-03-10T06:22:09.099 INFO:tasks.workunit.client.1.vm06.stdout:7/161: fdatasync f4 0
2026-03-10T06:22:09.099 INFO:tasks.workunit.client.1.vm06.stdout:3/108: mknod d6/dc/d13/c2b 0
2026-03-10T06:22:09.099 INFO:tasks.workunit.client.1.vm06.stdout:6/113: rmdir d6/d11 39
2026-03-10T06:22:09.100 INFO:tasks.workunit.client.1.vm06.stdout:7/162: fsync f15 0
2026-03-10T06:22:09.100 INFO:tasks.workunit.client.1.vm06.stdout:7/163: truncate d19/f1d 118701 0
2026-03-10T06:22:09.101 INFO:tasks.workunit.client.1.vm06.stdout:3/109: creat d6/d21/f2c x:0 0 0
2026-03-10T06:22:09.103 INFO:tasks.workunit.client.1.vm06.stdout:7/164: stat d19/c37 0
2026-03-10T06:22:09.104 INFO:tasks.workunit.client.1.vm06.stdout:3/110: creat d6/d8/f2d x:0 0 0
2026-03-10T06:22:09.104 INFO:tasks.workunit.client.1.vm06.stdout:6/114: mknod d6/d11/c21 0
2026-03-10T06:22:09.104 INFO:tasks.workunit.client.1.vm06.stdout:3/111: write d6/dc/d13/f17 [4987160,107378] 0
2026-03-10T06:22:09.105 INFO:tasks.workunit.client.1.vm06.stdout:7/165: mknod d19/c38 0
2026-03-10T06:22:09.108 INFO:tasks.workunit.client.1.vm06.stdout:2/109: fsync da/d13/d1c/d1d/f26 0
2026-03-10T06:22:09.108 INFO:tasks.workunit.client.1.vm06.stdout:3/112: rename c3 to d6/d21/c2e 0
2026-03-10T06:22:09.109 INFO:tasks.workunit.client.1.vm06.stdout:9/98: write f9 [1344439,51171] 0
2026-03-10T06:22:09.113 INFO:tasks.workunit.client.1.vm06.stdout:9/99: dread fc [0,4194304] 0
2026-03-10T06:22:09.114 INFO:tasks.workunit.client.1.vm06.stdout:6/115: link d6/d11/l13 d6/df/l22 0
2026-03-10T06:22:09.114 INFO:tasks.workunit.client.1.vm06.stdout:6/116: fdatasync d6/fc 0
2026-03-10T06:22:09.114 INFO:tasks.workunit.client.1.vm06.stdout:9/100: dread f12 [0,4194304] 0
2026-03-10T06:22:09.115 INFO:tasks.workunit.client.1.vm06.stdout:9/101: readlink l1f 0
2026-03-10T06:22:09.115 INFO:tasks.workunit.client.1.vm06.stdout:9/102: chown l1c 702962245 1
2026-03-10T06:22:09.116 INFO:tasks.workunit.client.1.vm06.stdout:9/103: write fd [1773904,72588] 0
2026-03-10T06:22:09.116 INFO:tasks.workunit.client.1.vm06.stdout:6/117: readlink d6/df/l22 0
2026-03-10T06:22:09.121 INFO:tasks.workunit.client.1.vm06.stdout:6/118: rename d6/dd/ce to d6/d7/c23 0
2026-03-10T06:22:09.122 INFO:tasks.workunit.client.1.vm06.stdout:6/119: write d6/d7/f1a [1942269,120839] 0
2026-03-10T06:22:09.126 INFO:tasks.workunit.client.1.vm06.stdout:6/120: creat d6/d11/f24 x:0 0 0
2026-03-10T06:22:09.126 INFO:tasks.workunit.client.1.vm06.stdout:6/121: fdatasync d6/f1b 0
2026-03-10T06:22:09.129 INFO:tasks.workunit.client.1.vm06.stdout:2/110: sync
2026-03-10T06:22:09.129 INFO:tasks.workunit.client.1.vm06.stdout:9/104: sync
2026-03-10T06:22:09.132 INFO:tasks.workunit.client.1.vm06.stdout:6/122: dwrite d6/d11/f24 [0,4194304] 0
2026-03-10T06:22:09.132 INFO:tasks.workunit.client.1.vm06.stdout:6/123: fdatasync f3 0
2026-03-10T06:22:09.133 INFO:tasks.workunit.client.1.vm06.stdout:6/124: chown d6/d11/f24 177419258 1
2026-03-10T06:22:09.137 INFO:tasks.workunit.client.1.vm06.stdout:6/125: dread f1 [4194304,4194304] 0
2026-03-10T06:22:09.142 INFO:tasks.workunit.client.1.vm06.stdout:2/111: sync
2026-03-10T06:22:09.145 INFO:tasks.workunit.client.1.vm06.stdout:9/105: dwrite f11 [0,4194304] 0
2026-03-10T06:22:09.147 INFO:tasks.workunit.client.1.vm06.stdout:6/126: mkdir d6/dd/d25 0
2026-03-10T06:22:09.151 INFO:tasks.workunit.client.1.vm06.stdout:2/112: creat da/d13/d1a/f27 x:0 0 0
2026-03-10T06:22:09.152 INFO:tasks.workunit.client.1.vm06.stdout:2/113: dread da/f19 [0,4194304] 0
2026-03-10T06:22:09.162 INFO:tasks.workunit.client.1.vm06.stdout:9/106: creat d21/f22 x:0 0 0
2026-03-10T06:22:09.166 INFO:tasks.workunit.client.1.vm06.stdout:9/107: symlink d21/l23 0
2026-03-10T06:22:09.173 INFO:tasks.workunit.client.1.vm06.stdout:9/108: unlink f17 0
2026-03-10T06:22:09.173 INFO:tasks.workunit.client.1.vm06.stdout:9/109: chown fd 124 1
2026-03-10T06:22:09.175 INFO:tasks.workunit.client.1.vm06.stdout:9/110: dread fc [0,4194304] 0
2026-03-10T06:22:09.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch
2026-03-10T06:22:09.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch
2026-03-10T06:22:09.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch
2026-03-10T06:22:09.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch
2026-03-10T06:22:09.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: Manager daemon vm04.exdvdb is now available
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch
2026-03-10T06:22:09.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:08 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch
2026-03-10T06:22:09.180 INFO:tasks.workunit.client.1.vm06.stdout:2/114: getdents da 0
2026-03-10T06:22:09.182 INFO:tasks.workunit.client.1.vm06.stdout:4/110: truncate fa 2739004 0
2026-03-10T06:22:09.183 INFO:tasks.workunit.client.1.vm06.stdout:2/115: dread da/f11 [0,4194304] 0
2026-03-10T06:22:09.183 INFO:tasks.workunit.client.1.vm06.stdout:8/104: truncate d1/f13 1518144 0
2026-03-10T06:22:09.184 INFO:tasks.workunit.client.1.vm06.stdout:5/97: getdents d8/d20 0
2026-03-10T06:22:09.192 INFO:tasks.workunit.client.1.vm06.stdout:8/105: dwrite d1/fa [0,4194304] 0
2026-03-10T06:22:09.193 INFO:tasks.workunit.client.1.vm06.stdout:0/124: write d0/fa [410690,1935] 0
2026-03-10T06:22:09.193 INFO:tasks.workunit.client.1.vm06.stdout:1/110: rmdir d9 39
2026-03-10T06:22:09.195 INFO:tasks.workunit.client.1.vm06.stdout:4/111: write dd/f12 [2507396,45978] 0
2026-03-10T06:22:09.195 INFO:tasks.workunit.client.1.vm06.stdout:2/116: creat da/f28 x:0 0 0
2026-03-10T06:22:09.195 INFO:tasks.workunit.client.1.vm06.stdout:4/112: chown dd/c17 1063172802 1
2026-03-10T06:22:09.196 INFO:tasks.workunit.client.1.vm06.stdout:5/98: dwrite d8/f1a [0,4194304] 0
2026-03-10T06:22:09.196 INFO:tasks.workunit.client.1.vm06.stdout:2/117: chown da/fe 143200 1
2026-03-10T06:22:09.199 INFO:tasks.workunit.client.1.vm06.stdout:4/113: sync
2026-03-10T06:22:09.199 INFO:tasks.workunit.client.1.vm06.stdout:8/106: sync
2026-03-10T06:22:09.199 INFO:tasks.workunit.client.1.vm06.stdout:5/99: write d8/d9/d1e/f17 [853349,31135] 0
2026-03-10T06:22:09.201 INFO:tasks.workunit.client.1.vm06.stdout:5/100: stat d8/d9/f14 0
2026-03-10T06:22:09.202 INFO:tasks.workunit.client.1.vm06.stdout:5/101: sync
2026-03-10T06:22:09.206 INFO:tasks.workunit.client.1.vm06.stdout:1/111: dwrite f5 [0,4194304] 0
2026-03-10T06:22:09.219 INFO:tasks.workunit.client.1.vm06.stdout:7/166: rmdir d19 39
2026-03-10T06:22:09.219 INFO:tasks.workunit.client.1.vm06.stdout:7/167: write f4 [404285,46672] 0
2026-03-10T06:22:09.221 INFO:tasks.workunit.client.1.vm06.stdout:3/113: chown d6/d21/c2e 15072811 1
2026-03-10T06:22:09.221 INFO:tasks.workunit.client.1.vm06.stdout:3/114: readlink d6/dc/d13/l19 0
2026-03-10T06:22:09.222 INFO:tasks.workunit.client.1.vm06.stdout:3/115: write d6/dc/f1d [893474,83037] 0
2026-03-10T06:22:09.222 INFO:tasks.workunit.client.1.vm06.stdout:3/116: stat d6/dc/d13 0
2026-03-10T06:22:09.222 INFO:tasks.workunit.client.1.vm06.stdout:3/117: dread - d6/f29 zero size
2026-03-10T06:22:09.225 INFO:tasks.workunit.client.1.vm06.stdout:6/127: getdents d6/d11 0
2026-03-10T06:22:09.232 INFO:tasks.workunit.client.1.vm06.stdout:3/118: dread d6/d8/fb [0,4194304] 0
2026-03-10T06:22:09.238 INFO:tasks.workunit.client.1.vm06.stdout:0/125: mkdir d0/dd/d2d 0
2026-03-10T06:22:09.244 INFO:tasks.workunit.client.1.vm06.stdout:2/118: creat da/d13/d1c/f29 x:0 0 0
2026-03-10T06:22:09.244 INFO:tasks.workunit.client.1.vm06.stdout:2/119: dread - da/d13/d1c/f29 zero size
2026-03-10T06:22:09.244 INFO:tasks.workunit.client.1.vm06.stdout:0/126: dwrite d0/fa [0,4194304] 0
2026-03-10T06:22:09.244 INFO:tasks.workunit.client.1.vm06.stdout:8/107: readlink d1/l6 0
2026-03-10T06:22:09.244 INFO:tasks.workunit.client.1.vm06.stdout:8/108: readlink d1/d7/l15 0
2026-03-10T06:22:09.245 INFO:tasks.workunit.client.1.vm06.stdout:0/127: write d0/ff [853022,38007] 0
2026-03-10T06:22:09.254 INFO:tasks.workunit.client.1.vm06.stdout:5/102: dread d8/ff [0,4194304] 0
2026-03-10T06:22:09.254 INFO:tasks.workunit.client.1.vm06.stdout:5/103: readlink l3 0
2026-03-10T06:22:09.255 INFO:tasks.workunit.client.1.vm06.stdout:1/112: write d9/fb [3374151,51655] 0
2026-03-10T06:22:09.257 INFO:tasks.workunit.client.1.vm06.stdout:1/113: dread d9/fb [0,4194304] 0
2026-03-10T06:22:09.257 INFO:tasks.workunit.client.1.vm06.stdout:1/114: readlink d9/la 0
2026-03-10T06:22:09.259 INFO:tasks.workunit.client.1.vm06.stdout:7/168: unlink f16 0
2026-03-10T06:22:09.263 INFO:tasks.workunit.client.1.vm06.stdout:1/115: dwrite d9/df/f10 [0,4194304] 0
2026-03-10T06:22:09.263 INFO:tasks.workunit.client.1.vm06.stdout:1/116: write d9/fe [1411626,40223] 0
2026-03-10T06:22:09.276 INFO:tasks.workunit.client.1.vm06.stdout:3/119: symlink d6/d1a/l2f 0
2026-03-10T06:22:09.289 INFO:tasks.workunit.client.1.vm06.stdout:8/109: mknod d1/df/d20/c23 0
2026-03-10T06:22:09.292 INFO:tasks.workunit.client.1.vm06.stdout:8/110: dwrite d1/df/d20/f19 [0,4194304] 0
2026-03-10T06:22:09.294 INFO:tasks.workunit.client.1.vm06.stdout:8/111: truncate d1/f1b 5107172 0
2026-03-10T06:22:09.295 INFO:tasks.workunit.client.1.vm06.stdout:8/112: dread d1/f4 [0,4194304] 0
2026-03-10T06:22:09.301 INFO:tasks.workunit.client.1.vm06.stdout:8/113: dwrite d1/f4 [0,4194304] 0
2026-03-10T06:22:09.306 INFO:tasks.workunit.client.1.vm06.stdout:9/111: dwrite f11 [4194304,4194304] 0
2026-03-10T06:22:09.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch
2026-03-10T06:22:09.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch
2026-03-10T06:22:09.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch
2026-03-10T06:22:09.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch
2026-03-10T06:22:09.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: Manager daemon vm04.exdvdb is now available
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch
2026-03-10T06:22:09.328 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:09 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch
2026-03-10T06:22:09.336 INFO:tasks.workunit.client.1.vm06.stdout:1/117: creat d9/df/f18 x:0 0 0
2026-03-10T06:22:09.341 INFO:tasks.workunit.client.1.vm06.stdout:1/118: dwrite d9/fe [0,4194304] 0
2026-03-10T06:22:09.344 INFO:tasks.workunit.client.1.vm06.stdout:0/128: mknod d0/c2e 0
2026-03-10T06:22:09.344 INFO:tasks.workunit.client.1.vm06.stdout:3/120: dwrite d6/dc/d13/f1e [0,4194304] 0
2026-03-10T06:22:09.351 INFO:tasks.workunit.client.1.vm06.stdout:1/119: dwrite d9/df/f14 [0,4194304] 0
2026-03-10T06:22:09.381 INFO:tasks.workunit.client.1.vm06.stdout:1/120: sync
2026-03-10T06:22:09.392 INFO:tasks.workunit.client.1.vm06.stdout:8/114: creat d1/d7/f24 x:0 0 0 2026-03-10T06:22:09.392 INFO:tasks.workunit.client.1.vm06.stdout:8/115: readlink d1/df/d11/l17 0 2026-03-10T06:22:09.401 INFO:tasks.workunit.client.1.vm06.stdout:9/112: mkdir d21/d24 0 2026-03-10T06:22:09.416 INFO:tasks.workunit.client.1.vm06.stdout:7/169: mkdir d19/d39 0 2026-03-10T06:22:09.417 INFO:tasks.workunit.client.1.vm06.stdout:7/170: dread d19/f1d [0,4194304] 0 2026-03-10T06:22:09.417 INFO:tasks.workunit.client.1.vm06.stdout:7/171: write f4 [1794104,33985] 0 2026-03-10T06:22:09.420 INFO:tasks.workunit.client.1.vm06.stdout:6/128: truncate f3 416268 0 2026-03-10T06:22:09.430 INFO:tasks.workunit.client.1.vm06.stdout:0/129: write d0/f19 [4345945,19661] 0 2026-03-10T06:22:09.436 INFO:tasks.workunit.client.1.vm06.stdout:0/130: dread d0/db/f12 [0,4194304] 0 2026-03-10T06:22:09.447 INFO:tasks.workunit.client.1.vm06.stdout:0/131: dwrite d0/dd/d14/f28 [0,4194304] 0 2026-03-10T06:22:09.451 INFO:tasks.workunit.client.1.vm06.stdout:0/132: chown d0/dd/d14/d18/c29 25031 1 2026-03-10T06:22:09.456 INFO:tasks.workunit.client.1.vm06.stdout:1/121: unlink d9/fb 0 2026-03-10T06:22:09.462 INFO:tasks.workunit.client.1.vm06.stdout:8/116: rmdir d1/df/d11 39 2026-03-10T06:22:09.468 INFO:tasks.workunit.client.1.vm06.stdout:2/120: rmdir da/d13/d1c 39 2026-03-10T06:22:09.474 INFO:tasks.workunit.client.1.vm06.stdout:8/117: dwrite d1/f1c [0,4194304] 0 2026-03-10T06:22:09.475 INFO:tasks.workunit.client.1.vm06.stdout:1/122: sync 2026-03-10T06:22:09.483 INFO:tasks.workunit.client.1.vm06.stdout:8/118: dwrite d1/fa [4194304,4194304] 0 2026-03-10T06:22:09.491 INFO:tasks.workunit.client.1.vm06.stdout:6/129: stat d6/d7/c23 0 2026-03-10T06:22:09.493 INFO:tasks.workunit.client.1.vm06.stdout:4/114: truncate f0 2261560 0 2026-03-10T06:22:09.493 INFO:tasks.workunit.client.1.vm06.stdout:5/104: write d8/db/fc [4668928,91913] 0 2026-03-10T06:22:09.496 INFO:tasks.workunit.client.1.vm06.stdout:5/105: sync 
2026-03-10T06:22:09.520 INFO:tasks.workunit.client.1.vm06.stdout:0/133: rename d0/dd/f11 to d0/dd/d1b/f2f 0 2026-03-10T06:22:09.528 INFO:tasks.workunit.client.1.vm06.stdout:2/121: write f8 [3429411,98602] 0 2026-03-10T06:22:09.557 INFO:tasks.workunit.client.1.vm06.stdout:1/123: mknod d9/df/c19 0 2026-03-10T06:22:09.561 INFO:tasks.workunit.client.1.vm06.stdout:8/119: mknod d1/d7/c25 0 2026-03-10T06:22:09.562 INFO:tasks.workunit.client.1.vm06.stdout:6/130: mknod d6/d11/c26 0 2026-03-10T06:22:09.567 INFO:tasks.workunit.client.1.vm06.stdout:8/120: dwrite d1/f18 [0,4194304] 0 2026-03-10T06:22:09.569 INFO:tasks.workunit.client.1.vm06.stdout:5/106: creat d8/d9/d1e/f28 x:0 0 0 2026-03-10T06:22:09.578 INFO:tasks.workunit.client.1.vm06.stdout:0/134: rename d0/f19 to d0/dd/d14/d18/f30 0 2026-03-10T06:22:09.578 INFO:tasks.workunit.client.1.vm06.stdout:9/113: link l1c d21/d24/l25 0 2026-03-10T06:22:09.583 INFO:tasks.workunit.client.1.vm06.stdout:5/107: dwrite d8/f1a [0,4194304] 0 2026-03-10T06:22:09.586 INFO:tasks.workunit.client.1.vm06.stdout:4/115: dread f6 [0,4194304] 0 2026-03-10T06:22:09.587 INFO:tasks.workunit.client.1.vm06.stdout:5/108: sync 2026-03-10T06:22:09.589 INFO:tasks.workunit.client.1.vm06.stdout:4/116: chown dd/c13 2 1 2026-03-10T06:22:09.589 INFO:tasks.workunit.client.1.vm06.stdout:4/117: sync 2026-03-10T06:22:09.607 INFO:tasks.workunit.client.1.vm06.stdout:3/121: getdents d6/d1a 0 2026-03-10T06:22:09.622 INFO:tasks.workunit.client.1.vm06.stdout:2/122: creat da/d13/d1c/d1d/f2a x:0 0 0 2026-03-10T06:22:09.636 INFO:tasks.workunit.client.1.vm06.stdout:0/135: link d0/dd/d1b/f2f d0/dd/d14/f31 0 2026-03-10T06:22:09.639 INFO:tasks.workunit.client.1.vm06.stdout:9/114: mknod d21/d24/c26 0 2026-03-10T06:22:09.639 INFO:tasks.workunit.client.1.vm06.stdout:8/121: fdatasync d1/f18 0 2026-03-10T06:22:09.640 INFO:tasks.workunit.client.1.vm06.stdout:2/123: fdatasync da/d13/f1f 0 2026-03-10T06:22:09.640 INFO:tasks.workunit.client.1.vm06.stdout:2/124: chown da/d13/l14 1 1 
2026-03-10T06:22:09.641 INFO:tasks.workunit.client.1.vm06.stdout:2/125: write da/d13/f1f [536864,95462] 0 2026-03-10T06:22:09.642 INFO:tasks.workunit.client.1.vm06.stdout:1/124: creat d9/f1a x:0 0 0 2026-03-10T06:22:09.642 INFO:tasks.workunit.client.1.vm06.stdout:4/118: creat dd/d20/f21 x:0 0 0 2026-03-10T06:22:09.645 INFO:tasks.workunit.client.1.vm06.stdout:4/119: write dd/f11 [986739,105420] 0 2026-03-10T06:22:09.648 INFO:tasks.workunit.client.1.vm06.stdout:8/122: dwrite f0 [0,4194304] 0 2026-03-10T06:22:09.648 INFO:tasks.workunit.client.1.vm06.stdout:5/109: link d8/d9/d1e/f28 d8/d9/d1e/f29 0 2026-03-10T06:22:09.649 INFO:tasks.workunit.client.1.vm06.stdout:2/126: dwrite da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:09.650 INFO:tasks.workunit.client.1.vm06.stdout:6/131: truncate d6/fc 1028046 0 2026-03-10T06:22:09.656 INFO:tasks.workunit.client.1.vm06.stdout:4/120: dwrite dd/ff [0,4194304] 0 2026-03-10T06:22:09.656 INFO:tasks.workunit.client.1.vm06.stdout:2/127: truncate da/d13/d1c/f29 854479 0 2026-03-10T06:22:09.665 INFO:tasks.workunit.client.1.vm06.stdout:9/115: stat f14 0 2026-03-10T06:22:09.702 INFO:tasks.workunit.client.1.vm06.stdout:7/172: write d19/f1a [880394,116643] 0 2026-03-10T06:22:09.706 INFO:tasks.workunit.client.1.vm06.stdout:5/110: rmdir d8/d20 39 2026-03-10T06:22:09.706 INFO:tasks.workunit.client.1.vm06.stdout:7/173: dread - d19/f26 zero size 2026-03-10T06:22:09.710 INFO:tasks.workunit.client.1.vm06.stdout:6/132: symlink d6/d11/l27 0 2026-03-10T06:22:09.719 INFO:tasks.workunit.client.1.vm06.stdout:4/121: mknod dd/d18/c22 0 2026-03-10T06:22:09.720 INFO:tasks.workunit.client.1.vm06.stdout:3/122: dwrite d6/d8/fb [0,4194304] 0 2026-03-10T06:22:09.723 INFO:tasks.workunit.client.1.vm06.stdout:4/122: chown fc 30001 1 2026-03-10T06:22:09.723 INFO:tasks.workunit.client.1.vm06.stdout:3/123: chown d6/d21/f26 16576795 1 2026-03-10T06:22:09.723 INFO:tasks.workunit.client.1.vm06.stdout:4/123: read f6 [821807,17183] 0 2026-03-10T06:22:09.726 
INFO:tasks.workunit.client.1.vm06.stdout:9/116: mkdir d21/d27 0 2026-03-10T06:22:09.730 INFO:tasks.workunit.client.1.vm06.stdout:9/117: write f1b [59746,84547] 0 2026-03-10T06:22:09.737 INFO:tasks.workunit.client.1.vm06.stdout:9/118: dread fe [0,4194304] 0 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:1/125: mkdir d9/d1b 0 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:1/126: chown d9/d1b 1082 1 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:1/127: readlink l2 0 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:9/119: readlink d21/l23 0 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:1/128: readlink d9/la 0 2026-03-10T06:22:09.741 INFO:tasks.workunit.client.1.vm06.stdout:9/120: write f11 [2773470,65843] 0 2026-03-10T06:22:09.749 INFO:tasks.workunit.client.1.vm06.stdout:5/111: dwrite d8/d9/d1e/f29 [0,4194304] 0 2026-03-10T06:22:09.761 INFO:tasks.workunit.client.1.vm06.stdout:7/174: write d19/f33 [460957,45973] 0 2026-03-10T06:22:09.781 INFO:tasks.workunit.client.1.vm06.stdout:6/133: fdatasync d6/fc 0 2026-03-10T06:22:09.781 INFO:tasks.workunit.client.1.vm06.stdout:2/128: mknod da/d13/c2b 0 2026-03-10T06:22:09.781 INFO:tasks.workunit.client.1.vm06.stdout:0/136: rmdir d0/dd/d27 0 2026-03-10T06:22:09.781 INFO:tasks.workunit.client.1.vm06.stdout:4/124: creat dd/d18/f23 x:0 0 0 2026-03-10T06:22:09.788 INFO:tasks.workunit.client.1.vm06.stdout:9/121: fdatasync fe 0 2026-03-10T06:22:09.797 INFO:tasks.workunit.client.1.vm06.stdout:5/112: rmdir d8/d9 39 2026-03-10T06:22:09.802 INFO:tasks.workunit.client.1.vm06.stdout:6/134: mkdir d6/dd/d28 0 2026-03-10T06:22:09.815 INFO:tasks.workunit.client.1.vm06.stdout:0/137: unlink d0/dd/d1b/l26 0 2026-03-10T06:22:09.824 INFO:tasks.workunit.client.1.vm06.stdout:0/138: stat d0/dd/f24 0 2026-03-10T06:22:09.824 INFO:tasks.workunit.client.1.vm06.stdout:1/129: mknod d9/c1c 0 2026-03-10T06:22:09.824 INFO:tasks.workunit.client.1.vm06.stdout:9/122: symlink 
d21/d24/l28 0 2026-03-10T06:22:09.830 INFO:tasks.workunit.client.1.vm06.stdout:9/123: dread f11 [0,4194304] 0 2026-03-10T06:22:09.832 INFO:tasks.workunit.client.1.vm06.stdout:6/135: unlink c5 0 2026-03-10T06:22:09.834 INFO:tasks.workunit.client.1.vm06.stdout:4/125: mkdir dd/d24 0 2026-03-10T06:22:09.836 INFO:tasks.workunit.client.1.vm06.stdout:1/130: unlink d9/df/f15 0 2026-03-10T06:22:09.836 INFO:tasks.workunit.client.1.vm06.stdout:0/139: chown d0/dd/d1c/c2a 1 1 2026-03-10T06:22:09.838 INFO:tasks.workunit.client.1.vm06.stdout:7/175: rmdir d19/d39 0 2026-03-10T06:22:09.839 INFO:tasks.workunit.client.1.vm06.stdout:4/126: symlink dd/d18/l25 0 2026-03-10T06:22:09.841 INFO:tasks.workunit.client.1.vm06.stdout:5/113: symlink d8/d20/d22/l2a 0 2026-03-10T06:22:09.845 INFO:tasks.workunit.client.1.vm06.stdout:9/124: symlink d21/d27/l29 0 2026-03-10T06:22:09.847 INFO:tasks.workunit.client.1.vm06.stdout:6/136: link d6/d7/f1a d6/df/f29 0 2026-03-10T06:22:09.847 INFO:tasks.workunit.client.1.vm06.stdout:7/176: mknod d19/c3a 0 2026-03-10T06:22:09.847 INFO:tasks.workunit.client.1.vm06.stdout:4/127: creat dd/d18/f26 x:0 0 0 2026-03-10T06:22:09.847 INFO:tasks.workunit.client.1.vm06.stdout:7/177: fdatasync d19/f24 0 2026-03-10T06:22:09.849 INFO:tasks.workunit.client.1.vm06.stdout:1/131: creat d9/d1b/f1d x:0 0 0 2026-03-10T06:22:09.857 INFO:tasks.workunit.client.1.vm06.stdout:5/114: stat d8/d9/l13 0 2026-03-10T06:22:09.858 INFO:tasks.workunit.client.1.vm06.stdout:4/128: unlink dd/c17 0 2026-03-10T06:22:09.859 INFO:tasks.workunit.client.1.vm06.stdout:7/178: unlink l18 0 2026-03-10T06:22:09.872 INFO:tasks.workunit.client.1.vm06.stdout:1/132: mknod d9/df/c1e 0 2026-03-10T06:22:09.873 INFO:tasks.workunit.client.1.vm06.stdout:9/125: dwrite f20 [0,4194304] 0 2026-03-10T06:22:09.884 INFO:tasks.workunit.client.1.vm06.stdout:9/126: dread f12 [0,4194304] 0 2026-03-10T06:22:09.885 INFO:tasks.workunit.client.1.vm06.stdout:9/127: truncate f19 360720 0 2026-03-10T06:22:09.894 
INFO:tasks.workunit.client.1.vm06.stdout:1/133: dread f5 [0,4194304] 0 2026-03-10T06:22:09.895 INFO:tasks.workunit.client.1.vm06.stdout:1/134: dread - d9/df/f18 zero size 2026-03-10T06:22:09.897 INFO:tasks.workunit.client.1.vm06.stdout:1/135: write d9/f1a [295370,10701] 0 2026-03-10T06:22:09.906 INFO:tasks.workunit.client.1.vm06.stdout:6/137: link d6/dd/f18 d6/d7/f2a 0 2026-03-10T06:22:09.908 INFO:tasks.workunit.client.1.vm06.stdout:5/115: symlink d8/l2b 0 2026-03-10T06:22:09.926 INFO:tasks.workunit.client.1.vm06.stdout:4/129: getdents dd/d24 0 2026-03-10T06:22:09.927 INFO:tasks.workunit.client.1.vm06.stdout:8/123: truncate d1/f1b 3273296 0 2026-03-10T06:22:09.930 INFO:tasks.workunit.client.1.vm06.stdout:5/116: mknod d8/d9/c2c 0 2026-03-10T06:22:09.934 INFO:tasks.workunit.client.1.vm06.stdout:9/128: link fc d21/f2a 0 2026-03-10T06:22:09.939 INFO:tasks.workunit.client.1.vm06.stdout:9/129: readlink d21/d24/l28 0 2026-03-10T06:22:09.940 INFO:tasks.workunit.client.1.vm06.stdout:1/136: creat d9/f1f x:0 0 0 2026-03-10T06:22:09.943 INFO:tasks.workunit.client.1.vm06.stdout:1/137: truncate d9/d1b/f1d 764277 0 2026-03-10T06:22:09.947 INFO:tasks.workunit.client.1.vm06.stdout:9/130: symlink d21/l2b 0 2026-03-10T06:22:09.952 INFO:tasks.workunit.client.1.vm06.stdout:1/138: sync 2026-03-10T06:22:09.952 INFO:tasks.workunit.client.1.vm06.stdout:9/131: readlink l3 0 2026-03-10T06:22:09.953 INFO:tasks.workunit.client.1.vm06.stdout:3/124: truncate d6/f1c 3998296 0 2026-03-10T06:22:09.954 INFO:tasks.workunit.client.1.vm06.stdout:9/132: write f12 [85081,63951] 0 2026-03-10T06:22:09.955 INFO:tasks.workunit.client.1.vm06.stdout:9/133: read - d21/f22 zero size 2026-03-10T06:22:09.958 INFO:tasks.workunit.client.1.vm06.stdout:1/139: mkdir d9/d1b/d20 0 2026-03-10T06:22:09.959 INFO:tasks.workunit.client.1.vm06.stdout:3/125: rename d6/d8/f16 to d6/d21/f30 0 2026-03-10T06:22:09.959 INFO:tasks.workunit.client.1.vm06.stdout:3/126: readlink l2 0 2026-03-10T06:22:09.959 
INFO:tasks.workunit.client.1.vm06.stdout:2/129: dwrite f5 [0,4194304] 0 2026-03-10T06:22:09.963 INFO:tasks.workunit.client.1.vm06.stdout:2/130: sync 2026-03-10T06:22:09.974 INFO:tasks.workunit.client.1.vm06.stdout:2/131: rmdir da/d13 39 2026-03-10T06:22:09.977 INFO:tasks.workunit.client.1.vm06.stdout:3/127: dwrite d6/d8/f2d [0,4194304] 0 2026-03-10T06:22:09.979 INFO:tasks.workunit.client.1.vm06.stdout:5/117: dread d8/d9/f14 [0,4194304] 0 2026-03-10T06:22:09.983 INFO:tasks.workunit.client.1.vm06.stdout:0/140: truncate d0/fa 1020613 0 2026-03-10T06:22:09.983 INFO:tasks.workunit.client.1.vm06.stdout:0/141: readlink d0/db/l25 0 2026-03-10T06:22:09.996 INFO:tasks.workunit.client.1.vm06.stdout:7/179: write d19/f35 [590328,125101] 0 2026-03-10T06:22:09.996 INFO:tasks.workunit.client.1.vm06.stdout:7/180: write d19/f2f [80879,44641] 0 2026-03-10T06:22:09.999 INFO:tasks.workunit.client.1.vm06.stdout:0/142: rename d0/dd/d14/d1d/f1e to d0/dd/f32 0 2026-03-10T06:22:09.999 INFO:tasks.workunit.client.1.vm06.stdout:2/132: unlink da/d13/f16 0 2026-03-10T06:22:10.000 INFO:tasks.workunit.client.1.vm06.stdout:2/133: chown f7 184 1 2026-03-10T06:22:10.000 INFO:tasks.workunit.client.1.vm06.stdout:2/134: fdatasync f8 0 2026-03-10T06:22:10.001 INFO:tasks.workunit.client.1.vm06.stdout:2/135: write da/d13/d1a/f27 [358691,64869] 0 2026-03-10T06:22:10.004 INFO:tasks.workunit.client.1.vm06.stdout:1/140: rename d9/cc to d9/c21 0 2026-03-10T06:22:10.007 INFO:tasks.workunit.client.1.vm06.stdout:5/118: fsync d8/db/f1f 0 2026-03-10T06:22:10.009 INFO:tasks.workunit.client.1.vm06.stdout:3/128: link d6/d21/f30 d6/d21/f31 0 2026-03-10T06:22:10.011 INFO:tasks.workunit.client.1.vm06.stdout:7/181: mkdir d19/d3b 0 2026-03-10T06:22:10.012 INFO:tasks.workunit.client.1.vm06.stdout:3/129: dread - d6/f25 zero size 2026-03-10T06:22:10.012 INFO:tasks.workunit.client.1.vm06.stdout:7/182: chown d19 1413815 1 2026-03-10T06:22:10.012 INFO:tasks.workunit.client.1.vm06.stdout:1/141: dread d9/f1a [0,4194304] 0 
2026-03-10T06:22:10.014 INFO:tasks.workunit.client.1.vm06.stdout:9/134: truncate fe 2205595 0 2026-03-10T06:22:10.014 INFO:tasks.workunit.client.1.vm06.stdout:9/135: fdatasync d21/f22 0 2026-03-10T06:22:10.018 INFO:tasks.workunit.client.1.vm06.stdout:7/183: sync 2026-03-10T06:22:10.019 INFO:tasks.workunit.client.1.vm06.stdout:6/138: write f3 [535204,45834] 0 2026-03-10T06:22:10.027 INFO:tasks.workunit.client.1.vm06.stdout:0/143: rename d0/db/ce to d0/dd/d14/c33 0 2026-03-10T06:22:10.029 INFO:tasks.workunit.client.1.vm06.stdout:4/130: dwrite f6 [0,4194304] 0 2026-03-10T06:22:10.045 INFO:tasks.workunit.client.1.vm06.stdout:3/130: symlink d6/dc/d13/l32 0 2026-03-10T06:22:10.047 INFO:tasks.workunit.client.1.vm06.stdout:3/131: sync 2026-03-10T06:22:10.047 INFO:tasks.workunit.client.1.vm06.stdout:8/124: dwrite d1/f2 [0,4194304] 0 2026-03-10T06:22:10.054 INFO:tasks.workunit.client.1.vm06.stdout:2/136: symlink da/d13/l2c 0 2026-03-10T06:22:10.055 INFO:tasks.workunit.client.1.vm06.stdout:4/131: read dd/f12 [203464,41002] 0 2026-03-10T06:22:10.057 INFO:tasks.workunit.client.1.vm06.stdout:6/139: dwrite d6/fc [0,4194304] 0 2026-03-10T06:22:10.062 INFO:tasks.workunit.client.1.vm06.stdout:2/137: readlink da/d13/l14 0 2026-03-10T06:22:10.064 INFO:tasks.workunit.client.1.vm06.stdout:2/138: chown da/d13/d1c/d1d 0 1 2026-03-10T06:22:10.073 INFO:tasks.workunit.client.1.vm06.stdout:1/142: dwrite d9/f1a [0,4194304] 0 2026-03-10T06:22:10.078 INFO:tasks.workunit.client.1.vm06.stdout:5/119: dwrite d8/d9/f14 [4194304,4194304] 0 2026-03-10T06:22:10.080 INFO:tasks.workunit.client.1.vm06.stdout:7/184: dwrite f15 [0,4194304] 0 2026-03-10T06:22:10.081 INFO:tasks.workunit.client.1.vm06.stdout:0/144: dwrite d0/dd/f24 [0,4194304] 0 2026-03-10T06:22:10.088 INFO:tasks.workunit.client.1.vm06.stdout:2/139: dwrite da/f19 [0,4194304] 0 2026-03-10T06:22:10.088 INFO:tasks.workunit.client.1.vm06.stdout:3/132: dwrite d6/dc/d13/f17 [0,4194304] 0 2026-03-10T06:22:10.098 
INFO:tasks.workunit.client.1.vm06.stdout:6/140: mkdir d6/dd/d2b 0 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:2/140: creat da/d13/d1c/f2d x:0 0 0 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:9/136: dwrite d21/f22 [0,4194304] 0 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:7/185: dread fa [0,4194304] 0 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:7/186: dread - d19/f30 zero size 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:2/141: fdatasync f9 0 2026-03-10T06:22:10.104 INFO:tasks.workunit.client.1.vm06.stdout:3/133: write d6/dc/d13/f1e [299337,11246] 0 2026-03-10T06:22:10.112 INFO:tasks.workunit.client.1.vm06.stdout:8/125: creat d1/f26 x:0 0 0 2026-03-10T06:22:10.112 INFO:tasks.workunit.client.1.vm06.stdout:5/120: dread f5 [0,4194304] 0 2026-03-10T06:22:10.118 INFO:tasks.workunit.client.1.vm06.stdout:3/134: symlink d6/l33 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:2/142: rmdir da/d13/d1c/d1d 39 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:0/145: rename d0/dd/c13 to d0/c34 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:3/135: write d6/d21/f26 [349815,23304] 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:3/136: chown d6/dc/d13/c2b 5290 1 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:6/141: truncate d6/dd/f18 539932 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:1/143: link d9/c1c d9/d1b/d20/c22 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:2/143: write f8 [4499399,60412] 0 2026-03-10T06:22:10.123 INFO:tasks.workunit.client.1.vm06.stdout:6/142: write d6/df/f1e [934486,114] 0 2026-03-10T06:22:10.124 INFO:tasks.workunit.client.1.vm06.stdout:6/143: dread - d6/f1b zero size 2026-03-10T06:22:10.126 INFO:tasks.workunit.client.1.vm06.stdout:2/144: dread f5 [0,4194304] 0 2026-03-10T06:22:10.126 
INFO:tasks.workunit.client.1.vm06.stdout:7/187: creat d19/d3b/f3c x:0 0 0 2026-03-10T06:22:10.126 INFO:tasks.workunit.client.1.vm06.stdout:7/188: truncate d19/f30 606092 0 2026-03-10T06:22:10.129 INFO:tasks.workunit.client.1.vm06.stdout:1/144: fsync d9/fe 0 2026-03-10T06:22:10.133 INFO:tasks.workunit.client.1.vm06.stdout:1/145: dread d9/df/f10 [0,4194304] 0 2026-03-10T06:22:10.135 INFO:tasks.workunit.client.1.vm06.stdout:8/126: symlink d1/df/d20/d21/l27 0 2026-03-10T06:22:10.135 INFO:tasks.workunit.client.1.vm06.stdout:7/189: mknod d19/c3d 0 2026-03-10T06:22:10.135 INFO:tasks.workunit.client.1.vm06.stdout:3/137: mknod d6/dc/d2a/c34 0 2026-03-10T06:22:10.136 INFO:tasks.workunit.client.1.vm06.stdout:8/127: fdatasync d1/d7/f24 0 2026-03-10T06:22:10.138 INFO:tasks.workunit.client.1.vm06.stdout:0/146: mkdir d0/dd/d2d/d35 0 2026-03-10T06:22:10.138 INFO:tasks.workunit.client.1.vm06.stdout:9/137: link d21/d24/l25 d21/l2c 0 2026-03-10T06:22:10.140 INFO:tasks.workunit.client.1.vm06.stdout:6/144: truncate d6/df/f29 1162577 0 2026-03-10T06:22:10.150 INFO:tasks.workunit.client.1.vm06.stdout:2/145: mkdir da/d13/d2e 0 2026-03-10T06:22:10.150 INFO:tasks.workunit.client.1.vm06.stdout:1/146: symlink d9/d1b/l23 0 2026-03-10T06:22:10.151 INFO:tasks.workunit.client.1.vm06.stdout:2/146: fdatasync da/f28 0 2026-03-10T06:22:10.151 INFO:tasks.workunit.client.1.vm06.stdout:3/138: mkdir d6/dc/d13/d35 0 2026-03-10T06:22:10.154 INFO:tasks.workunit.client.1.vm06.stdout:8/128: unlink d1/d7/l15 0 2026-03-10T06:22:10.155 INFO:tasks.workunit.client.1.vm06.stdout:3/139: write d6/d8/fb [1508839,18321] 0 2026-03-10T06:22:10.156 INFO:tasks.workunit.client.1.vm06.stdout:0/147: creat d0/dd/d2d/f36 x:0 0 0 2026-03-10T06:22:10.164 INFO:tasks.workunit.client.1.vm06.stdout:1/147: dread f5 [0,4194304] 0 2026-03-10T06:22:10.167 INFO:tasks.workunit.client.1.vm06.stdout:6/145: mkdir d6/dd/d25/d2c 0 2026-03-10T06:22:10.167 INFO:tasks.workunit.client.1.vm06.stdout:8/129: dread d1/f2 [0,4194304] 0 
2026-03-10T06:22:10.167 INFO:tasks.workunit.client.1.vm06.stdout:7/190: link l8 d19/l3e 0 2026-03-10T06:22:10.167 INFO:tasks.workunit.client.1.vm06.stdout:2/147: creat da/d13/d2e/f2f x:0 0 0 2026-03-10T06:22:10.167 INFO:tasks.workunit.client.1.vm06.stdout:0/148: dread d0/db/f12 [0,4194304] 0 2026-03-10T06:22:10.171 INFO:tasks.workunit.client.1.vm06.stdout:6/146: creat d6/d11/f2d x:0 0 0 2026-03-10T06:22:10.199 INFO:tasks.workunit.client.1.vm06.stdout:1/148: creat d9/d1b/d20/f24 x:0 0 0 2026-03-10T06:22:10.199 INFO:tasks.workunit.client.1.vm06.stdout:1/149: fsync d9/f1f 0 2026-03-10T06:22:10.199 INFO:tasks.workunit.client.1.vm06.stdout:6/147: symlink d6/d7/l2e 0 2026-03-10T06:22:10.200 INFO:tasks.workunit.client.1.vm06.stdout:1/150: stat d9/df/c19 0 2026-03-10T06:22:10.202 INFO:tasks.workunit.client.1.vm06.stdout:1/151: truncate d9/f11 421093 0 2026-03-10T06:22:10.202 INFO:tasks.workunit.client.1.vm06.stdout:2/148: symlink da/l30 0 2026-03-10T06:22:10.202 INFO:tasks.workunit.client.1.vm06.stdout:1/152: write d9/d1b/f1d [1759524,126262] 0 2026-03-10T06:22:10.202 INFO:tasks.workunit.client.1.vm06.stdout:2/149: chown f5 7704 1 2026-03-10T06:22:10.205 INFO:tasks.workunit.client.1.vm06.stdout:0/149: truncate d0/dd/d14/d18/f22 1354722 0 2026-03-10T06:22:10.206 INFO:tasks.workunit.client.1.vm06.stdout:6/148: dwrite d6/d11/f2d [0,4194304] 0 2026-03-10T06:22:10.208 INFO:tasks.workunit.client.1.vm06.stdout:0/150: write d0/dd/d2d/f36 [23075,108654] 0 2026-03-10T06:22:10.212 INFO:tasks.workunit.client.1.vm06.stdout:0/151: dread d0/dd/d14/f31 [0,4194304] 0 2026-03-10T06:22:10.226 INFO:tasks.workunit.client.1.vm06.stdout:1/153: write d9/df/f10 [3499643,123623] 0 2026-03-10T06:22:10.226 INFO:tasks.workunit.client.1.vm06.stdout:1/154: write d9/f11 [857363,126324] 0 2026-03-10T06:22:10.227 INFO:tasks.workunit.client.1.vm06.stdout:1/155: chown d9/df/c19 71 1 2026-03-10T06:22:10.227 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:10 vm04.local ceph-mon[51058]: mgrmap e28: 
vm04.exdvdb(active, since 1.14162s) 2026-03-10T06:22:10.227 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:10 vm04.local ceph-mon[51058]: pgmap v3: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:10.227 INFO:tasks.workunit.client.1.vm06.stdout:1/156: chown d9/d1b/d20 16 1 2026-03-10T06:22:10.228 INFO:tasks.workunit.client.1.vm06.stdout:1/157: write d9/d1b/d20/f24 [126653,64001] 0 2026-03-10T06:22:10.232 INFO:tasks.workunit.client.1.vm06.stdout:2/150: mknod da/d13/d2e/c31 0 2026-03-10T06:22:10.234 INFO:tasks.workunit.client.1.vm06.stdout:4/132: dread dd/f11 [0,4194304] 0 2026-03-10T06:22:10.235 INFO:tasks.workunit.client.1.vm06.stdout:2/151: dread da/fe [0,4194304] 0 2026-03-10T06:22:10.245 INFO:tasks.workunit.client.1.vm06.stdout:2/152: symlink da/d13/d1a/l32 0 2026-03-10T06:22:10.258 INFO:tasks.workunit.client.1.vm06.stdout:2/153: dread da/d13/f1f [0,4194304] 0 2026-03-10T06:22:10.259 INFO:tasks.workunit.client.1.vm06.stdout:2/154: stat da/d13/d1c/l23 0 2026-03-10T06:22:10.262 INFO:tasks.workunit.client.1.vm06.stdout:2/155: read da/f11 [2404694,48171] 0 2026-03-10T06:22:10.288 INFO:tasks.workunit.client.1.vm06.stdout:6/149: dread f1 [0,4194304] 0 2026-03-10T06:22:10.291 INFO:tasks.workunit.client.1.vm06.stdout:6/150: creat d6/d7/f2f x:0 0 0 2026-03-10T06:22:10.295 INFO:tasks.workunit.client.1.vm06.stdout:6/151: rename d6/dd/d28 to d6/dd/d25/d30 0 2026-03-10T06:22:10.295 INFO:tasks.workunit.client.1.vm06.stdout:6/152: chown d6/d11/c21 3062326 1 2026-03-10T06:22:10.295 INFO:tasks.workunit.client.1.vm06.stdout:6/153: stat d6 0 2026-03-10T06:22:10.302 INFO:tasks.workunit.client.1.vm06.stdout:1/158: unlink d9/c21 0 2026-03-10T06:22:10.307 INFO:tasks.workunit.client.1.vm06.stdout:1/159: creat d9/d1b/d20/f25 x:0 0 0 2026-03-10T06:22:10.309 INFO:tasks.workunit.client.1.vm06.stdout:1/160: symlink d9/d1b/l26 0 2026-03-10T06:22:10.309 INFO:tasks.workunit.client.1.vm06.stdout:1/161: chown l2 75593 1 2026-03-10T06:22:10.310 
INFO:tasks.workunit.client.1.vm06.stdout:1/162: chown d9/df/f10 75973 1 2026-03-10T06:22:10.313 INFO:tasks.workunit.client.1.vm06.stdout:1/163: dwrite d9/f17 [0,4194304] 0 2026-03-10T06:22:10.318 INFO:tasks.workunit.client.1.vm06.stdout:1/164: dwrite d9/f17 [0,4194304] 0 2026-03-10T06:22:10.323 INFO:tasks.workunit.client.1.vm06.stdout:1/165: mknod d9/d1b/d20/c27 0 2026-03-10T06:22:10.324 INFO:tasks.workunit.client.1.vm06.stdout:1/166: dread - d9/df/f18 zero size 2026-03-10T06:22:10.327 INFO:tasks.workunit.client.1.vm06.stdout:2/156: dwrite da/f19 [4194304,4194304] 0 2026-03-10T06:22:10.333 INFO:tasks.workunit.client.1.vm06.stdout:1/167: dwrite d9/df/f14 [0,4194304] 0 2026-03-10T06:22:10.341 INFO:tasks.workunit.client.1.vm06.stdout:1/168: dwrite d9/df/f14 [0,4194304] 0 2026-03-10T06:22:10.352 INFO:tasks.workunit.client.1.vm06.stdout:1/169: symlink d9/df/l28 0 2026-03-10T06:22:10.360 INFO:tasks.workunit.client.1.vm06.stdout:1/170: symlink d9/l29 0 2026-03-10T06:22:10.362 INFO:tasks.workunit.client.1.vm06.stdout:1/171: unlink d9/df/f18 0 2026-03-10T06:22:10.380 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:10 vm06.local ceph-mon[58974]: mgrmap e28: vm04.exdvdb(active, since 1.14162s) 2026-03-10T06:22:10.385 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:10 vm06.local ceph-mon[58974]: pgmap v3: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:7/191: rmdir d19/d3b 39 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:7/192: write d19/f35 [1742702,86903] 0 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:7/193: dread - d19/f24 zero size 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:5/121: truncate d8/f1a 3714024 0 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:7/194: fdatasync d19/f28 0 2026-03-10T06:22:10.385 INFO:tasks.workunit.client.1.vm06.stdout:5/122: mknod d8/d20/d22/c2d 0 
2026-03-10T06:22:10.388 INFO:tasks.workunit.client.1.vm06.stdout:1/172: sync 2026-03-10T06:22:10.388 INFO:tasks.workunit.client.1.vm06.stdout:1/173: fdatasync d9/d1b/f1d 0 2026-03-10T06:22:10.391 INFO:tasks.workunit.client.1.vm06.stdout:7/195: dwrite d19/f24 [0,4194304] 0 2026-03-10T06:22:10.392 INFO:tasks.workunit.client.1.vm06.stdout:1/174: dread d9/f1a [0,4194304] 0 2026-03-10T06:22:10.404 INFO:tasks.workunit.client.1.vm06.stdout:1/175: dread d9/fe [0,4194304] 0 2026-03-10T06:22:10.413 INFO:tasks.workunit.client.1.vm06.stdout:7/196: creat d19/f3f x:0 0 0 2026-03-10T06:22:10.416 INFO:tasks.workunit.client.1.vm06.stdout:1/176: chown d9/c1c 280195 1 2026-03-10T06:22:10.417 INFO:tasks.workunit.client.1.vm06.stdout:7/197: fdatasync d19/f3f 0 2026-03-10T06:22:10.423 INFO:tasks.workunit.client.1.vm06.stdout:1/177: rename f5 to d9/df/f2a 0 2026-03-10T06:22:10.429 INFO:tasks.workunit.client.1.vm06.stdout:9/138: write f19 [1382854,91784] 0 2026-03-10T06:22:10.433 INFO:tasks.workunit.client.1.vm06.stdout:1/178: symlink d9/d1b/l2b 0 2026-03-10T06:22:10.435 INFO:tasks.workunit.client.1.vm06.stdout:0/152: rmdir d0/dd/d2d 39 2026-03-10T06:22:10.436 INFO:tasks.workunit.client.1.vm06.stdout:0/153: readlink d0/db/lc 0 2026-03-10T06:22:10.436 INFO:tasks.workunit.client.1.vm06.stdout:1/179: unlink d9/l29 0 2026-03-10T06:22:10.437 INFO:tasks.workunit.client.1.vm06.stdout:3/140: truncate d6/d1a/f1f 971014 0 2026-03-10T06:22:10.448 INFO:tasks.workunit.client.1.vm06.stdout:1/180: mkdir d9/d1b/d2c 0 2026-03-10T06:22:10.450 INFO:tasks.workunit.client.1.vm06.stdout:9/139: link d21/l2b d21/l2d 0 2026-03-10T06:22:10.453 INFO:tasks.workunit.client.1.vm06.stdout:1/181: fsync d9/d1b/d20/f24 0 2026-03-10T06:22:10.454 INFO:tasks.workunit.client.1.vm06.stdout:8/130: truncate f0 1396279 0 2026-03-10T06:22:10.457 INFO:tasks.workunit.client.1.vm06.stdout:3/141: dwrite d6/d21/f26 [0,4194304] 0 2026-03-10T06:22:10.457 INFO:tasks.workunit.client.1.vm06.stdout:8/131: stat d1/l1f 0 
2026-03-10T06:22:10.465 INFO:tasks.workunit.client.1.vm06.stdout:9/140: creat d21/d24/f2e x:0 0 0 2026-03-10T06:22:10.467 INFO:tasks.workunit.client.1.vm06.stdout:1/182: mknod d9/d1b/d20/c2d 0 2026-03-10T06:22:10.467 INFO:tasks.workunit.client.1.vm06.stdout:1/183: chown d9/d1b 351413249 1 2026-03-10T06:22:10.468 INFO:tasks.workunit.client.1.vm06.stdout:1/184: read d9/df/f10 [1701732,13023] 0 2026-03-10T06:22:10.473 INFO:tasks.workunit.client.1.vm06.stdout:9/141: creat d21/d24/f2f x:0 0 0 2026-03-10T06:22:10.483 INFO:tasks.workunit.client.1.vm06.stdout:8/132: rename d1/d7/c25 to d1/df/d20/d21/c28 0 2026-03-10T06:22:10.484 INFO:tasks.workunit.client.1.vm06.stdout:4/133: truncate dd/fe 2251678 0 2026-03-10T06:22:10.499 INFO:tasks.workunit.client.1.vm06.stdout:7/198: dread f4 [0,4194304] 0 2026-03-10T06:22:10.503 INFO:tasks.workunit.client.1.vm06.stdout:8/133: dwrite d1/d7/f24 [0,4194304] 0 2026-03-10T06:22:10.513 INFO:tasks.workunit.client.1.vm06.stdout:9/142: dread f9 [0,4194304] 0 2026-03-10T06:22:10.513 INFO:tasks.workunit.client.1.vm06.stdout:7/199: dread d19/f35 [0,4194304] 0 2026-03-10T06:22:10.516 INFO:tasks.workunit.client.1.vm06.stdout:2/157: truncate f8 2976054 0 2026-03-10T06:22:10.517 INFO:tasks.workunit.client.1.vm06.stdout:6/154: write d6/dd/f18 [401923,96185] 0 2026-03-10T06:22:10.518 INFO:tasks.workunit.client.1.vm06.stdout:2/158: write da/d13/d1a/f27 [1423838,103978] 0 2026-03-10T06:22:10.520 INFO:tasks.workunit.client.1.vm06.stdout:8/134: symlink d1/df/d20/d21/l29 0 2026-03-10T06:22:10.524 INFO:tasks.workunit.client.1.vm06.stdout:6/155: creat d6/dd/d25/f31 x:0 0 0 2026-03-10T06:22:10.529 INFO:tasks.workunit.client.1.vm06.stdout:6/156: fdatasync d6/f1b 0 2026-03-10T06:22:10.531 INFO:tasks.workunit.client.1.vm06.stdout:7/200: dread d19/f33 [0,4194304] 0 2026-03-10T06:22:10.531 INFO:tasks.workunit.client.1.vm06.stdout:5/123: dwrite d8/db/f1f [0,4194304] 0 2026-03-10T06:22:10.531 INFO:tasks.workunit.client.1.vm06.stdout:6/157: rename d6/dd/d25/f31 to 
d6/dd/d25/d2c/f32 0 2026-03-10T06:22:10.533 INFO:tasks.workunit.client.1.vm06.stdout:6/158: chown d6/d7/l2e 369179 1 2026-03-10T06:22:10.533 INFO:tasks.workunit.client.1.vm06.stdout:8/135: mknod d1/c2a 0 2026-03-10T06:22:10.539 INFO:tasks.workunit.client.1.vm06.stdout:8/136: rmdir d1/df 39 2026-03-10T06:22:10.541 INFO:tasks.workunit.client.1.vm06.stdout:6/159: write d6/d7/f20 [816068,121477] 0 2026-03-10T06:22:10.541 INFO:tasks.workunit.client.1.vm06.stdout:7/201: mknod d19/d3b/c40 0 2026-03-10T06:22:10.546 INFO:tasks.workunit.client.1.vm06.stdout:7/202: write d19/f25 [1155007,52322] 0 2026-03-10T06:22:10.546 INFO:tasks.workunit.client.1.vm06.stdout:8/137: fsync d1/f1b 0 2026-03-10T06:22:10.547 INFO:tasks.workunit.client.1.vm06.stdout:5/124: dwrite d8/db/f25 [0,4194304] 0 2026-03-10T06:22:10.547 INFO:tasks.workunit.client.1.vm06.stdout:6/160: chown d6/c17 165124 1 2026-03-10T06:22:10.548 INFO:tasks.workunit.client.1.vm06.stdout:8/138: truncate d1/d7/f9 952511 0 2026-03-10T06:22:10.550 INFO:tasks.workunit.client.1.vm06.stdout:7/203: write d19/f2f [906481,7612] 0 2026-03-10T06:22:10.550 INFO:tasks.workunit.client.1.vm06.stdout:8/139: write d1/f1c [2819749,105371] 0 2026-03-10T06:22:10.556 INFO:tasks.workunit.client.1.vm06.stdout:7/204: mkdir d19/d3b/d41 0 2026-03-10T06:22:10.559 INFO:tasks.workunit.client.1.vm06.stdout:6/161: dwrite d6/d11/f24 [0,4194304] 0 2026-03-10T06:22:10.564 INFO:tasks.workunit.client.1.vm06.stdout:5/125: dwrite d8/db/f18 [0,4194304] 0 2026-03-10T06:22:10.564 INFO:tasks.workunit.client.1.vm06.stdout:8/140: dread d1/f1c [0,4194304] 0 2026-03-10T06:22:10.570 INFO:tasks.workunit.client.1.vm06.stdout:6/162: truncate d6/dd/f18 825712 0 2026-03-10T06:22:10.573 INFO:tasks.workunit.client.1.vm06.stdout:7/205: mkdir d19/d3b/d41/d42 0 2026-03-10T06:22:10.578 INFO:tasks.workunit.client.1.vm06.stdout:8/141: chown d1/df/d11/f12 724349218 1 2026-03-10T06:22:10.578 INFO:tasks.workunit.client.1.vm06.stdout:5/126: dread d8/d9/f11 [0,4194304] 0 
2026-03-10T06:22:10.583 INFO:tasks.workunit.client.1.vm06.stdout:6/163: dwrite d6/df/f1e [0,4194304] 0 2026-03-10T06:22:10.584 INFO:tasks.workunit.client.1.vm06.stdout:6/164: chown d6/dd/d2b 55599 1 2026-03-10T06:22:10.585 INFO:tasks.workunit.client.1.vm06.stdout:7/206: dwrite d19/f28 [0,4194304] 0 2026-03-10T06:22:10.587 INFO:tasks.workunit.client.1.vm06.stdout:8/142: rename d1/d7/f9 to d1/df/d20/d21/f2b 0 2026-03-10T06:22:10.594 INFO:tasks.workunit.client.1.vm06.stdout:7/207: creat d19/d3b/f43 x:0 0 0 2026-03-10T06:22:10.598 INFO:tasks.workunit.client.1.vm06.stdout:6/165: dwrite d6/d7/fb [4194304,4194304] 0 2026-03-10T06:22:10.599 INFO:tasks.workunit.client.1.vm06.stdout:8/143: mkdir d1/d2c 0 2026-03-10T06:22:10.605 INFO:tasks.workunit.client.1.vm06.stdout:7/208: mknod d19/d3b/d41/d42/c44 0 2026-03-10T06:22:10.605 INFO:tasks.workunit.client.1.vm06.stdout:7/209: fdatasync f9 0 2026-03-10T06:22:10.609 INFO:tasks.workunit.client.1.vm06.stdout:6/166: unlink d6/df/c15 0 2026-03-10T06:22:10.609 INFO:tasks.workunit.client.1.vm06.stdout:8/144: symlink d1/df/d20/d21/l2d 0 2026-03-10T06:22:10.611 INFO:tasks.workunit.client.1.vm06.stdout:0/154: dwrite d0/dd/d14/f31 [0,4194304] 0 2026-03-10T06:22:10.612 INFO:tasks.workunit.client.1.vm06.stdout:6/167: mkdir d6/dd/d25/d33 0 2026-03-10T06:22:10.612 INFO:tasks.workunit.client.1.vm06.stdout:7/210: readlink d19/l29 0 2026-03-10T06:22:10.613 INFO:tasks.workunit.client.1.vm06.stdout:8/145: mknod d1/d7/c2e 0 2026-03-10T06:22:10.615 INFO:tasks.workunit.client.1.vm06.stdout:6/168: write d6/d11/f24 [4852228,36272] 0 2026-03-10T06:22:10.620 INFO:tasks.workunit.client.1.vm06.stdout:0/155: symlink d0/l37 0 2026-03-10T06:22:10.620 INFO:tasks.workunit.client.1.vm06.stdout:7/211: mknod d19/d3b/c45 0 2026-03-10T06:22:10.620 INFO:tasks.workunit.client.1.vm06.stdout:8/146: mknod d1/df/d11/c2f 0 2026-03-10T06:22:10.622 INFO:tasks.workunit.client.1.vm06.stdout:0/156: write d0/db/f12 [2806212,126833] 0 2026-03-10T06:22:10.625 
INFO:tasks.workunit.client.1.vm06.stdout:6/169: unlink d6/d7/f2f 0 2026-03-10T06:22:10.625 INFO:tasks.workunit.client.1.vm06.stdout:8/147: write d1/f2 [4611706,23084] 0 2026-03-10T06:22:10.627 INFO:tasks.workunit.client.1.vm06.stdout:7/212: symlink d19/d3b/d41/d42/l46 0 2026-03-10T06:22:10.627 INFO:tasks.workunit.client.1.vm06.stdout:0/157: mknod d0/dd/d14/d18/c38 0 2026-03-10T06:22:10.627 INFO:tasks.workunit.client.1.vm06.stdout:6/170: symlink d6/d7/l34 0 2026-03-10T06:22:10.627 INFO:tasks.workunit.client.1.vm06.stdout:8/148: truncate d1/df/d11/f1d 1535529 0 2026-03-10T06:22:10.628 INFO:tasks.workunit.client.1.vm06.stdout:6/171: readlink d6/d11/l27 0 2026-03-10T06:22:10.629 INFO:tasks.workunit.client.1.vm06.stdout:6/172: read d6/d7/f2a [20126,35694] 0 2026-03-10T06:22:10.633 INFO:tasks.workunit.client.1.vm06.stdout:8/149: read d1/f2 [3314288,56090] 0 2026-03-10T06:22:10.634 INFO:tasks.workunit.client.1.vm06.stdout:7/213: dread d19/f24 [0,4194304] 0 2026-03-10T06:22:10.670 INFO:tasks.workunit.client.1.vm06.stdout:6/173: rename d6/d11 to d6/dd/d35 0 2026-03-10T06:22:10.678 INFO:tasks.workunit.client.1.vm06.stdout:8/150: creat d1/d7/f30 x:0 0 0 2026-03-10T06:22:10.698 INFO:tasks.workunit.client.1.vm06.stdout:0/158: rename d0/db to d0/dd/d2d/d39 0 2026-03-10T06:22:10.700 INFO:tasks.workunit.client.1.vm06.stdout:6/174: creat d6/d7/f36 x:0 0 0 2026-03-10T06:22:10.703 INFO:tasks.workunit.client.1.vm06.stdout:0/159: chown d0/c2e 54024363 1 2026-03-10T06:22:10.715 INFO:tasks.workunit.client.1.vm06.stdout:6/175: mkdir d6/d7/d37 0 2026-03-10T06:22:10.719 INFO:tasks.workunit.client.1.vm06.stdout:6/176: write d6/d7/f16 [909194,3590] 0 2026-03-10T06:22:10.725 INFO:tasks.workunit.client.1.vm06.stdout:8/151: link d1/l8 d1/df/d20/l31 0 2026-03-10T06:22:10.728 INFO:tasks.workunit.client.1.vm06.stdout:8/152: truncate d1/f4 4580939 0 2026-03-10T06:22:10.729 INFO:tasks.workunit.client.1.vm06.stdout:6/177: dwrite d6/d7/f16 [0,4194304] 0 2026-03-10T06:22:10.730 
INFO:tasks.workunit.client.1.vm06.stdout:6/178: chown d6/d7/f36 74 1 2026-03-10T06:22:10.743 INFO:tasks.workunit.client.1.vm06.stdout:0/160: unlink d0/dd/d14/c33 0 2026-03-10T06:22:10.746 INFO:tasks.workunit.client.1.vm06.stdout:6/179: symlink d6/dd/d35/l38 0 2026-03-10T06:22:10.754 INFO:tasks.workunit.client.1.vm06.stdout:6/180: dwrite d6/d7/f16 [0,4194304] 0 2026-03-10T06:22:10.760 INFO:tasks.workunit.client.1.vm06.stdout:8/153: creat d1/d2c/f32 x:0 0 0 2026-03-10T06:22:10.772 INFO:tasks.workunit.client.1.vm06.stdout:8/154: symlink d1/d2c/l33 0 2026-03-10T06:22:10.779 INFO:tasks.workunit.client.1.vm06.stdout:8/155: write d1/df/d11/f12 [2566267,84394] 0 2026-03-10T06:22:10.782 INFO:tasks.workunit.client.1.vm06.stdout:1/185: truncate d9/f1a 3514469 0 2026-03-10T06:22:10.784 INFO:tasks.workunit.client.1.vm06.stdout:0/161: creat d0/dd/d2d/d35/f3a x:0 0 0 2026-03-10T06:22:10.787 INFO:tasks.workunit.client.1.vm06.stdout:8/156: read d1/f13 [884186,21029] 0 2026-03-10T06:22:10.791 INFO:tasks.workunit.client.1.vm06.stdout:4/134: dwrite f0 [0,4194304] 0 2026-03-10T06:22:10.791 INFO:tasks.workunit.client.1.vm06.stdout:6/181: mknod d6/dd/d2b/c39 0 2026-03-10T06:22:10.796 INFO:tasks.workunit.client.1.vm06.stdout:2/159: dwrite f8 [0,4194304] 0 2026-03-10T06:22:10.797 INFO:tasks.workunit.client.1.vm06.stdout:4/135: write f6 [3059796,51070] 0 2026-03-10T06:22:10.804 INFO:tasks.workunit.client.1.vm06.stdout:4/136: dread dd/f11 [0,4194304] 0 2026-03-10T06:22:10.818 INFO:tasks.workunit.client.1.vm06.stdout:0/162: read d0/dd/f10 [66240,121211] 0 2026-03-10T06:22:10.822 INFO:tasks.workunit.client.1.vm06.stdout:8/157: mknod d1/d2c/c34 0 2026-03-10T06:22:10.841 INFO:tasks.workunit.client.1.vm06.stdout:2/160: rename f9 to da/d13/d1a/f33 0 2026-03-10T06:22:10.844 INFO:tasks.workunit.client.1.vm06.stdout:4/137: mknod dd/d20/c27 0 2026-03-10T06:22:10.845 INFO:tasks.workunit.client.1.vm06.stdout:4/138: fsync dd/d20/f21 0 2026-03-10T06:22:10.866 
INFO:tasks.workunit.client.1.vm06.stdout:2/161: symlink da/d13/d1a/l34 0 2026-03-10T06:22:10.867 INFO:tasks.workunit.client.1.vm06.stdout:4/139: fdatasync fc 0 2026-03-10T06:22:10.868 INFO:tasks.workunit.client.1.vm06.stdout:9/143: truncate fd 117273 0 2026-03-10T06:22:10.873 INFO:tasks.workunit.client.1.vm06.stdout:0/163: symlink d0/dd/l3b 0 2026-03-10T06:22:10.878 INFO:tasks.workunit.client.1.vm06.stdout:5/127: fdatasync d8/f1a 0 2026-03-10T06:22:10.898 INFO:tasks.workunit.client.1.vm06.stdout:2/162: dwrite da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:10.898 INFO:tasks.workunit.client.1.vm06.stdout:3/142: rmdir d6/dc 39 2026-03-10T06:22:10.903 INFO:tasks.workunit.client.1.vm06.stdout:0/164: dwrite d0/dd/d2d/f36 [0,4194304] 0 2026-03-10T06:22:10.903 INFO:tasks.workunit.client.1.vm06.stdout:6/182: unlink d6/df/f29 0 2026-03-10T06:22:10.904 INFO:tasks.workunit.client.1.vm06.stdout:6/183: fdatasync d6/dd/d35/f24 0 2026-03-10T06:22:10.907 INFO:tasks.workunit.client.1.vm06.stdout:5/128: dread d8/db/f18 [0,4194304] 0 2026-03-10T06:22:10.913 INFO:tasks.workunit.client.1.vm06.stdout:0/165: read d0/dd/d14/f28 [1930941,84444] 0 2026-03-10T06:22:10.917 INFO:tasks.workunit.client.1.vm06.stdout:5/129: write d8/d9/d1e/f17 [103647,1474] 0 2026-03-10T06:22:10.940 INFO:tasks.workunit.client.1.vm06.stdout:5/130: dwrite d8/d9/d1e/f17 [0,4194304] 0 2026-03-10T06:22:10.944 INFO:tasks.workunit.client.1.vm06.stdout:5/131: write f5 [1720087,126667] 0 2026-03-10T06:22:10.966 INFO:tasks.workunit.client.1.vm06.stdout:1/186: fdatasync d9/df/f2a 0 2026-03-10T06:22:10.970 INFO:tasks.workunit.client.1.vm06.stdout:7/214: rmdir d19/d3b 39 2026-03-10T06:22:10.971 INFO:tasks.workunit.client.1.vm06.stdout:7/215: chown f9 315199061 1 2026-03-10T06:22:10.972 INFO:tasks.workunit.client.1.vm06.stdout:5/132: dwrite f5 [0,4194304] 0 2026-03-10T06:22:10.989 INFO:tasks.workunit.client.1.vm06.stdout:2/163: mkdir da/d13/d2e/d35 0 2026-03-10T06:22:10.996 INFO:tasks.workunit.client.1.vm06.stdout:7/216: 
dwrite f10 [4194304,4194304] 0 2026-03-10T06:22:10.996 INFO:tasks.workunit.client.1.vm06.stdout:3/143: dread d6/d1a/f1f [0,4194304] 0 2026-03-10T06:22:11.002 INFO:tasks.workunit.client.1.vm06.stdout:5/133: symlink d8/d9/l2e 0 2026-03-10T06:22:11.010 INFO:tasks.workunit.client.1.vm06.stdout:6/184: symlink d6/dd/d25/d33/l3a 0 2026-03-10T06:22:11.011 INFO:tasks.workunit.client.1.vm06.stdout:0/166: mkdir d0/d3c 0 2026-03-10T06:22:11.019 INFO:tasks.workunit.client.1.vm06.stdout:3/144: unlink d6/l33 0 2026-03-10T06:22:11.023 INFO:tasks.workunit.client.1.vm06.stdout:8/158: dwrite d1/f1b [0,4194304] 0 2026-03-10T06:22:11.024 INFO:tasks.workunit.client.1.vm06.stdout:3/145: write d6/d21/f2c [82890,7581] 0 2026-03-10T06:22:11.026 INFO:tasks.workunit.client.1.vm06.stdout:5/134: creat d8/d9/d1e/f2f x:0 0 0 2026-03-10T06:22:11.026 INFO:tasks.workunit.client.1.vm06.stdout:0/167: mkdir d0/dd/d1b/d3d 0 2026-03-10T06:22:11.027 INFO:tasks.workunit.client.1.vm06.stdout:5/135: chown d8/f1a 6881 1 2026-03-10T06:22:11.043 INFO:tasks.workunit.client.1.vm06.stdout:1/187: rmdir d9/d1b/d2c 0 2026-03-10T06:22:11.048 INFO:tasks.workunit.client.1.vm06.stdout:4/140: truncate f6 831578 0 2026-03-10T06:22:11.048 INFO:tasks.workunit.client.1.vm06.stdout:9/144: write ff [2806521,22501] 0 2026-03-10T06:22:11.054 INFO:tasks.workunit.client.1.vm06.stdout:8/159: mkdir d1/df/d20/d35 0 2026-03-10T06:22:11.060 INFO:tasks.workunit.client.1.vm06.stdout:2/164: rmdir da 39 2026-03-10T06:22:11.068 INFO:tasks.workunit.client.1.vm06.stdout:5/136: creat d8/d20/d22/f30 x:0 0 0 2026-03-10T06:22:11.069 INFO:tasks.workunit.client.1.vm06.stdout:7/217: link d19/f35 d19/d3b/f47 0 2026-03-10T06:22:11.071 INFO:tasks.workunit.client.1.vm06.stdout:9/145: rename d21/l23 to d21/d27/l30 0 2026-03-10T06:22:11.074 INFO:tasks.workunit.client.1.vm06.stdout:6/185: link d6/dd/d35/c26 d6/df/c3b 0 2026-03-10T06:22:11.075 INFO:tasks.workunit.client.1.vm06.stdout:6/186: chown d6/d7/f2a 7 1 2026-03-10T06:22:11.076 
INFO:tasks.workunit.client.1.vm06.stdout:6/187: chown d6/ca 0 1 2026-03-10T06:22:11.089 INFO:tasks.workunit.client.1.vm06.stdout:7/218: fsync fa 0 2026-03-10T06:22:11.098 INFO:tasks.workunit.client.1.vm06.stdout:9/146: stat l15 0 2026-03-10T06:22:11.098 INFO:tasks.workunit.client.1.vm06.stdout:4/141: getdents dd/d24 0 2026-03-10T06:22:11.099 INFO:tasks.workunit.client.1.vm06.stdout:4/142: write dd/ff [4572724,26137] 0 2026-03-10T06:22:11.099 INFO:tasks.workunit.client.1.vm06.stdout:3/146: mknod d6/dc/d13/d35/c36 0 2026-03-10T06:22:11.100 INFO:tasks.workunit.client.1.vm06.stdout:2/165: rename da/d13/d1c/l24 to da/d13/d1a/l36 0 2026-03-10T06:22:11.102 INFO:tasks.workunit.client.1.vm06.stdout:6/188: creat d6/dd/d35/f3c x:0 0 0 2026-03-10T06:22:11.103 INFO:tasks.workunit.client.1.vm06.stdout:7/219: creat d19/d3b/d41/f48 x:0 0 0 2026-03-10T06:22:11.108 INFO:tasks.workunit.client.1.vm06.stdout:7/220: creat d19/d3b/d41/f49 x:0 0 0 2026-03-10T06:22:11.113 INFO:tasks.workunit.client.1.vm06.stdout:8/160: getdents d1/d2c 0 2026-03-10T06:22:11.113 INFO:tasks.workunit.client.1.vm06.stdout:8/161: chown d1/df/d20/d35 5 1 2026-03-10T06:22:11.113 INFO:tasks.workunit.client.1.vm06.stdout:8/162: fdatasync d1/df/d11/f1d 0 2026-03-10T06:22:11.113 INFO:tasks.workunit.client.1.vm06.stdout:8/163: write d1/df/d20/fe [3279581,52377] 0 2026-03-10T06:22:11.123 INFO:tasks.workunit.client.1.vm06.stdout:2/166: symlink da/d13/d2e/d35/l37 0 2026-03-10T06:22:11.123 INFO:tasks.workunit.client.1.vm06.stdout:3/147: dwrite d6/d8/fb [4194304,4194304] 0 2026-03-10T06:22:11.123 INFO:tasks.workunit.client.1.vm06.stdout:1/188: fsync d9/f1a 0 2026-03-10T06:22:11.132 INFO:tasks.workunit.client.1.vm06.stdout:6/189: dwrite d6/dd/f18 [0,4194304] 0 2026-03-10T06:22:11.132 INFO:tasks.workunit.client.1.vm06.stdout:6/190: write d6/d7/f20 [1654144,2872] 0 2026-03-10T06:22:11.132 INFO:tasks.workunit.client.1.vm06.stdout:0/168: dwrite d0/dd/f10 [0,4194304] 0 2026-03-10T06:22:11.136 
INFO:tasks.workunit.client.1.vm06.stdout:7/221: stat l8 0 2026-03-10T06:22:11.141 INFO:tasks.workunit.client.1.vm06.stdout:7/222: stat f10 0 2026-03-10T06:22:11.143 INFO:tasks.workunit.client.1.vm06.stdout:3/148: dwrite d6/d21/f2c [0,4194304] 0 2026-03-10T06:22:11.156 INFO:tasks.workunit.client.1.vm06.stdout:0/169: creat d0/dd/d1c/f3e x:0 0 0 2026-03-10T06:22:11.163 INFO:tasks.workunit.client.1.vm06.stdout:4/143: getdents dd/d20 0 2026-03-10T06:22:11.163 INFO:tasks.workunit.client.1.vm06.stdout:4/144: stat dd/d18/c22 0 2026-03-10T06:22:11.164 INFO:tasks.workunit.client.1.vm06.stdout:4/145: readlink dd/l1b 0 2026-03-10T06:22:11.166 INFO:tasks.workunit.client.1.vm06.stdout:7/223: dwrite d19/d3b/d41/f49 [0,4194304] 0 2026-03-10T06:22:11.172 INFO:tasks.workunit.client.1.vm06.stdout:9/147: dread ff [0,4194304] 0 2026-03-10T06:22:11.176 INFO:tasks.workunit.client.1.vm06.stdout:4/146: creat dd/d20/f28 x:0 0 0 2026-03-10T06:22:11.176 INFO:tasks.workunit.client.1.vm06.stdout:6/191: dread d6/d7/f20 [0,4194304] 0 2026-03-10T06:22:11.189 INFO:tasks.workunit.client.1.vm06.stdout:4/147: dwrite dd/d20/f21 [0,4194304] 0 2026-03-10T06:22:11.190 INFO:tasks.workunit.client.1.vm06.stdout:5/137: truncate f7 836015 0 2026-03-10T06:22:11.191 INFO:tasks.workunit.client.1.vm06.stdout:5/138: write d8/db/f25 [4089729,125193] 0 2026-03-10T06:22:11.194 INFO:tasks.workunit.client.1.vm06.stdout:3/149: getdents d6/d1a 0 2026-03-10T06:22:11.196 INFO:tasks.workunit.client.1.vm06.stdout:2/167: rename da/c12 to da/d13/d1c/c38 0 2026-03-10T06:22:11.197 INFO:tasks.workunit.client.1.vm06.stdout:1/189: write d9/df/f10 [3412459,2151] 0 2026-03-10T06:22:11.203 INFO:tasks.workunit.client.1.vm06.stdout:5/139: rename d8/d20/d22/f30 to d8/d20/d22/f31 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:7/224: symlink d19/l4a 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:3/150: creat d6/dc/d13/f37 x:0 0 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:6/192: 
link d6/df/f1e d6/d7/d37/f3d 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:3/151: write d6/dc/f1d [614726,79743] 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:1/190: fdatasync d9/df/f14 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:3/152: unlink d6/dc/d13/f17 0 2026-03-10T06:22:11.216 INFO:tasks.workunit.client.1.vm06.stdout:9/148: rename fc to d21/d27/f31 0 2026-03-10T06:22:11.218 INFO:tasks.workunit.client.1.vm06.stdout:1/191: rmdir d9 39 2026-03-10T06:22:11.218 INFO:tasks.workunit.client.1.vm06.stdout:7/225: dwrite d19/f28 [4194304,4194304] 0 2026-03-10T06:22:11.220 INFO:tasks.workunit.client.1.vm06.stdout:9/149: mkdir d21/d32 0 2026-03-10T06:22:11.226 INFO:tasks.workunit.client.1.vm06.stdout:3/153: dwrite d6/d21/f30 [4194304,4194304] 0 2026-03-10T06:22:11.237 INFO:tasks.workunit.client.1.vm06.stdout:3/154: write d6/dc/f1d [734741,78493] 0 2026-03-10T06:22:11.238 INFO:tasks.workunit.client.1.vm06.stdout:3/155: readlink d6/l28 0 2026-03-10T06:22:11.239 INFO:tasks.workunit.client.1.vm06.stdout:3/156: read d6/d21/f26 [3249151,41241] 0 2026-03-10T06:22:11.239 INFO:tasks.workunit.client.1.vm06.stdout:1/192: rmdir d9/d1b/d20 39 2026-03-10T06:22:11.240 INFO:tasks.workunit.client.1.vm06.stdout:3/157: stat d6/d21/f30 0 2026-03-10T06:22:11.247 INFO:tasks.workunit.client.1.vm06.stdout:1/193: stat l2 0 2026-03-10T06:22:11.247 INFO:tasks.workunit.client.1.vm06.stdout:4/148: rename dd/d18/f1e to dd/f29 0 2026-03-10T06:22:11.251 INFO:tasks.workunit.client.1.vm06.stdout:3/158: mkdir d6/d21/d38 0 2026-03-10T06:22:11.258 INFO:tasks.workunit.client.1.vm06.stdout:3/159: mkdir d6/d21/d38/d39 0 2026-03-10T06:22:11.262 INFO:tasks.workunit.client.1.vm06.stdout:3/160: mknod d6/d21/c3a 0 2026-03-10T06:22:11.264 INFO:tasks.workunit.client.1.vm06.stdout:2/168: rename da/d13/d2e to da/d13/d1a/d39 0 2026-03-10T06:22:11.270 INFO:tasks.workunit.client.1.vm06.stdout:6/193: rename d6/dd/d35/l13 to d6/df/l3e 0 2026-03-10T06:22:11.279 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:11 vm06.local ceph-mon[58974]: pgmap v4: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:11.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:11 vm06.local ceph-mon[58974]: mgrmap e29: vm04.exdvdb(active, since 2s) 2026-03-10T06:22:11.280 INFO:tasks.workunit.client.1.vm06.stdout:1/194: rename d9/df/c1e to d9/c2e 0 2026-03-10T06:22:11.280 INFO:tasks.workunit.client.1.vm06.stdout:1/195: write d9/f17 [2094559,13083] 0 2026-03-10T06:22:11.282 INFO:tasks.workunit.client.1.vm06.stdout:6/194: rename f1 to d6/dd/d25/f3f 0 2026-03-10T06:22:11.285 INFO:tasks.workunit.client.1.vm06.stdout:6/195: mkdir d6/df/d40 0 2026-03-10T06:22:11.286 INFO:tasks.workunit.client.1.vm06.stdout:6/196: chown d6/d7/l34 66275 1 2026-03-10T06:22:11.286 INFO:tasks.workunit.client.1.vm06.stdout:0/170: dread d0/f9 [0,4194304] 0 2026-03-10T06:22:11.288 INFO:tasks.workunit.client.1.vm06.stdout:9/150: dread fe [0,4194304] 0 2026-03-10T06:22:11.292 INFO:tasks.workunit.client.1.vm06.stdout:0/171: creat d0/dd/d1b/f3f x:0 0 0 2026-03-10T06:22:11.293 INFO:tasks.workunit.client.1.vm06.stdout:1/196: creat d9/f2f x:0 0 0 2026-03-10T06:22:11.294 INFO:tasks.workunit.client.1.vm06.stdout:1/197: write d9/df/f14 [2158871,22126] 0 2026-03-10T06:22:11.295 INFO:tasks.workunit.client.1.vm06.stdout:0/172: chown d0/dd/d14/d18/c38 1206 1 2026-03-10T06:22:11.295 INFO:tasks.workunit.client.1.vm06.stdout:6/197: mknod d6/dd/d25/d2c/c41 0 2026-03-10T06:22:11.305 INFO:tasks.workunit.client.1.vm06.stdout:9/151: creat d21/f33 x:0 0 0 2026-03-10T06:22:11.305 INFO:tasks.workunit.client.1.vm06.stdout:1/198: creat d9/d1b/d20/f30 x:0 0 0 2026-03-10T06:22:11.311 INFO:tasks.workunit.client.1.vm06.stdout:6/198: getdents d6/dd 0 2026-03-10T06:22:11.311 INFO:tasks.workunit.client.1.vm06.stdout:9/152: creat d21/f34 x:0 0 0 2026-03-10T06:22:11.313 INFO:tasks.workunit.client.1.vm06.stdout:6/199: creat d6/df/f42 x:0 0 0 
2026-03-10T06:22:11.314 INFO:tasks.workunit.client.1.vm06.stdout:6/200: mkdir d6/d7/d37/d43 0 2026-03-10T06:22:11.316 INFO:tasks.workunit.client.1.vm06.stdout:6/201: stat d6/df/f1e 0 2026-03-10T06:22:11.317 INFO:tasks.workunit.client.1.vm06.stdout:6/202: creat d6/d7/f44 x:0 0 0 2026-03-10T06:22:11.323 INFO:tasks.workunit.client.1.vm06.stdout:6/203: dwrite d6/df/f1e [0,4194304] 0 2026-03-10T06:22:11.323 INFO:tasks.workunit.client.1.vm06.stdout:6/204: chown d6/c17 14094 1 2026-03-10T06:22:11.330 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:11 vm04.local ceph-mon[51058]: pgmap v4: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:11.330 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:11 vm04.local ceph-mon[51058]: mgrmap e29: vm04.exdvdb(active, since 2s) 2026-03-10T06:22:11.336 INFO:tasks.workunit.client.1.vm06.stdout:9/153: dread f14 [0,4194304] 0 2026-03-10T06:22:11.343 INFO:tasks.workunit.client.1.vm06.stdout:8/164: truncate d1/f1b 2015227 0 2026-03-10T06:22:11.344 INFO:tasks.workunit.client.1.vm06.stdout:2/169: getdents da 0 2026-03-10T06:22:11.344 INFO:tasks.workunit.client.1.vm06.stdout:5/140: write d8/db/fc [642061,125567] 0 2026-03-10T06:22:11.345 INFO:tasks.workunit.client.1.vm06.stdout:2/170: write da/d13/d1a/d39/f2f [730866,3165] 0 2026-03-10T06:22:11.346 INFO:tasks.workunit.client.1.vm06.stdout:9/154: creat d21/d24/f35 x:0 0 0 2026-03-10T06:22:11.348 INFO:tasks.workunit.client.1.vm06.stdout:2/171: dwrite da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:11.353 INFO:tasks.workunit.client.1.vm06.stdout:5/141: write d8/d20/d22/f31 [428835,10301] 0 2026-03-10T06:22:11.358 INFO:tasks.workunit.client.1.vm06.stdout:8/165: dwrite d1/f1c [0,4194304] 0 2026-03-10T06:22:11.369 INFO:tasks.workunit.client.1.vm06.stdout:7/226: dwrite d19/f24 [0,4194304] 0 2026-03-10T06:22:11.372 INFO:tasks.workunit.client.1.vm06.stdout:7/227: chown d19/c3d 11 1 2026-03-10T06:22:11.378 
INFO:tasks.workunit.client.1.vm06.stdout:4/149: truncate dd/d20/f21 1527167 0 2026-03-10T06:22:11.381 INFO:tasks.workunit.client.1.vm06.stdout:9/155: symlink d21/d24/l36 0 2026-03-10T06:22:11.382 INFO:tasks.workunit.client.1.vm06.stdout:9/156: dread - d21/d24/f2f zero size 2026-03-10T06:22:11.382 INFO:tasks.workunit.client.1.vm06.stdout:9/157: dread - d21/f34 zero size 2026-03-10T06:22:11.387 INFO:tasks.workunit.client.1.vm06.stdout:2/172: creat da/d13/d1a/f3a x:0 0 0 2026-03-10T06:22:11.394 INFO:tasks.workunit.client.1.vm06.stdout:6/205: readlink d6/df/l3e 0 2026-03-10T06:22:11.395 INFO:tasks.workunit.client.1.vm06.stdout:5/142: symlink d8/d20/d22/l32 0 2026-03-10T06:22:11.396 INFO:tasks.workunit.client.1.vm06.stdout:8/166: creat d1/df/d20/d21/f36 x:0 0 0 2026-03-10T06:22:11.396 INFO:tasks.workunit.client.1.vm06.stdout:8/167: chown d1/d2c 801545653 1 2026-03-10T06:22:11.397 INFO:tasks.workunit.client.1.vm06.stdout:7/228: mkdir d19/d3b/d41/d42/d4b 0 2026-03-10T06:22:11.398 INFO:tasks.workunit.client.1.vm06.stdout:7/229: write d19/d3b/d41/f48 [340163,73397] 0 2026-03-10T06:22:11.399 INFO:tasks.workunit.client.1.vm06.stdout:7/230: chown d19/f1a 104580755 1 2026-03-10T06:22:11.399 INFO:tasks.workunit.client.1.vm06.stdout:7/231: stat f13 0 2026-03-10T06:22:11.401 INFO:tasks.workunit.client.1.vm06.stdout:6/206: mknod d6/dd/d25/d30/c45 0 2026-03-10T06:22:11.402 INFO:tasks.workunit.client.1.vm06.stdout:2/173: chown da/d13/d1a/l36 123760 1 2026-03-10T06:22:11.403 INFO:tasks.workunit.client.1.vm06.stdout:1/199: truncate d9/f11 667700 0 2026-03-10T06:22:11.403 INFO:tasks.workunit.client.1.vm06.stdout:8/168: dread d1/d7/f24 [0,4194304] 0 2026-03-10T06:22:11.404 INFO:tasks.workunit.client.1.vm06.stdout:4/150: symlink dd/l2a 0 2026-03-10T06:22:11.405 INFO:tasks.workunit.client.1.vm06.stdout:9/158: creat d21/d32/f37 x:0 0 0 2026-03-10T06:22:11.405 INFO:tasks.workunit.client.1.vm06.stdout:2/174: write da/f28 [1041894,70762] 0 2026-03-10T06:22:11.405 
INFO:tasks.workunit.client.1.vm06.stdout:9/159: chown d21/f34 5715575 1 2026-03-10T06:22:11.406 INFO:tasks.workunit.client.1.vm06.stdout:0/173: dwrite d0/f9 [0,4194304] 0 2026-03-10T06:22:11.410 INFO:tasks.workunit.client.1.vm06.stdout:4/151: write dd/d20/f28 [652340,85232] 0 2026-03-10T06:22:11.410 INFO:tasks.workunit.client.1.vm06.stdout:9/160: dread - d21/f33 zero size 2026-03-10T06:22:11.413 INFO:tasks.workunit.client.1.vm06.stdout:0/174: chown d0/dd/d1b/f2f 0 1 2026-03-10T06:22:11.413 INFO:tasks.workunit.client.1.vm06.stdout:9/161: chown d21/f33 187150 1 2026-03-10T06:22:11.414 INFO:tasks.workunit.client.1.vm06.stdout:7/232: dread d19/f35 [0,4194304] 0 2026-03-10T06:22:11.417 INFO:tasks.workunit.client.1.vm06.stdout:6/207: mknod d6/df/c46 0 2026-03-10T06:22:11.417 INFO:tasks.workunit.client.1.vm06.stdout:7/233: write d19/f2f [677214,1714] 0 2026-03-10T06:22:11.418 INFO:tasks.workunit.client.1.vm06.stdout:7/234: stat d19/d3b/d41 0 2026-03-10T06:22:11.420 INFO:tasks.workunit.client.1.vm06.stdout:2/175: dwrite da/f28 [0,4194304] 0 2026-03-10T06:22:11.420 INFO:tasks.workunit.client.1.vm06.stdout:9/162: mknod d21/d27/c38 0 2026-03-10T06:22:11.426 INFO:tasks.workunit.client.1.vm06.stdout:4/152: symlink dd/l2b 0 2026-03-10T06:22:11.426 INFO:tasks.workunit.client.1.vm06.stdout:6/208: dwrite d6/d7/f20 [0,4194304] 0 2026-03-10T06:22:11.430 INFO:tasks.workunit.client.1.vm06.stdout:4/153: write f8 [3736859,116669] 0 2026-03-10T06:22:11.430 INFO:tasks.workunit.client.1.vm06.stdout:0/175: creat d0/dd/d1b/d3d/f40 x:0 0 0 2026-03-10T06:22:11.432 INFO:tasks.workunit.client.1.vm06.stdout:1/200: creat d9/d1b/f31 x:0 0 0 2026-03-10T06:22:11.432 INFO:tasks.workunit.client.1.vm06.stdout:1/201: stat d9/df 0 2026-03-10T06:22:11.432 INFO:tasks.workunit.client.1.vm06.stdout:9/163: creat d21/d27/f39 x:0 0 0 2026-03-10T06:22:11.436 INFO:tasks.workunit.client.1.vm06.stdout:2/176: write da/d13/d1a/f33 [1502746,115688] 0 2026-03-10T06:22:11.438 
INFO:tasks.workunit.client.1.vm06.stdout:6/209: write d6/dd/d25/d2c/f32 [418050,27355] 0 2026-03-10T06:22:11.442 INFO:tasks.workunit.client.1.vm06.stdout:3/161: fsync d6/d21/f30 0 2026-03-10T06:22:11.445 INFO:tasks.workunit.client.1.vm06.stdout:1/202: dwrite d9/d1b/d20/f24 [0,4194304] 0 2026-03-10T06:22:11.445 INFO:tasks.workunit.client.1.vm06.stdout:8/169: dread f0 [0,4194304] 0 2026-03-10T06:22:11.445 INFO:tasks.workunit.client.1.vm06.stdout:2/177: dwrite da/d13/d1c/f29 [0,4194304] 0 2026-03-10T06:22:11.448 INFO:tasks.workunit.client.1.vm06.stdout:9/164: mkdir d21/d27/d3a 0 2026-03-10T06:22:11.449 INFO:tasks.workunit.client.1.vm06.stdout:4/154: mknod dd/d24/c2c 0 2026-03-10T06:22:11.450 INFO:tasks.workunit.client.1.vm06.stdout:4/155: fsync dd/d18/f1d 0 2026-03-10T06:22:11.450 INFO:tasks.workunit.client.1.vm06.stdout:4/156: chown dd/d18/f23 1886 1 2026-03-10T06:22:11.451 INFO:tasks.workunit.client.1.vm06.stdout:4/157: write dd/d20/f28 [997062,83723] 0 2026-03-10T06:22:11.459 INFO:tasks.workunit.client.1.vm06.stdout:1/203: read d9/f1a [1320646,52014] 0 2026-03-10T06:22:11.463 INFO:tasks.workunit.client.1.vm06.stdout:9/165: read d21/f22 [2455767,49615] 0 2026-03-10T06:22:11.471 INFO:tasks.workunit.client.1.vm06.stdout:0/176: getdents d0/dd/d14/d1d 0 2026-03-10T06:22:11.475 INFO:tasks.workunit.client.1.vm06.stdout:0/177: creat d0/dd/d2d/d39/f41 x:0 0 0 2026-03-10T06:22:11.478 INFO:tasks.workunit.client.1.vm06.stdout:2/178: rename da/d13/d1a/f33 to da/d13/d1c/f3b 0 2026-03-10T06:22:11.481 INFO:tasks.workunit.client.1.vm06.stdout:4/158: dwrite dd/f29 [0,4194304] 0 2026-03-10T06:22:11.482 INFO:tasks.workunit.client.1.vm06.stdout:2/179: chown da/l30 129581 1 2026-03-10T06:22:11.485 INFO:tasks.workunit.client.1.vm06.stdout:2/180: creat da/d13/d1a/d39/f3c x:0 0 0 2026-03-10T06:22:11.487 INFO:tasks.workunit.client.1.vm06.stdout:2/181: fdatasync da/d13/f1f 0 2026-03-10T06:22:11.495 INFO:tasks.workunit.client.1.vm06.stdout:2/182: chown da/d13/d1a/f27 43833088 1 
2026-03-10T06:22:11.537 INFO:tasks.workunit.client.1.vm06.stdout:3/162: sync 2026-03-10T06:22:11.542 INFO:tasks.workunit.client.1.vm06.stdout:3/163: chown d6/dc/d2a/c34 1018949486 1 2026-03-10T06:22:11.544 INFO:tasks.workunit.client.1.vm06.stdout:3/164: creat d6/dc/d13/d35/f3b x:0 0 0 2026-03-10T06:22:11.544 INFO:tasks.workunit.client.1.vm06.stdout:3/165: write d6/dc/d13/f37 [884400,100117] 0 2026-03-10T06:22:11.546 INFO:tasks.workunit.client.1.vm06.stdout:2/183: sync 2026-03-10T06:22:11.554 INFO:tasks.workunit.client.1.vm06.stdout:3/166: symlink d6/dc/d13/l3c 0 2026-03-10T06:22:11.554 INFO:tasks.workunit.client.1.vm06.stdout:2/184: fdatasync f5 0 2026-03-10T06:22:11.554 INFO:tasks.workunit.client.1.vm06.stdout:3/167: creat d6/d21/d38/f3d x:0 0 0 2026-03-10T06:22:11.574 INFO:tasks.workunit.client.1.vm06.stdout:5/143: truncate d8/d9/d1e/f23 244832 0 2026-03-10T06:22:11.575 INFO:tasks.workunit.client.1.vm06.stdout:5/144: stat d8/d20/d22/c2d 0 2026-03-10T06:22:11.579 INFO:tasks.workunit.client.1.vm06.stdout:5/145: dwrite d8/d9/f11 [0,4194304] 0 2026-03-10T06:22:11.580 INFO:tasks.workunit.client.1.vm06.stdout:5/146: truncate d8/d9/d1e/f2f 504045 0 2026-03-10T06:22:11.585 INFO:tasks.workunit.client.1.vm06.stdout:5/147: mknod d8/d9/c33 0 2026-03-10T06:22:11.590 INFO:tasks.workunit.client.1.vm06.stdout:5/148: dread d8/d20/d22/f31 [0,4194304] 0 2026-03-10T06:22:11.595 INFO:tasks.workunit.client.1.vm06.stdout:5/149: dwrite d8/d9/d1e/f29 [4194304,4194304] 0 2026-03-10T06:22:11.595 INFO:tasks.workunit.client.1.vm06.stdout:5/150: chown d8/ff 912 1 2026-03-10T06:22:11.596 INFO:tasks.workunit.client.1.vm06.stdout:5/151: chown d8/d20/d22/l32 2315841 1 2026-03-10T06:22:11.597 INFO:tasks.workunit.client.1.vm06.stdout:5/152: write d8/db/f1f [3952520,90708] 0 2026-03-10T06:22:11.598 INFO:tasks.workunit.client.1.vm06.stdout:5/153: write d8/d9/d1e/f29 [5809193,49940] 0 2026-03-10T06:22:11.600 INFO:tasks.workunit.client.1.vm06.stdout:5/154: mknod d8/c34 0 2026-03-10T06:22:11.600 
INFO:tasks.workunit.client.1.vm06.stdout:5/155: stat d8/db/f18 0
2026-03-10T06:22:11.601 INFO:tasks.workunit.client.1.vm06.stdout:5/156: unlink d8/f1a 0
2026-03-10T06:22:11.609 INFO:tasks.workunit.client.1.vm06.stdout:5/157: dwrite d8/d9/f11 [0,4194304] 0
2026-03-10T06:22:11.620 INFO:tasks.workunit.client.1.vm06.stdout:5/158: dread d8/db/f25 [0,4194304] 0
2026-03-10T06:22:11.620 INFO:tasks.workunit.client.1.vm06.stdout:5/159: creat d8/d20/f35 x:0 0 0
2026-03-10T06:22:11.620 INFO:tasks.workunit.client.1.vm06.stdout:5/160: truncate d8/d9/d1e/f2f 829078 0
2026-03-10T06:22:11.620 INFO:tasks.workunit.client.1.vm06.stdout:5/161: creat d8/d9/d1e/f36 x:0 0 0
2026-03-10T06:22:11.705 INFO:tasks.workunit.client.1.vm06.stdout:7/235: truncate f15 3874418 0
2026-03-10T06:22:11.705 INFO:tasks.workunit.client.1.vm06.stdout:7/236: fdatasync d19/f2f 0
2026-03-10T06:22:11.711 INFO:tasks.workunit.client.1.vm06.stdout:0/178: rename d0/dd/d2d/d39 to d0/d3c/d42 0
2026-03-10T06:22:11.713 INFO:tasks.workunit.client.1.vm06.stdout:2/185: getdents da/d13/d1c 0
2026-03-10T06:22:11.713 INFO:tasks.workunit.client.1.vm06.stdout:1/204: dwrite d9/f1a [0,4194304] 0
2026-03-10T06:22:11.713 INFO:tasks.workunit.client.1.vm06.stdout:6/210: dwrite d6/dd/d25/f3f [8388608,4194304] 0
2026-03-10T06:22:11.713 INFO:tasks.workunit.client.1.vm06.stdout:2/186: stat da/fe 0
2026-03-10T06:22:11.717 INFO:tasks.workunit.client.1.vm06.stdout:9/166: dwrite fd [0,4194304] 0
2026-03-10T06:22:11.720 INFO:tasks.workunit.client.1.vm06.stdout:2/187: fsync da/d13/d1a/f3a 0
2026-03-10T06:22:11.724 INFO:tasks.workunit.client.1.vm06.stdout:4/159: rename dd/d20 to dd/d24/d2d 0
2026-03-10T06:22:11.724 INFO:tasks.workunit.client.1.vm06.stdout:9/167: symlink d21/d24/l3b 0
2026-03-10T06:22:11.725 INFO:tasks.workunit.client.1.vm06.stdout:6/211: unlink d6/d7/fb 0
2026-03-10T06:22:11.726 INFO:tasks.workunit.client.1.vm06.stdout:6/212: truncate d6/dd/d35/f2d 4525623 0
2026-03-10T06:22:11.728 INFO:tasks.workunit.client.1.vm06.stdout:2/188: mknod da/c3d 0
2026-03-10T06:22:11.729 INFO:tasks.workunit.client.1.vm06.stdout:6/213: write d6/d7/f36 [682080,80914] 0
2026-03-10T06:22:11.729 INFO:tasks.workunit.client.1.vm06.stdout:6/214: chown d6/c17 95304 1
2026-03-10T06:22:11.733 INFO:tasks.workunit.client.1.vm06.stdout:0/179: symlink d0/l43 0
2026-03-10T06:22:11.736 INFO:tasks.workunit.client.1.vm06.stdout:4/160: dread dd/f14 [0,4194304] 0
2026-03-10T06:22:11.744 INFO:tasks.workunit.client.1.vm06.stdout:9/168: mknod d21/d24/c3c 0
2026-03-10T06:22:11.746 INFO:tasks.workunit.client.1.vm06.stdout:1/205: mkdir d9/d32 0
2026-03-10T06:22:11.747 INFO:tasks.workunit.client.1.vm06.stdout:1/206: truncate d9/f2f 592844 0
2026-03-10T06:22:11.747 INFO:tasks.workunit.client.1.vm06.stdout:6/215: mkdir d6/d7/d47 0
2026-03-10T06:22:11.747 INFO:tasks.workunit.client.1.vm06.stdout:1/207: stat d9/f17 0
2026-03-10T06:22:11.750 INFO:tasks.workunit.client.1.vm06.stdout:9/169: unlink d21/d24/c26 0
2026-03-10T06:22:11.752 INFO:tasks.workunit.client.1.vm06.stdout:2/189: mkdir da/d13/d1c/d1d/d3e 0
2026-03-10T06:22:11.756 INFO:tasks.workunit.client.1.vm06.stdout:1/208: symlink d9/d1b/d20/l33 0
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:1/209: fdatasync d9/df/f14 0
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:9/170: mknod d21/c3d 0
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:0/180: rename d0/dd/d14/f28 to d0/dd/f44 0
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:9/171: write d21/d24/f2f [729837,130770] 0
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:0/181: chown d0/d3c 2741956 1
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:1/210: dread - d9/d1b/d20/f30 zero size
2026-03-10T06:22:11.765 INFO:tasks.workunit.client.1.vm06.stdout:4/161: link c5 dd/d18/c2e 0
2026-03-10T06:22:11.769 INFO:tasks.workunit.client.1.vm06.stdout:9/172: rename d21/d24/f35 to d21/f3e 0
2026-03-10T06:22:11.770 INFO:tasks.workunit.client.1.vm06.stdout:0/182: mknod d0/dd/c45 0
2026-03-10T06:22:11.770 INFO:tasks.workunit.client.1.vm06.stdout:9/173: truncate f1b 574468 0
2026-03-10T06:22:11.774 INFO:tasks.workunit.client.1.vm06.stdout:9/174: dread - d21/d24/f2e zero size
2026-03-10T06:22:11.774 INFO:tasks.workunit.client.1.vm06.stdout:1/211: creat d9/f34 x:0 0 0
2026-03-10T06:22:11.776 INFO:tasks.workunit.client.1.vm06.stdout:9/175: creat d21/d32/f3f x:0 0 0
2026-03-10T06:22:11.777 INFO:tasks.workunit.client.1.vm06.stdout:1/212: fdatasync d9/fe 0
2026-03-10T06:22:11.780 INFO:tasks.workunit.client.1.vm06.stdout:9/176: chown c16 95190 1
2026-03-10T06:22:11.784 INFO:tasks.workunit.client.1.vm06.stdout:4/162: dwrite dd/d24/d2d/f21 [0,4194304] 0
2026-03-10T06:22:11.784 INFO:tasks.workunit.client.1.vm06.stdout:1/213: dread d9/f1a [0,4194304] 0
2026-03-10T06:22:11.795 INFO:tasks.workunit.client.1.vm06.stdout:9/177: mkdir d21/d27/d3a/d40 0
2026-03-10T06:22:11.797 INFO:tasks.workunit.client.1.vm06.stdout:1/214: rename d9/d32 to d9/d35 0
2026-03-10T06:22:11.799 INFO:tasks.workunit.client.1.vm06.stdout:4/163: dwrite f0 [0,4194304] 0
2026-03-10T06:22:11.799 INFO:tasks.workunit.client.1.vm06.stdout:9/178: rename c16 to d21/d27/c41 0
2026-03-10T06:22:11.800 INFO:tasks.workunit.client.1.vm06.stdout:1/215: fdatasync d9/d1b/f31 0
2026-03-10T06:22:11.801 INFO:tasks.workunit.client.1.vm06.stdout:9/179: dread - d21/d32/f37 zero size
2026-03-10T06:22:11.802 INFO:tasks.workunit.client.1.vm06.stdout:4/164: mkdir dd/d24/d2d/d2f 0
2026-03-10T06:22:11.802 INFO:tasks.workunit.client.1.vm06.stdout:1/216: truncate d9/d1b/f31 573225 0
2026-03-10T06:22:11.804 INFO:tasks.workunit.client.1.vm06.stdout:4/165: symlink dd/d18/l30 0
2026-03-10T06:22:11.805 INFO:tasks.workunit.client.1.vm06.stdout:9/180: rmdir d21/d27/d3a/d40 0
2026-03-10T06:22:11.807 INFO:tasks.workunit.client.1.vm06.stdout:3/168: write d6/d21/f26 [3642630,16329] 0
2026-03-10T06:22:11.808 INFO:tasks.workunit.client.1.vm06.stdout:8/170: link d1/f1b d1/df/d20/d21/f37 0
2026-03-10T06:22:11.808 INFO:tasks.workunit.client.1.vm06.stdout:9/181: fdatasync f14 0
2026-03-10T06:22:11.808 INFO:tasks.workunit.client.1.vm06.stdout:3/169: fdatasync d6/d8/f22 0
2026-03-10T06:22:11.808 INFO:tasks.workunit.client.1.vm06.stdout:2/190: fdatasync da/f11 0
2026-03-10T06:22:11.809 INFO:tasks.workunit.client.1.vm06.stdout:8/171: truncate d1/f5 2585486 0
2026-03-10T06:22:11.811 INFO:tasks.workunit.client.1.vm06.stdout:3/170: fdatasync d6/d21/d38/f3d 0
2026-03-10T06:22:11.815 INFO:tasks.workunit.client.1.vm06.stdout:4/166: sync
2026-03-10T06:22:11.815 INFO:tasks.workunit.client.1.vm06.stdout:1/217: sync
2026-03-10T06:22:11.825 INFO:tasks.workunit.client.1.vm06.stdout:9/182: dread f19 [0,4194304] 0
2026-03-10T06:22:11.826 INFO:tasks.workunit.client.1.vm06.stdout:3/171: symlink d6/d21/d38/l3e 0
2026-03-10T06:22:11.829 INFO:tasks.workunit.client.1.vm06.stdout:3/172: write d6/d21/f2c [3541771,30853] 0
2026-03-10T06:22:11.835 INFO:tasks.workunit.client.1.vm06.stdout:9/183: mknod d21/d27/d3a/c42 0
2026-03-10T06:22:11.835 INFO:tasks.workunit.client.1.vm06.stdout:4/167: symlink dd/d24/d2d/d2f/l31 0
2026-03-10T06:22:11.844 INFO:tasks.workunit.client.1.vm06.stdout:3/173: dwrite d6/d8/fb [0,4194304] 0
2026-03-10T06:22:11.846 INFO:tasks.workunit.client.1.vm06.stdout:9/184: write f19 [1882211,46617] 0
2026-03-10T06:22:11.846 INFO:tasks.workunit.client.1.vm06.stdout:9/185: dread fd [0,4194304] 0
2026-03-10T06:22:11.848 INFO:tasks.workunit.client.1.vm06.stdout:4/168: creat dd/d18/f32 x:0 0 0
2026-03-10T06:22:11.853 INFO:tasks.workunit.client.1.vm06.stdout:4/169: fdatasync dd/d18/f23 0
2026-03-10T06:22:11.856 INFO:tasks.workunit.client.1.vm06.stdout:3/174: dread d6/dc/f1d [0,4194304] 0
2026-03-10T06:22:11.860 INFO:tasks.workunit.client.1.vm06.stdout:3/175: creat d6/dc/f3f x:0 0 0
2026-03-10T06:22:11.861 INFO:tasks.workunit.client.1.vm06.stdout:3/176: write d6/dc/d13/d35/f3b [319256,126235] 0
2026-03-10T06:22:11.862 INFO:tasks.workunit.client.1.vm06.stdout:3/177: mknod d6/d1a/c40 0
2026-03-10T06:22:11.866 INFO:tasks.workunit.client.1.vm06.stdout:3/178: mkdir d6/dc/d41 0
2026-03-10T06:22:11.893 INFO:tasks.workunit.client.1.vm06.stdout:5/162: write d8/ff [721949,74180] 0
2026-03-10T06:22:11.893 INFO:tasks.workunit.client.1.vm06.stdout:6/216: dread d6/d7/f1a [0,4194304] 0
2026-03-10T06:22:11.896 INFO:tasks.workunit.client.1.vm06.stdout:4/170: getdents dd 0
2026-03-10T06:22:11.899 INFO:tasks.workunit.client.1.vm06.stdout:6/217: mknod d6/dd/d25/d33/c48 0
2026-03-10T06:22:11.901 INFO:tasks.workunit.client.1.vm06.stdout:7/237: dread d19/d3b/f47 [0,4194304] 0
2026-03-10T06:22:11.902 INFO:tasks.workunit.client.1.vm06.stdout:5/163: sync
2026-03-10T06:22:11.903 INFO:tasks.workunit.client.1.vm06.stdout:7/238: fsync d19/f27 0
2026-03-10T06:22:11.905 INFO:tasks.workunit.client.1.vm06.stdout:5/164: fsync d8/d9/d1e/f29 0
2026-03-10T06:22:11.909 INFO:tasks.workunit.client.1.vm06.stdout:6/218: rename d6/dd/d35/f24 to d6/dd/d25/f49 0
2026-03-10T06:22:11.910 INFO:tasks.workunit.client.1.vm06.stdout:7/239: mkdir d19/d3b/d41/d4c 0
2026-03-10T06:22:11.910 INFO:tasks.workunit.client.1.vm06.stdout:2/191: dread da/fe [4194304,4194304] 0
2026-03-10T06:22:11.913 INFO:tasks.workunit.client.1.vm06.stdout:4/171: dwrite dd/d18/f1d [0,4194304] 0
2026-03-10T06:22:11.913 INFO:tasks.workunit.client.1.vm06.stdout:3/179: fdatasync d6/dc/d13/d35/f3b 0
2026-03-10T06:22:11.914 INFO:tasks.workunit.client.1.vm06.stdout:7/240: write d19/d3b/f3c [988928,129357] 0
2026-03-10T06:22:11.916 INFO:tasks.workunit.client.1.vm06.stdout:5/165: rmdir d8/d9 39
2026-03-10T06:22:11.917 INFO:tasks.workunit.client.1.vm06.stdout:6/219: truncate d6/dd/d35/f2d 4876246 0
2026-03-10T06:22:11.919 INFO:tasks.workunit.client.1.vm06.stdout:2/192: fdatasync da/f11 0
2026-03-10T06:22:11.920 INFO:tasks.workunit.client.1.vm06.stdout:4/172: mkdir dd/d33 0
2026-03-10T06:22:11.925 INFO:tasks.workunit.client.1.vm06.stdout:6/220: symlink d6/df/d40/l4a 0
2026-03-10T06:22:11.925 INFO:tasks.workunit.client.1.vm06.stdout:6/221: stat d6/d7 0
2026-03-10T06:22:11.929 INFO:tasks.workunit.client.1.vm06.stdout:5/166: creat d8/d9/d1e/f37 x:0 0 0
2026-03-10T06:22:11.931 INFO:tasks.workunit.client.1.vm06.stdout:2/193: dwrite da/f19 [4194304,4194304] 0
2026-03-10T06:22:11.931 INFO:tasks.workunit.client.1.vm06.stdout:5/167: dread d8/d9/d1e/f2f [0,4194304] 0
2026-03-10T06:22:11.933 INFO:tasks.workunit.client.1.vm06.stdout:2/194: chown f8 1128 1
2026-03-10T06:22:11.933 INFO:tasks.workunit.client.1.vm06.stdout:7/241: rename d19/f26 to d19/f4d 0
2026-03-10T06:22:11.934 INFO:tasks.workunit.client.1.vm06.stdout:2/195: dread - da/d13/d1a/f3a zero size
2026-03-10T06:22:11.942 INFO:tasks.workunit.client.1.vm06.stdout:4/173: unlink dd/l2b 0
2026-03-10T06:22:11.943 INFO:tasks.workunit.client.1.vm06.stdout:1/218: truncate d9/df/f14 2938292 0
2026-03-10T06:22:11.943 INFO:tasks.workunit.client.1.vm06.stdout:2/196: read - da/d13/d1a/f3a zero size
2026-03-10T06:22:11.943 INFO:tasks.workunit.client.1.vm06.stdout:2/197: stat da/d13/d1c/d1d/d3e 0
2026-03-10T06:22:11.943 INFO:tasks.workunit.client.1.vm06.stdout:5/168: unlink d8/d9/d1e/f28 0
2026-03-10T06:22:11.943 INFO:tasks.workunit.client.1.vm06.stdout:5/169: stat d8/d9/f11 0
2026-03-10T06:22:11.946 INFO:tasks.workunit.client.1.vm06.stdout:0/183: truncate d0/dd/d2d/f36 22812 0
2026-03-10T06:22:11.946 INFO:tasks.workunit.client.1.vm06.stdout:7/242: unlink d19/c3a 0
2026-03-10T06:22:11.951 INFO:tasks.workunit.client.1.vm06.stdout:1/219: mkdir d9/d35/d36 0
2026-03-10T06:22:11.953 INFO:tasks.workunit.client.1.vm06.stdout:1/220: dread - d9/d1b/d20/f25 zero size
2026-03-10T06:22:11.953 INFO:tasks.workunit.client.1.vm06.stdout:1/221: read d9/f2f [114059,50708] 0
2026-03-10T06:22:11.955 INFO:tasks.workunit.client.1.vm06.stdout:7/243: dread f15 [0,4194304] 0
2026-03-10T06:22:11.958 INFO:tasks.workunit.client.1.vm06.stdout:2/198: fsync f5 0
2026-03-10T06:22:11.959 INFO:tasks.workunit.client.1.vm06.stdout:6/222: dwrite d6/dd/f18 [0,4194304] 0
2026-03-10T06:22:11.962 INFO:tasks.workunit.client.1.vm06.stdout:6/223: read - d6/f1b zero size
2026-03-10T06:22:11.966 INFO:tasks.workunit.client.1.vm06.stdout:5/170: rename d8/d9/d1e/c26 to d8/d20/d22/c38 0
2026-03-10T06:22:11.971 INFO:tasks.workunit.client.1.vm06.stdout:7/244: dread d19/d3b/d41/f49 [0,4194304] 0
2026-03-10T06:22:11.971 INFO:tasks.workunit.client.1.vm06.stdout:1/222: unlink d9/df/f10 0
2026-03-10T06:22:11.974 INFO:tasks.workunit.client.1.vm06.stdout:4/174: mkdir dd/d24/d2d/d2f/d34 0
2026-03-10T06:22:11.975 INFO:tasks.workunit.client.1.vm06.stdout:2/199: mknod da/d13/d1a/d39/d35/c3f 0
2026-03-10T06:22:11.975 INFO:tasks.workunit.client.1.vm06.stdout:5/171: mkdir d8/d20/d22/d39 0
2026-03-10T06:22:11.976 INFO:tasks.workunit.client.1.vm06.stdout:5/172: write d8/db/fc [126444,41022] 0
2026-03-10T06:22:11.981 INFO:tasks.workunit.client.1.vm06.stdout:5/173: fdatasync d8/d9/d1e/f37 0
2026-03-10T06:22:11.982 INFO:tasks.workunit.client.1.vm06.stdout:6/224: dwrite d6/dd/d35/f2d [4194304,4194304] 0
2026-03-10T06:22:11.992 INFO:tasks.workunit.client.1.vm06.stdout:2/200: mknod da/d13/d1a/d39/d35/c40 0
2026-03-10T06:22:11.994 INFO:tasks.workunit.client.1.vm06.stdout:0/184: dread d0/dd/d14/d18/f30 [0,4194304] 0
2026-03-10T06:22:12.000 INFO:tasks.workunit.client.1.vm06.stdout:2/201: dwrite da/d13/d1a/f3a [0,4194304] 0
2026-03-10T06:22:12.003 INFO:tasks.workunit.client.1.vm06.stdout:2/202: write da/d13/d1a/f27 [2512098,1897] 0
2026-03-10T06:22:12.004 INFO:tasks.workunit.client.1.vm06.stdout:5/174: symlink d8/d9/l3a 0
2026-03-10T06:22:12.005 INFO:tasks.workunit.client.1.vm06.stdout:1/223: mknod d9/df/c37 0
2026-03-10T06:22:12.005 INFO:tasks.workunit.client.1.vm06.stdout:6/225: mknod d6/dd/d25/d2c/c4b 0
2026-03-10T06:22:12.013 INFO:tasks.workunit.client.1.vm06.stdout:0/185: dwrite d0/f5 [0,4194304] 0
2026-03-10T06:22:12.017 INFO:tasks.workunit.client.1.vm06.stdout:6/226: rmdir d6/dd 39
2026-03-10T06:22:12.018 INFO:tasks.workunit.client.1.vm06.stdout:6/227: write f3 [986101,113259] 0
2026-03-10T06:22:12.018 INFO:tasks.workunit.client.1.vm06.stdout:6/228: chown d6/d7/l34 0 1
2026-03-10T06:22:12.019 INFO:tasks.workunit.client.1.vm06.stdout:4/175: dread dd/d24/d2d/f21 [0,4194304] 0
2026-03-10T06:22:12.024 INFO:tasks.workunit.client.1.vm06.stdout:4/176: write f2 [12396773,128648] 0
2026-03-10T06:22:12.028 INFO:tasks.workunit.client.1.vm06.stdout:6/229: dread d6/d7/f20 [0,4194304] 0
2026-03-10T06:22:12.029 INFO:tasks.workunit.client.1.vm06.stdout:0/186: link d0/ff d0/f46 0
2026-03-10T06:22:12.029 INFO:tasks.workunit.client.1.vm06.stdout:0/187: chown d0/dd/d14/d18/c29 1408643 1
2026-03-10T06:22:12.029 INFO:tasks.workunit.client.1.vm06.stdout:0/188: mkdir d0/dd/d2d/d47 0
2026-03-10T06:22:12.029 INFO:tasks.workunit.client.1.vm06.stdout:0/189: write d0/dd/d14/f31 [1408250,115391] 0
2026-03-10T06:22:12.030 INFO:tasks.workunit.client.1.vm06.stdout:2/203: sync
2026-03-10T06:22:12.032 INFO:tasks.workunit.client.1.vm06.stdout:4/177: creat dd/d24/d2d/f35 x:0 0 0
2026-03-10T06:22:12.037 INFO:tasks.workunit.client.1.vm06.stdout:4/178: fsync dd/f14 0
2026-03-10T06:22:12.043 INFO:tasks.workunit.client.1.vm06.stdout:6/230: dwrite d6/dd/d35/f2d [0,4194304] 0
2026-03-10T06:22:12.053 INFO:tasks.workunit.client.1.vm06.stdout:2/204: rmdir da/d13/d1c/d1d 39
2026-03-10T06:22:12.061 INFO:tasks.workunit.client.1.vm06.stdout:6/231: creat d6/dd/d25/d2c/f4c x:0 0 0
2026-03-10T06:22:12.061 INFO:tasks.workunit.client.1.vm06.stdout:0/190: creat d0/dd/f48 x:0 0 0
2026-03-10T06:22:12.065 INFO:tasks.workunit.client.1.vm06.stdout:6/232: mkdir d6/dd/d25/d33/d4d 0
2026-03-10T06:22:12.066 INFO:tasks.workunit.client.1.vm06.stdout:2/205: creat da/d13/d1c/f41 x:0 0 0
2026-03-10T06:22:12.082 INFO:tasks.workunit.client.1.vm06.stdout:6/233: sync
2026-03-10T06:22:12.086 INFO:tasks.workunit.client.1.vm06.stdout:8/172: dwrite d1/f1b [0,4194304] 0
2026-03-10T06:22:12.089 INFO:tasks.workunit.client.1.vm06.stdout:3/180: dwrite d6/d21/f2c [4194304,4194304] 0
2026-03-10T06:22:12.089 INFO:tasks.workunit.client.1.vm06.stdout:3/181: stat d6/d8 0
2026-03-10T06:22:12.092 INFO:tasks.workunit.client.1.vm06.stdout:3/182: sync
2026-03-10T06:22:12.093 INFO:tasks.workunit.client.1.vm06.stdout:6/234: rename d6/dd/d25/d30 to d6/dd/d25/d4e 0
2026-03-10T06:22:12.094 INFO:tasks.workunit.client.1.vm06.stdout:3/183: chown d6/d21/d38 100 1
2026-03-10T06:22:12.102 INFO:tasks.workunit.client.1.vm06.stdout:6/235: symlink d6/df/d40/l4f 0
2026-03-10T06:22:12.105 INFO:tasks.workunit.client.1.vm06.stdout:9/186: dwrite fe [0,4194304] 0
2026-03-10T06:22:12.105 INFO:tasks.workunit.client.1.vm06.stdout:9/187: readlink l3 0
2026-03-10T06:22:12.107 INFO:tasks.workunit.client.1.vm06.stdout:8/173: chown d1/df/d20/d21/c28 173 1
2026-03-10T06:22:12.109 INFO:tasks.workunit.client.1.vm06.stdout:3/184: creat d6/f42 x:0 0 0
2026-03-10T06:22:12.117 INFO:tasks.workunit.client.1.vm06.stdout:3/185: mknod d6/d21/c43 0
2026-03-10T06:22:12.120 INFO:tasks.workunit.client.1.vm06.stdout:6/236: rmdir d6/d7/d47 0
2026-03-10T06:22:12.120 INFO:tasks.workunit.client.1.vm06.stdout:4/179: dread f6 [0,4194304] 0
2026-03-10T06:22:12.121 INFO:tasks.workunit.client.1.vm06.stdout:9/188: rename d21/l2b to d21/l43 0
2026-03-10T06:22:12.121 INFO:tasks.workunit.client.1.vm06.stdout:8/174: getdents d1/d2c 0
2026-03-10T06:22:12.121 INFO:tasks.workunit.client.1.vm06.stdout:9/189: chown d21/f22 11 1
2026-03-10T06:22:12.130 INFO:tasks.workunit.client.1.vm06.stdout:6/237: symlink d6/d7/d37/l50 0
2026-03-10T06:22:12.135 INFO:tasks.workunit.client.1.vm06.stdout:0/191: rename d0/dd/d2d/f36 to d0/dd/f49 0
2026-03-10T06:22:12.135 INFO:tasks.workunit.client.1.vm06.stdout:0/192: chown d0/l37 2 1
2026-03-10T06:22:12.137 INFO:tasks.workunit.client.1.vm06.stdout:7/245: getdents d19 0
2026-03-10T06:22:12.137 INFO:tasks.workunit.client.1.vm06.stdout:9/190: dwrite d21/d32/f37 [0,4194304] 0
2026-03-10T06:22:12.141 INFO:tasks.workunit.client.1.vm06.stdout:5/175: rmdir d8/d9/d1e 39
2026-03-10T06:22:12.141 INFO:tasks.workunit.client.1.vm06.stdout:1/224: dwrite d9/df/f14 [0,4194304] 0
2026-03-10T06:22:12.145 INFO:tasks.workunit.client.1.vm06.stdout:7/246: sync
2026-03-10T06:22:12.146 INFO:tasks.workunit.client.1.vm06.stdout:8/175: creat d1/df/d20/d21/f38 x:0 0 0
2026-03-10T06:22:12.153 INFO:tasks.workunit.client.1.vm06.stdout:4/180: unlink dd/d18/l30 0
2026-03-10T06:22:12.153 INFO:tasks.workunit.client.1.vm06.stdout:4/181: chown dd/d24/c2c 526 1
2026-03-10T06:22:12.156 INFO:tasks.workunit.client.1.vm06.stdout:6/238: symlink d6/df/l51 0
2026-03-10T06:22:12.157 INFO:tasks.workunit.client.1.vm06.stdout:6/239: read d6/df/f1e [3565225,68045] 0
2026-03-10T06:22:12.168 INFO:tasks.workunit.client.1.vm06.stdout:0/193: symlink d0/d3c/d42/l4a 0
2026-03-10T06:22:12.170 INFO:tasks.workunit.client.1.vm06.stdout:0/194: chown d0/ff 1 1
2026-03-10T06:22:12.171 INFO:tasks.workunit.client.1.vm06.stdout:9/191: fdatasync f9 0
2026-03-10T06:22:12.188 INFO:tasks.workunit.client.1.vm06.stdout:3/186: creat d6/f44 x:0 0 0
2026-03-10T06:22:12.189 INFO:tasks.workunit.client.1.vm06.stdout:6/240: creat d6/dd/d25/d33/f52 x:0 0 0
2026-03-10T06:22:12.189 INFO:tasks.workunit.client.1.vm06.stdout:8/176: dwrite d1/df/d20/d21/f2b [0,4194304] 0
2026-03-10T06:22:12.189 INFO:tasks.workunit.client.1.vm06.stdout:6/241: dwrite d6/dd/d25/d2c/f32 [0,4194304] 0
2026-03-10T06:22:12.191 INFO:tasks.workunit.client.1.vm06.stdout:0/195: sync
2026-03-10T06:22:12.191 INFO:tasks.workunit.client.1.vm06.stdout:9/192: sync
2026-03-10T06:22:12.203 INFO:tasks.workunit.client.1.vm06.stdout:1/225: mkdir d9/d35/d36/d38 0
2026-03-10T06:22:12.204 INFO:tasks.workunit.client.1.vm06.stdout:7/247: link d19/f20 d19/d3b/d41/d4c/f4e 0
2026-03-10T06:22:12.209 INFO:tasks.workunit.client.1.vm06.stdout:4/182: mkdir dd/d33/d36 0
2026-03-10T06:22:12.210 INFO:tasks.workunit.client.1.vm06.stdout:4/183: chown dd/d24 35 1
2026-03-10T06:22:12.211 INFO:tasks.workunit.client.1.vm06.stdout:3/187: rename d6/dc/d13/f37 to d6/d8/f45 0
2026-03-10T06:22:12.213 INFO:tasks.workunit.client.1.vm06.stdout:8/177: mknod d1/df/d20/c39 0
2026-03-10T06:22:12.215 INFO:tasks.workunit.client.1.vm06.stdout:9/193: rmdir d21/d27/d3a 39
2026-03-10T06:22:12.216 INFO:tasks.workunit.client.1.vm06.stdout:7/248: unlink d19/f28 0
2026-03-10T06:22:12.219 INFO:tasks.workunit.client.1.vm06.stdout:0/196: rename d0/l43 to d0/dd/d2d/l4b 0
2026-03-10T06:22:12.223 INFO:tasks.workunit.client.1.vm06.stdout:3/188: unlink d6/dc/d13/d35/c36 0
2026-03-10T06:22:12.229 INFO:tasks.workunit.client.1.vm06.stdout:3/189: chown d6/l28 76 1
2026-03-10T06:22:12.229 INFO:tasks.workunit.client.1.vm06.stdout:3/190: write d6/d21/f2c [2390223,10803] 0
2026-03-10T06:22:12.229 INFO:tasks.workunit.client.1.vm06.stdout:2/206: creat da/d13/d1c/f42 x:0 0 0
2026-03-10T06:22:12.231 INFO:tasks.workunit.client.1.vm06.stdout:3/191: sync
2026-03-10T06:22:12.233 INFO:tasks.workunit.client.1.vm06.stdout:9/194: dwrite d21/f33 [0,4194304] 0
2026-03-10T06:22:12.243 INFO:tasks.workunit.client.1.vm06.stdout:3/192: mknod d6/d8/c46 0
2026-03-10T06:22:12.245 INFO:tasks.workunit.client.1.vm06.stdout:0/197: dwrite d0/dd/d1b/f3f [0,4194304] 0
2026-03-10T06:22:12.245 INFO:tasks.workunit.client.1.vm06.stdout:1/226: creat d9/f39 x:0 0 0
2026-03-10T06:22:12.249 INFO:tasks.workunit.client.1.vm06.stdout:2/207: dread da/d13/d1a/f3a [0,4194304] 0
2026-03-10T06:22:12.256 INFO:tasks.workunit.client.1.vm06.stdout:2/208: dwrite da/d13/d1a/f3a [4194304,4194304] 0
2026-03-10T06:22:12.257 INFO:tasks.workunit.client.1.vm06.stdout:2/209: write da/d13/d1a/d39/f2f [1689029,16396] 0
2026-03-10T06:22:12.262 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:12 vm04.local ceph-mon[51058]: [10/Mar/2026:06:22:11] ENGINE Bus STARTING
2026-03-10T06:22:12.262 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:12 vm04.local ceph-mon[51058]: [10/Mar/2026:06:22:11] ENGINE Serving on https://192.168.123.104:7150
2026-03-10T06:22:12.262 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:12 vm04.local ceph-mon[51058]: [10/Mar/2026:06:22:11] ENGINE Client ('192.168.123.104', 56984) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T06:22:12.262 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:12 vm04.local ceph-mon[51058]: [10/Mar/2026:06:22:11] ENGINE Serving on http://192.168.123.104:8765
2026-03-10T06:22:12.262 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:12 vm04.local ceph-mon[51058]: [10/Mar/2026:06:22:11] ENGINE Bus STARTED
2026-03-10T06:22:12.263 INFO:tasks.workunit.client.1.vm06.stdout:1/227: dwrite d9/d1b/d20/f24 [0,4194304] 0
2026-03-10T06:22:12.276 INFO:tasks.workunit.client.1.vm06.stdout:9/195: dread f12 [0,4194304] 0
2026-03-10T06:22:12.284 INFO:tasks.workunit.client.1.vm06.stdout:6/242: getdents d6/dd/d25/d33 0
2026-03-10T06:22:12.289 INFO:tasks.workunit.client.1.vm06.stdout:6/243: fdatasync d6/dd/f18 0
2026-03-10T06:22:12.290 INFO:tasks.workunit.client.1.vm06.stdout:5/176: creat d8/d9/d1e/f3b x:0 0 0
2026-03-10T06:22:12.296 INFO:tasks.workunit.client.1.vm06.stdout:8/178: creat d1/f3a x:0 0 0
2026-03-10T06:22:12.301 INFO:tasks.workunit.client.1.vm06.stdout:7/249: rename d19/d3b/c45 to d19/d3b/d41/d42/c4f 0
2026-03-10T06:22:12.301 INFO:tasks.workunit.client.1.vm06.stdout:7/250: chown d19/d3b/d41 3264101 1
2026-03-10T06:22:12.301 INFO:tasks.workunit.client.1.vm06.stdout:3/193: chown d6/f1c 48649843 1
2026-03-10T06:22:12.307 INFO:tasks.workunit.client.1.vm06.stdout:1/228: truncate d9/f2f 309859 0
2026-03-10T06:22:12.310 INFO:tasks.workunit.client.1.vm06.stdout:3/194: dwrite d6/d21/d38/f3d [0,4194304] 0
2026-03-10T06:22:12.313 INFO:tasks.workunit.client.1.vm06.stdout:8/179: dwrite d1/d7/f30 [0,4194304] 0
2026-03-10T06:22:12.316 INFO:tasks.workunit.client.1.vm06.stdout:1/229: truncate d9/f1f 391104 0
2026-03-10T06:22:12.317 INFO:tasks.workunit.client.1.vm06.stdout:8/180: read - d1/df/d20/d21/f36 zero size
2026-03-10T06:22:12.318 INFO:tasks.workunit.client.1.vm06.stdout:1/230: chown d9/f1f 28262 1
2026-03-10T06:22:12.318 INFO:tasks.workunit.client.1.vm06.stdout:1/231: chown d9 14888 1
2026-03-10T06:22:12.324 INFO:tasks.workunit.client.1.vm06.stdout:1/232: dread d9/f1a [0,4194304] 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:6/244: mknod d6/d7/d37/c53 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:2/210: mkdir da/d13/d1c/d43 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:8/181: stat d1/d7/c1e 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:8/182: dwrite d1/f1c [0,4194304] 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:6/245: mknod d6/dd/d25/d4e/c54 0
2026-03-10T06:22:12.335 INFO:tasks.workunit.client.1.vm06.stdout:0/198: creat d0/dd/f4c x:0 0 0
2026-03-10T06:22:12.336 INFO:tasks.workunit.client.1.vm06.stdout:2/211: truncate da/f11 1121493 0
2026-03-10T06:22:12.337 INFO:tasks.workunit.client.1.vm06.stdout:3/195: rename d6/d8/c11 to d6/dc/d13/c47 0
2026-03-10T06:22:12.340 INFO:tasks.workunit.client.1.vm06.stdout:6/246: mknod d6/dd/d25/c55 0
2026-03-10T06:22:12.343 INFO:tasks.workunit.client.1.vm06.stdout:6/247: dread - d6/dd/d35/f3c zero size
2026-03-10T06:22:12.347 INFO:tasks.workunit.client.1.vm06.stdout:7/251: getdents d19/d3b/d41 0
2026-03-10T06:22:12.352 INFO:tasks.workunit.client.1.vm06.stdout:0/199: mkdir d0/dd/d2d/d47/d4d 0
2026-03-10T06:22:12.357 INFO:tasks.workunit.client.1.vm06.stdout:2/212: mkdir da/d13/d1c/d1d/d44 0
2026-03-10T06:22:12.359 INFO:tasks.workunit.client.1.vm06.stdout:7/252: fsync d19/f33 0
2026-03-10T06:22:12.366 INFO:tasks.workunit.client.1.vm06.stdout:2/213: creat da/d13/d1c/d1d/d44/f45 x:0 0 0
2026-03-10T06:22:12.369 INFO:tasks.workunit.client.1.vm06.stdout:7/253: symlink d19/d3b/l50 0
2026-03-10T06:22:12.371 INFO:tasks.workunit.client.1.vm06.stdout:4/184: dwrite dd/fe [0,4194304] 0
2026-03-10T06:22:12.371 INFO:tasks.workunit.client.1.vm06.stdout:6/248: sync
2026-03-10T06:22:12.372 INFO:tasks.workunit.client.1.vm06.stdout:6/249: write d6/dd/d35/f3c [62383,78336] 0
2026-03-10T06:22:12.383 INFO:tasks.workunit.client.1.vm06.stdout:6/250: symlink d6/dd/l56 0
2026-03-10T06:22:12.385 INFO:tasks.workunit.client.1.vm06.stdout:0/200: getdents d0/d3c/d42 0
2026-03-10T06:22:12.385 INFO:tasks.workunit.client.1.vm06.stdout:7/254: read fa [3416124,46049] 0
2026-03-10T06:22:12.387 INFO:tasks.workunit.client.1.vm06.stdout:6/251: symlink d6/dd/d25/d33/l57 0
2026-03-10T06:22:12.388 INFO:tasks.workunit.client.1.vm06.stdout:6/252: chown d6/d7 109 1
2026-03-10T06:22:12.389 INFO:tasks.workunit.client.1.vm06.stdout:6/253: write d6/fc [4225212,129822] 0
2026-03-10T06:22:12.390 INFO:tasks.workunit.client.1.vm06.stdout:0/201: creat d0/dd/d1b/f4e x:0 0 0
2026-03-10T06:22:12.401 INFO:tasks.workunit.client.1.vm06.stdout:6/254: mknod d6/dd/d2b/c58 0
2026-03-10T06:22:12.402 INFO:tasks.workunit.client.1.vm06.stdout:7/255: symlink d19/d3b/d41/d4c/l51 0
2026-03-10T06:22:12.405 INFO:tasks.workunit.client.1.vm06.stdout:4/185: dread dd/f11 [0,4194304] 0
2026-03-10T06:22:12.406 INFO:tasks.workunit.client.1.vm06.stdout:4/186: fsync dd/f14 0
2026-03-10T06:22:12.406 INFO:tasks.workunit.client.1.vm06.stdout:1/233: dread d9/f17 [0,4194304] 0
2026-03-10T06:22:12.411 INFO:tasks.workunit.client.1.vm06.stdout:1/234: write d9/f39 [855385,67025] 0
2026-03-10T06:22:12.417 INFO:tasks.workunit.client.1.vm06.stdout:6/255: dread d6/dd/d35/f2d [4194304,4194304] 0
2026-03-10T06:22:12.419 INFO:tasks.workunit.client.1.vm06.stdout:6/256: readlink d6/d7/l2e 0
2026-03-10T06:22:12.420 INFO:tasks.workunit.client.1.vm06.stdout:9/196: write d21/f22 [1724956,97092] 0
2026-03-10T06:22:12.423 INFO:tasks.workunit.client.1.vm06.stdout:1/235: dwrite d9/d1b/d20/f30 [0,4194304] 0
2026-03-10T06:22:12.438 INFO:tasks.workunit.client.1.vm06.stdout:7/256: dread d19/f35 [0,4194304] 0
2026-03-10T06:22:12.449 INFO:tasks.workunit.client.1.vm06.stdout:9/197: mknod d21/c44 0
2026-03-10T06:22:12.450 INFO:tasks.workunit.client.1.vm06.stdout:5/177: readlink d8/d9/d1e/l1c 0
2026-03-10T06:22:12.451 INFO:tasks.workunit.client.1.vm06.stdout:7/257: fsync d19/f35 0
2026-03-10T06:22:12.457 INFO:tasks.workunit.client.1.vm06.stdout:9/198: mknod d21/d24/c45 0
2026-03-10T06:22:12.457 INFO:tasks.workunit.client.1.vm06.stdout:6/257: creat d6/d7/d37/d43/f59 x:0 0 0
2026-03-10T06:22:12.458 INFO:tasks.workunit.client.1.vm06.stdout:1/236: symlink d9/l3a 0
2026-03-10T06:22:12.459 INFO:tasks.workunit.client.1.vm06.stdout:7/258: mkdir d19/d3b/d41/d42/d52 0
2026-03-10T06:22:12.460 INFO:tasks.workunit.client.1.vm06.stdout:5/178: mknod d8/d9/c3c 0
2026-03-10T06:22:12.461 INFO:tasks.workunit.client.1.vm06.stdout:9/199: dread d21/d27/f31 [0,4194304] 0
2026-03-10T06:22:12.463 INFO:tasks.workunit.client.1.vm06.stdout:1/237: symlink d9/d1b/d20/l3b 0
2026-03-10T06:22:12.464 INFO:tasks.workunit.client.1.vm06.stdout:9/200: write d21/d24/f2f [168759,60332] 0
2026-03-10T06:22:12.465 INFO:tasks.workunit.client.1.vm06.stdout:7/259: creat d19/d3b/f53 x:0 0 0
2026-03-10T06:22:12.467 INFO:tasks.workunit.client.1.vm06.stdout:7/260: fdatasync d19/f33 0
2026-03-10T06:22:12.467 INFO:tasks.workunit.client.1.vm06.stdout:4/187: getdents dd/d24 0
2026-03-10T06:22:12.471 INFO:tasks.workunit.client.1.vm06.stdout:1/238: dread d9/d1b/d20/f24 [0,4194304] 0
2026-03-10T06:22:12.474 INFO:tasks.workunit.client.1.vm06.stdout:9/201: dread d21/d32/f37 [0,4194304] 0
2026-03-10T06:22:12.477 INFO:tasks.workunit.client.1.vm06.stdout:8/183: write d1/df/d20/f19 [4405367,10891] 0
2026-03-10T06:22:12.481 INFO:tasks.workunit.client.1.vm06.stdout:8/184: write d1/df/d20/f19 [3513596,6759] 0
2026-03-10T06:22:12.487 INFO:tasks.workunit.client.1.vm06.stdout:3/196: dwrite d6/f1c [0,4194304] 0
2026-03-10T06:22:12.492 INFO:tasks.workunit.client.1.vm06.stdout:5/179: creat d8/d20/d22/d39/f3d x:0 0 0
2026-03-10T06:22:12.493 INFO:tasks.workunit.client.1.vm06.stdout:8/185: dwrite d1/d7/f30 [0,4194304] 0
2026-03-10T06:22:12.498 INFO:tasks.workunit.client.1.vm06.stdout:0/202: dwrite d0/f46 [0,4194304] 0
2026-03-10T06:22:12.504 INFO:tasks.workunit.client.1.vm06.stdout:9/202: mkdir d21/d46 0
2026-03-10T06:22:12.504 INFO:tasks.workunit.client.1.vm06.stdout:2/214: dread da/f11 [0,4194304] 0
2026-03-10T06:22:12.507 INFO:tasks.workunit.client.1.vm06.stdout:1/239: dwrite d9/f1a [4194304,4194304] 0
2026-03-10T06:22:12.508 INFO:tasks.workunit.client.1.vm06.stdout:1/240: write d9/f39 [1705550,85531] 0
2026-03-10T06:22:12.510 INFO:tasks.workunit.client.1.vm06.stdout:8/186: read d1/f4 [3675708,21059] 0
2026-03-10T06:22:12.515 INFO:tasks.workunit.client.1.vm06.stdout:1/241: dwrite d9/d1b/d20/f30 [0,4194304] 0
2026-03-10T06:22:12.550 INFO:tasks.workunit.client.1.vm06.stdout:7/261: unlink d19/f4d 0
2026-03-10T06:22:12.551 INFO:tasks.workunit.client.1.vm06.stdout:6/258: getdents d6/dd/d25/d4e 0
2026-03-10T06:22:12.551 INFO:tasks.workunit.client.1.vm06.stdout:7/262: stat d19/d3b/f53 0
2026-03-10T06:22:12.556 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:12 vm06.local ceph-mon[58974]: [10/Mar/2026:06:22:11] ENGINE Bus STARTING
2026-03-10T06:22:12.556 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:12 vm06.local ceph-mon[58974]: [10/Mar/2026:06:22:11] ENGINE Serving on https://192.168.123.104:7150
2026-03-10T06:22:12.556 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:12 vm06.local ceph-mon[58974]: [10/Mar/2026:06:22:11] ENGINE Client ('192.168.123.104', 56984) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T06:22:12.556 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:12 vm06.local ceph-mon[58974]: [10/Mar/2026:06:22:11] ENGINE Serving on http://192.168.123.104:8765
2026-03-10T06:22:12.556 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:12 vm06.local ceph-mon[58974]: [10/Mar/2026:06:22:11] ENGINE Bus STARTED
2026-03-10T06:22:12.556 INFO:tasks.workunit.client.1.vm06.stdout:0/203: symlink d0/dd/d14/l4f 0
2026-03-10T06:22:12.557 INFO:tasks.workunit.client.1.vm06.stdout:0/204: stat d0/dd/d14/d18/c38 0
2026-03-10T06:22:12.557 INFO:tasks.workunit.client.1.vm06.stdout:0/205: write d0/dd/d1b/f4e [89072,128493] 0
2026-03-10T06:22:12.560 INFO:tasks.workunit.client.1.vm06.stdout:0/206: dwrite d0/dd/d1c/f3e [0,4194304] 0
2026-03-10T06:22:12.564 INFO:tasks.workunit.client.1.vm06.stdout:9/203: stat l1a 0
2026-03-10T06:22:12.570 INFO:tasks.workunit.client.1.vm06.stdout:2/215: write da/d13/d1c/f2d [673150,28421] 0
2026-03-10T06:22:12.570 INFO:tasks.workunit.client.1.vm06.stdout:9/204: chown d21/d27/l29 78671 1
2026-03-10T06:22:12.570 INFO:tasks.workunit.client.1.vm06.stdout:8/187: write d1/f13 [47247,126973] 0
2026-03-10T06:22:12.570 INFO:tasks.workunit.client.1.vm06.stdout:8/188: dread d1/f1b [0,4194304] 0
2026-03-10T06:22:12.570 INFO:tasks.workunit.client.1.vm06.stdout:6/259: mkdir d6/dd/d25/d33/d5a 0
2026-03-10T06:22:12.571 INFO:tasks.workunit.client.1.vm06.stdout:6/260: chown d6/df/f1e 721617552 1
2026-03-10T06:22:12.571 INFO:tasks.workunit.client.1.vm06.stdout:6/261: stat d6/dd/l14 0
2026-03-10T06:22:12.576 INFO:tasks.workunit.client.1.vm06.stdout:6/262: dwrite d6/d7/d37/d43/f59 [0,4194304] 0
2026-03-10T06:22:12.577 INFO:tasks.workunit.client.1.vm06.stdout:8/189: dread f0 [0,4194304] 0
2026-03-10T06:22:12.579 INFO:tasks.workunit.client.1.vm06.stdout:8/190: chown d1/f13 10 1
2026-03-10T06:22:12.583 INFO:tasks.workunit.client.1.vm06.stdout:0/207: rmdir d0 39
2026-03-10T06:22:12.583 INFO:tasks.workunit.client.1.vm06.stdout:2/216: fsync f5 0
2026-03-10T06:22:12.592 INFO:tasks.workunit.client.1.vm06.stdout:5/180: creat d8/d9/d1e/f3e x:0 0 0
2026-03-10T06:22:12.594 INFO:tasks.workunit.client.1.vm06.stdout:6/263: creat d6/dd/f5b x:0 0 0
2026-03-10T06:22:12.598 INFO:tasks.workunit.client.1.vm06.stdout:2/217: rmdir da/d13/d1c 39
2026-03-10T06:22:12.624 INFO:tasks.workunit.client.1.vm06.stdout:7/263: link d19/f2f d19/d3b/d41/f54 0
2026-03-10T06:22:12.625 INFO:tasks.workunit.client.1.vm06.stdout:7/264: write d19/d3b/d41/f48 [1240359,108567] 0
2026-03-10T06:22:12.626 INFO:tasks.workunit.client.1.vm06.stdout:6/264: creat d6/dd/d25/f5c x:0 0 0
2026-03-10T06:22:12.630 INFO:tasks.workunit.client.1.vm06.stdout:8/191: mkdir d1/d3b 0
2026-03-10T06:22:12.631 INFO:tasks.workunit.client.1.vm06.stdout:8/192: fsync d1/df/d11/f12 0
2026-03-10T06:22:12.635 INFO:tasks.workunit.client.1.vm06.stdout:3/197: dread d6/d8/f2d [0,4194304] 0
2026-03-10T06:22:12.641 INFO:tasks.workunit.client.1.vm06.stdout:1/242: getdents d9/d1b 0
2026-03-10T06:22:12.641 INFO:tasks.workunit.client.1.vm06.stdout:4/188: dread fa [0,4194304] 0
2026-03-10T06:22:12.641 INFO:tasks.workunit.client.1.vm06.stdout:6/265: creat d6/dd/d25/d33/f5d x:0 0 0
2026-03-10T06:22:12.641 INFO:tasks.workunit.client.1.vm06.stdout:5/181: sync
2026-03-10T06:22:12.644 INFO:tasks.workunit.client.1.vm06.stdout:2/218: rename da/d13/d1c/d1d/d3e to da/d13/d1c/d1d/d44/d46 0
2026-03-10T06:22:12.647 INFO:tasks.workunit.client.1.vm06.stdout:2/219: chown da/c3d 15302 1
2026-03-10T06:22:12.648 INFO:tasks.workunit.client.1.vm06.stdout:6/266: dread d6/d7/d37/d43/f59 [0,4194304] 0
2026-03-10T06:22:12.651 INFO:tasks.workunit.client.1.vm06.stdout:8/193: dwrite d1/f1b [0,4194304] 0
2026-03-10T06:22:12.651 INFO:tasks.workunit.client.1.vm06.stdout:8/194: write d1/f5 [1456212,107116] 0
2026-03-10T06:22:12.652 INFO:tasks.workunit.client.1.vm06.stdout:8/195: readlink d1/d7/l1a 0
2026-03-10T06:22:12.655 INFO:tasks.workunit.client.1.vm06.stdout:3/198: creat d6/d8/f48 x:0 0 0
2026-03-10T06:22:12.659 INFO:tasks.workunit.client.1.vm06.stdout:3/199: truncate d6/d8/f48 285322 0
2026-03-10T06:22:12.660 INFO:tasks.workunit.client.1.vm06.stdout:1/243: mknod d9/d35/c3c 0
2026-03-10T06:22:12.663 INFO:tasks.workunit.client.1.vm06.stdout:1/244: dwrite d9/d1b/d20/f25 [0,4194304] 0
2026-03-10T06:22:12.664 INFO:tasks.workunit.client.1.vm06.stdout:4/189: creat dd/d33/f37 x:0 0 0
2026-03-10T06:22:12.669 INFO:tasks.workunit.client.1.vm06.stdout:5/182: write d8/d20/d22/f31 [1140890,26513] 0
2026-03-10T06:22:12.673 INFO:tasks.workunit.client.1.vm06.stdout:8/196: dwrite d1/f13 [0,4194304] 0
2026-03-10T06:22:12.677 INFO:tasks.workunit.client.1.vm06.stdout:5/183: dwrite d8/d9/d1e/f3e [0,4194304] 0
2026-03-10T06:22:12.678 INFO:tasks.workunit.client.1.vm06.stdout:5/184: truncate d8/d20/d22/d39/f3d 944570 0
2026-03-10T06:22:12.683 INFO:tasks.workunit.client.1.vm06.stdout:6/267: dread d6/d7/f16 [0,4194304] 0
2026-03-10T06:22:12.683 INFO:tasks.workunit.client.1.vm06.stdout:6/268: dread - d6/dd/d25/d33/f5d zero size
2026-03-10T06:22:12.687 INFO:tasks.workunit.client.1.vm06.stdout:9/205: link l18 d21/d27/l47 0
2026-03-10T06:22:12.688 INFO:tasks.workunit.client.1.vm06.stdout:3/200: creat d6/d8/f49 x:0 0 0
2026-03-10T06:22:12.689 INFO:tasks.workunit.client.1.vm06.stdout:7/265: rmdir d19/d3b/d41/d42/d4b 0
2026-03-10T06:22:12.694 INFO:tasks.workunit.client.1.vm06.stdout:4/190: mknod dd/d24/c38 0
2026-03-10T06:22:12.695 INFO:tasks.workunit.client.1.vm06.stdout:6/269: mkdir d6/d7/d5e 0
2026-03-10T06:22:12.695 INFO:tasks.workunit.client.1.vm06.stdout:2/220: rename da/c20 to da/d13/d1c/d43/c47 0
2026-03-10T06:22:12.696 INFO:tasks.workunit.client.1.vm06.stdout:3/201: truncate d6/d1a/f1f 219454 0
2026-03-10T06:22:12.697 INFO:tasks.workunit.client.1.vm06.stdout:3/202: write d6/f29 [908460,52366] 0
2026-03-10T06:22:12.702 INFO:tasks.workunit.client.1.vm06.stdout:3/203: read d6/dc/d13/f1e [7795135,19378] 0
2026-03-10T06:22:12.704 INFO:tasks.workunit.client.1.vm06.stdout:2/221: truncate da/fe 6136337 0
2026-03-10T06:22:12.704 INFO:tasks.workunit.client.1.vm06.stdout:9/206: mknod d21/d27/c48 0
2026-03-10T06:22:12.705 INFO:tasks.workunit.client.1.vm06.stdout:9/207: chown fb 21002950 1
2026-03-10T06:22:12.705 INFO:tasks.workunit.client.1.vm06.stdout:7/266: rename d19/f27 to d19/d3b/d41/d4c/f55 0
2026-03-10T06:22:12.709 INFO:tasks.workunit.client.1.vm06.stdout:2/222: mkdir da/d13/d1c/d1d/d44/d48 0
2026-03-10T06:22:12.713 INFO:tasks.workunit.client.1.vm06.stdout:3/204: link d6/dc/f1d d6/d1a/f4a 0
2026-03-10T06:22:12.715 INFO:tasks.workunit.client.1.vm06.stdout:2/223: sync
2026-03-10T06:22:12.715 INFO:tasks.workunit.client.1.vm06.stdout:2/224: truncate f7 964601 0
2026-03-10T06:22:12.717 INFO:tasks.workunit.client.1.vm06.stdout:3/205: dread - d6/dc/f3f zero size
2026-03-10T06:22:12.719 INFO:tasks.workunit.client.1.vm06.stdout:9/208: link d21/d24/f2e d21/f49 0
2026-03-10T06:22:12.719 INFO:tasks.workunit.client.1.vm06.stdout:7/267: getdents d19/d3b/d41/d42/d52 0
2026-03-10T06:22:12.726 INFO:tasks.workunit.client.1.vm06.stdout:3/206: unlink d6/dc/d13/c20 0
2026-03-10T06:22:12.733 INFO:tasks.workunit.client.1.vm06.stdout:9/209: dwrite f1b [0,4194304] 0
2026-03-10T06:22:12.738 INFO:tasks.workunit.client.1.vm06.stdout:2/225: dwrite da/d13/d1a/d39/f2f [0,4194304] 0
2026-03-10T06:22:12.740 INFO:tasks.workunit.client.1.vm06.stdout:0/208: truncate d0/dd/d1b/f3f 1470010 0
2026-03-10T06:22:12.741 INFO:tasks.workunit.client.1.vm06.stdout:2/226: chown da/d13/d1c/d1d/f26 61196750 1
2026-03-10T06:22:12.743 INFO:tasks.workunit.client.1.vm06.stdout:9/210: sync
2026-03-10T06:22:12.743 INFO:tasks.workunit.client.1.vm06.stdout:2/227: write f8 [3236712,121860] 0
2026-03-10T06:22:12.744 INFO:tasks.workunit.client.1.vm06.stdout:6/270: rmdir d6/dd/d25/d33 39
2026-03-10T06:22:12.747 INFO:tasks.workunit.client.1.vm06.stdout:2/228: write f8 [1599523,65543] 0
2026-03-10T06:22:12.747 INFO:tasks.workunit.client.1.vm06.stdout:6/271: chown d6/l9 1184818 1
2026-03-10T06:22:12.752 INFO:tasks.workunit.client.1.vm06.stdout:0/209: mkdir d0/dd/d1b/d3d/d50 0
2026-03-10T06:22:12.754 INFO:tasks.workunit.client.1.vm06.stdout:3/207: chown d6/dc/d13/c47 2361 1
2026-03-10T06:22:12.755 INFO:tasks.workunit.client.1.vm06.stdout:3/208: chown d6/dc 5208016 1
2026-03-10T06:22:12.758 INFO:tasks.workunit.client.1.vm06.stdout:0/210: rename d0/dd/d1c/f3e to d0/dd/d2d/d35/f51 0
2026-03-10T06:22:12.764 INFO:tasks.workunit.client.1.vm06.stdout:3/209: write d6/dc/f1d [1019494,30064] 0
2026-03-10T06:22:12.765 INFO:tasks.workunit.client.1.vm06.stdout:6/272: rename d6/dd/d25/d33/f52 to d6/dd/d25/d4e/f5f 0
2026-03-10T06:22:12.768 INFO:tasks.workunit.client.1.vm06.stdout:0/211: rename d0/c34 to d0/dd/c52 0
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/212: chown d0/dd/d14 0 1
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/213: rmdir d0/dd/d14/d18 39
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/214: creat d0/dd/d14/d1d/f53 x:0 0 0
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/215: truncate d0/dd/d14/d1d/f53 167143 0
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/216: read d0/dd/d2d/d35/f51 [1036099,14230] 0
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/217: dwrite d0/dd/f24 [4194304,4194304] 0
2026-03-10T06:22:12.791 INFO:tasks.workunit.client.1.vm06.stdout:0/218: dwrite d0/d3c/d42/f12 [0,4194304] 0
2026-03-10T06:22:12.792 INFO:tasks.workunit.client.1.vm06.stdout:0/219: chown d0/dd/d1b 456022472 1
2026-03-10T06:22:12.798 INFO:tasks.workunit.client.1.vm06.stdout:0/220: creat d0/d3c/d42/f54 x:0 0 0
2026-03-10T06:22:12.799 INFO:tasks.workunit.client.1.vm06.stdout:0/221: write d0/ff [1866871,26710] 0
2026-03-10T06:22:12.801 INFO:tasks.workunit.client.1.vm06.stdout:1/245: write d9/df/f2a [5178337,62758] 0
2026-03-10T06:22:12.809 INFO:tasks.workunit.client.1.vm06.stdout:1/246: creat d9/df/f3d x:0 0 0
2026-03-10T06:22:12.818 INFO:tasks.workunit.client.1.vm06.stdout:1/247: stat d9/f2f 0
2026-03-10T06:22:12.818 INFO:tasks.workunit.client.1.vm06.stdout:1/248: creat d9/d35/d36/f3e x:0 0 0
2026-03-10T06:22:12.819 INFO:tasks.workunit.client.1.vm06.stdout:1/249: dread d9/d1b/f1d [0,4194304] 0
2026-03-10T06:22:12.821
INFO:tasks.workunit.client.1.vm06.stdout:1/250: read d9/fe [2085373,41164] 0 2026-03-10T06:22:12.822 INFO:tasks.workunit.client.1.vm06.stdout:1/251: dread d9/d1b/f1d [0,4194304] 0 2026-03-10T06:22:12.829 INFO:tasks.workunit.client.1.vm06.stdout:1/252: unlink d9/f11 0 2026-03-10T06:22:12.851 INFO:tasks.workunit.client.1.vm06.stdout:1/253: symlink d9/l3f 0 2026-03-10T06:22:12.851 INFO:tasks.workunit.client.1.vm06.stdout:1/254: rename d9/fe to d9/d1b/f40 0 2026-03-10T06:22:12.851 INFO:tasks.workunit.client.1.vm06.stdout:1/255: dwrite d9/f1f [0,4194304] 0 2026-03-10T06:22:12.933 INFO:tasks.workunit.client.1.vm06.stdout:0/222: sync 2026-03-10T06:22:12.933 INFO:tasks.workunit.client.1.vm06.stdout:3/210: sync 2026-03-10T06:22:12.934 INFO:tasks.workunit.client.1.vm06.stdout:2/229: sync 2026-03-10T06:22:12.938 INFO:tasks.workunit.client.1.vm06.stdout:2/230: chown da/l30 4474 1 2026-03-10T06:22:12.939 INFO:tasks.workunit.client.1.vm06.stdout:2/231: chown da/d13/d1c/d1d/f26 79 1 2026-03-10T06:22:12.939 INFO:tasks.workunit.client.1.vm06.stdout:3/211: mknod d6/c4b 0 2026-03-10T06:22:12.940 INFO:tasks.workunit.client.1.vm06.stdout:4/191: rmdir dd 39 2026-03-10T06:22:12.940 INFO:tasks.workunit.client.1.vm06.stdout:6/273: truncate d6/d7/f16 3861428 0 2026-03-10T06:22:12.941 INFO:tasks.workunit.client.1.vm06.stdout:3/212: truncate d6/f25 278268 0 2026-03-10T06:22:12.944 INFO:tasks.workunit.client.1.vm06.stdout:8/197: truncate d1/f2 952054 0 2026-03-10T06:22:12.944 INFO:tasks.workunit.client.1.vm06.stdout:8/198: fsync d1/df/d20/d21/f36 0 2026-03-10T06:22:12.946 INFO:tasks.workunit.client.1.vm06.stdout:0/223: creat d0/dd/d1b/d3d/d50/f55 x:0 0 0 2026-03-10T06:22:12.952 INFO:tasks.workunit.client.1.vm06.stdout:2/232: mkdir da/d49 0 2026-03-10T06:22:12.953 INFO:tasks.workunit.client.1.vm06.stdout:5/185: dwrite d8/d9/d1e/f23 [0,4194304] 0 2026-03-10T06:22:12.960 INFO:tasks.workunit.client.1.vm06.stdout:0/224: dwrite d0/f46 [4194304,4194304] 0 2026-03-10T06:22:12.961 
INFO:tasks.workunit.client.1.vm06.stdout:0/225: chown d0/f9 4111357 1 2026-03-10T06:22:12.969 INFO:tasks.workunit.client.1.vm06.stdout:0/226: symlink d0/d3c/d42/l56 0 2026-03-10T06:22:12.970 INFO:tasks.workunit.client.1.vm06.stdout:8/199: creat d1/f3c x:0 0 0 2026-03-10T06:22:12.976 INFO:tasks.workunit.client.1.vm06.stdout:2/233: rmdir da/d49 0 2026-03-10T06:22:12.977 INFO:tasks.workunit.client.1.vm06.stdout:7/268: write d19/f1d [1044237,104564] 0 2026-03-10T06:22:12.978 INFO:tasks.workunit.client.1.vm06.stdout:5/186: creat d8/f3f x:0 0 0 2026-03-10T06:22:12.981 INFO:tasks.workunit.client.1.vm06.stdout:0/227: creat d0/dd/d2d/d47/d4d/f57 x:0 0 0 2026-03-10T06:22:12.986 INFO:tasks.workunit.client.1.vm06.stdout:8/200: mknod d1/c3d 0 2026-03-10T06:22:12.986 INFO:tasks.workunit.client.1.vm06.stdout:8/201: fsync d1/f1c 0 2026-03-10T06:22:12.986 INFO:tasks.workunit.client.1.vm06.stdout:7/269: fsync f13 0 2026-03-10T06:22:12.989 INFO:tasks.workunit.client.1.vm06.stdout:5/187: dwrite d8/ff [0,4194304] 0 2026-03-10T06:22:12.989 INFO:tasks.workunit.client.1.vm06.stdout:0/228: mknod d0/dd/d2d/d47/d4d/c58 0 2026-03-10T06:22:12.993 INFO:tasks.workunit.client.1.vm06.stdout:7/270: write d19/f3f [288787,28753] 0 2026-03-10T06:22:13.000 INFO:tasks.workunit.client.1.vm06.stdout:3/213: dread d6/d21/f31 [4194304,4194304] 0 2026-03-10T06:22:13.004 INFO:tasks.workunit.client.1.vm06.stdout:7/271: dread d19/d3b/d41/f49 [0,4194304] 0 2026-03-10T06:22:13.014 INFO:tasks.workunit.client.1.vm06.stdout:8/202: dwrite d1/f3c [0,4194304] 0 2026-03-10T06:22:13.014 INFO:tasks.workunit.client.1.vm06.stdout:2/234: link da/d13/d1c/f42 da/d13/d1a/d39/d35/f4a 0 2026-03-10T06:22:13.014 INFO:tasks.workunit.client.1.vm06.stdout:9/211: dwrite f1b [4194304,4194304] 0 2026-03-10T06:22:13.014 INFO:tasks.workunit.client.1.vm06.stdout:3/214: dwrite d6/d8/f22 [0,4194304] 0 2026-03-10T06:22:13.015 INFO:tasks.workunit.client.1.vm06.stdout:3/215: truncate d6/f29 1560560 0 2026-03-10T06:22:13.019 
INFO:tasks.workunit.client.1.vm06.stdout:0/229: dread - d0/dd/d14/d18/f2c zero size 2026-03-10T06:22:13.030 INFO:tasks.workunit.client.1.vm06.stdout:1/256: getdents d9 0 2026-03-10T06:22:13.031 INFO:tasks.workunit.client.1.vm06.stdout:5/188: dwrite d8/d9/d1e/f17 [0,4194304] 0 2026-03-10T06:22:13.035 INFO:tasks.workunit.client.1.vm06.stdout:2/235: dread - da/ff zero size 2026-03-10T06:22:13.037 INFO:tasks.workunit.client.1.vm06.stdout:5/189: chown d8/d9/l3a 0 1 2026-03-10T06:22:13.051 INFO:tasks.workunit.client.1.vm06.stdout:9/212: rmdir d21/d27 39 2026-03-10T06:22:13.066 INFO:tasks.workunit.client.1.vm06.stdout:6/274: truncate d6/fc 979194 0 2026-03-10T06:22:13.071 INFO:tasks.workunit.client.1.vm06.stdout:4/192: dwrite dd/f12 [0,4194304] 0 2026-03-10T06:22:13.072 INFO:tasks.workunit.client.1.vm06.stdout:0/230: symlink d0/dd/d2d/d47/d4d/l59 0 2026-03-10T06:22:13.075 INFO:tasks.workunit.client.1.vm06.stdout:5/190: creat d8/db/f40 x:0 0 0 2026-03-10T06:22:13.075 INFO:tasks.workunit.client.1.vm06.stdout:1/257: write d9/f17 [2973518,14445] 0 2026-03-10T06:22:13.087 INFO:tasks.workunit.client.1.vm06.stdout:8/203: dread d1/fa [0,4194304] 0 2026-03-10T06:22:13.088 INFO:tasks.workunit.client.1.vm06.stdout:6/275: unlink d6/f1b 0 2026-03-10T06:22:13.088 INFO:tasks.workunit.client.1.vm06.stdout:8/204: write d1/df/d20/d21/f38 [277355,11542] 0 2026-03-10T06:22:13.089 INFO:tasks.workunit.client.1.vm06.stdout:5/191: sync 2026-03-10T06:22:13.095 INFO:tasks.workunit.client.1.vm06.stdout:6/276: unlink d6/df/f42 0 2026-03-10T06:22:13.098 INFO:tasks.workunit.client.1.vm06.stdout:6/277: truncate d6/dd/d25/d2c/f4c 556263 0 2026-03-10T06:22:13.113 INFO:tasks.workunit.client.1.vm06.stdout:8/205: mknod d1/df/d11/c3e 0 2026-03-10T06:22:13.113 INFO:tasks.workunit.client.1.vm06.stdout:4/193: unlink c5 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:3/216: dread d6/d21/f26 [0,4194304] 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:0/231: truncate d0/dd/f49 
573874 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:9/213: mknod d21/d27/c4a 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:4/194: fsync fa 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:3/217: write d6/d8/f2d [3630917,622] 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:8/206: symlink d1/df/d20/d35/l3f 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:0/232: creat d0/dd/d1c/f5a x:0 0 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:1/258: creat d9/df/f41 x:0 0 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:1/259: chown d9 1892 1 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:9/214: dread d21/f33 [0,4194304] 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:3/218: write d6/f25 [1319906,89021] 0 2026-03-10T06:22:13.114 INFO:tasks.workunit.client.1.vm06.stdout:1/260: chown d9/l3a 591285 1 2026-03-10T06:22:13.116 INFO:tasks.workunit.client.1.vm06.stdout:4/195: readlink dd/l2a 0 2026-03-10T06:22:13.117 INFO:tasks.workunit.client.1.vm06.stdout:4/196: fsync dd/f14 0 2026-03-10T06:22:13.121 INFO:tasks.workunit.client.1.vm06.stdout:8/207: dwrite d1/d2c/f32 [0,4194304] 0 2026-03-10T06:22:13.121 INFO:tasks.workunit.client.1.vm06.stdout:8/208: fdatasync d1/f1c 0 2026-03-10T06:22:13.139 INFO:tasks.workunit.client.1.vm06.stdout:5/192: getdents d8/d20/d22 0 2026-03-10T06:22:13.139 INFO:tasks.workunit.client.1.vm06.stdout:5/193: chown d8/ld 922200 1 2026-03-10T06:22:13.150 INFO:tasks.workunit.client.1.vm06.stdout:1/261: rename d9/d35/d36/f3e to d9/d1b/d20/f42 0 2026-03-10T06:22:13.151 INFO:tasks.workunit.client.1.vm06.stdout:1/262: fsync d9/f34 0 2026-03-10T06:22:13.154 INFO:tasks.workunit.client.1.vm06.stdout:1/263: dread d9/f1a [0,4194304] 0 2026-03-10T06:22:13.155 INFO:tasks.workunit.client.1.vm06.stdout:1/264: write d9/f39 [1311118,116144] 0 2026-03-10T06:22:13.165 
INFO:tasks.workunit.client.1.vm06.stdout:0/233: creat d0/dd/f5b x:0 0 0 2026-03-10T06:22:13.166 INFO:tasks.workunit.client.1.vm06.stdout:5/194: creat d8/d20/d22/d39/f41 x:0 0 0 2026-03-10T06:22:13.166 INFO:tasks.workunit.client.1.vm06.stdout:0/234: write d0/dd/d14/d18/f2c [221752,88293] 0 2026-03-10T06:22:13.171 INFO:tasks.workunit.client.1.vm06.stdout:9/215: creat d21/d27/f4b x:0 0 0 2026-03-10T06:22:13.171 INFO:tasks.workunit.client.1.vm06.stdout:8/209: mknod d1/d3b/c40 0 2026-03-10T06:22:13.176 INFO:tasks.workunit.client.1.vm06.stdout:0/235: dread - d0/dd/f48 zero size 2026-03-10T06:22:13.176 INFO:tasks.workunit.client.1.vm06.stdout:3/219: truncate d6/d21/f31 2453465 0 2026-03-10T06:22:13.188 INFO:tasks.workunit.client.1.vm06.stdout:8/210: symlink d1/df/d20/l41 0 2026-03-10T06:22:13.191 INFO:tasks.workunit.client.1.vm06.stdout:5/195: dread d8/d9/d1e/f3e [0,4194304] 0 2026-03-10T06:22:13.193 INFO:tasks.workunit.client.1.vm06.stdout:4/197: getdents dd 0 2026-03-10T06:22:13.195 INFO:tasks.workunit.client.1.vm06.stdout:0/236: dwrite d0/dd/d14/d18/f2c [0,4194304] 0 2026-03-10T06:22:13.202 INFO:tasks.workunit.client.1.vm06.stdout:0/237: fdatasync d0/dd/d2d/d35/f3a 0 2026-03-10T06:22:13.209 INFO:tasks.workunit.client.1.vm06.stdout:7/272: write d19/d3b/d41/f54 [1328956,66156] 0 2026-03-10T06:22:13.211 INFO:tasks.workunit.client.1.vm06.stdout:9/216: symlink d21/d27/d3a/l4c 0 2026-03-10T06:22:13.214 INFO:tasks.workunit.client.1.vm06.stdout:9/217: write d21/f34 [668085,24256] 0 2026-03-10T06:22:13.214 INFO:tasks.workunit.client.1.vm06.stdout:8/211: creat d1/df/d20/d35/f42 x:0 0 0 2026-03-10T06:22:13.219 INFO:tasks.workunit.client.1.vm06.stdout:4/198: mkdir dd/d24/d2d/d2f/d39 0 2026-03-10T06:22:13.220 INFO:tasks.workunit.client.1.vm06.stdout:4/199: chown dd/d18/l25 950 1 2026-03-10T06:22:13.220 INFO:tasks.workunit.client.1.vm06.stdout:4/200: write dd/f12 [433799,93777] 0 2026-03-10T06:22:13.222 INFO:tasks.workunit.client.1.vm06.stdout:9/218: dwrite f14 [0,4194304] 0 
2026-03-10T06:22:13.226 INFO:tasks.workunit.client.1.vm06.stdout:0/238: creat d0/d3c/d42/f5c x:0 0 0 2026-03-10T06:22:13.233 INFO:tasks.workunit.client.1.vm06.stdout:7/273: fdatasync d19/f24 0 2026-03-10T06:22:13.234 INFO:tasks.workunit.client.1.vm06.stdout:7/274: fsync d19/d3b/f3c 0 2026-03-10T06:22:13.235 INFO:tasks.workunit.client.1.vm06.stdout:7/275: write d19/d3b/d41/f48 [2268737,118640] 0 2026-03-10T06:22:13.237 INFO:tasks.workunit.client.1.vm06.stdout:8/212: creat d1/df/d20/f43 x:0 0 0 2026-03-10T06:22:13.248 INFO:tasks.workunit.client.1.vm06.stdout:8/213: dread d1/f3c [0,4194304] 0 2026-03-10T06:22:13.249 INFO:tasks.workunit.client.1.vm06.stdout:4/201: mknod dd/d24/d2d/c3a 0 2026-03-10T06:22:13.249 INFO:tasks.workunit.client.1.vm06.stdout:4/202: write dd/d18/f26 [455257,60156] 0 2026-03-10T06:22:13.254 INFO:tasks.workunit.client.1.vm06.stdout:0/239: mkdir d0/dd/d14/d1d/d5d 0 2026-03-10T06:22:13.255 INFO:tasks.workunit.client.1.vm06.stdout:9/219: write d21/f3e [448094,127436] 0 2026-03-10T06:22:13.261 INFO:tasks.workunit.client.1.vm06.stdout:2/236: truncate f7 680238 0 2026-03-10T06:22:13.263 INFO:tasks.workunit.client.1.vm06.stdout:6/278: write d6/dd/d35/f2d [4405121,29289] 0 2026-03-10T06:22:13.263 INFO:tasks.workunit.client.1.vm06.stdout:2/237: chown da/d13/d1a/l36 1815 1 2026-03-10T06:22:13.263 INFO:tasks.workunit.client.1.vm06.stdout:1/265: getdents d9/df 0 2026-03-10T06:22:13.263 INFO:tasks.workunit.client.1.vm06.stdout:1/266: chown d9/f17 469570 1 2026-03-10T06:22:13.263 INFO:tasks.workunit.client.1.vm06.stdout:2/238: write da/d13/d1c/d1d/f26 [3349666,126792] 0 2026-03-10T06:22:13.264 INFO:tasks.workunit.client.1.vm06.stdout:2/239: dread - da/d13/d1c/f41 zero size 2026-03-10T06:22:13.265 INFO:tasks.workunit.client.1.vm06.stdout:2/240: read da/d13/d1a/d39/f2f [2744725,128622] 0 2026-03-10T06:22:13.273 INFO:tasks.workunit.client.1.vm06.stdout:2/241: truncate da/d13/d1a/d39/d35/f4a 1047036 0 2026-03-10T06:22:13.275 
INFO:tasks.workunit.client.1.vm06.stdout:7/276: chown d19/l34 985278 1 2026-03-10T06:22:13.281 INFO:tasks.workunit.client.1.vm06.stdout:7/277: dread d19/f33 [0,4194304] 0 2026-03-10T06:22:13.286 INFO:tasks.workunit.client.1.vm06.stdout:4/203: unlink dd/d18/f26 0 2026-03-10T06:22:13.289 INFO:tasks.workunit.client.1.vm06.stdout:5/196: rename d8/db/f25 to d8/d9/d1e/f42 0 2026-03-10T06:22:13.289 INFO:tasks.workunit.client.1.vm06.stdout:8/214: mknod d1/df/d20/d35/c44 0 2026-03-10T06:22:13.290 INFO:tasks.workunit.client.1.vm06.stdout:6/279: stat d6/df/c1c 0 2026-03-10T06:22:13.290 INFO:tasks.workunit.client.1.vm06.stdout:0/240: mkdir d0/d3c/d42/d5e 0 2026-03-10T06:22:13.290 INFO:tasks.workunit.client.1.vm06.stdout:9/220: mkdir d21/d32/d4d 0 2026-03-10T06:22:13.292 INFO:tasks.workunit.client.1.vm06.stdout:1/267: creat d9/d1b/d20/f43 x:0 0 0 2026-03-10T06:22:13.293 INFO:tasks.workunit.client.1.vm06.stdout:1/268: fsync d9/d1b/d20/f25 0 2026-03-10T06:22:13.297 INFO:tasks.workunit.client.1.vm06.stdout:2/242: mkdir da/d13/d1a/d39/d4b 0 2026-03-10T06:22:13.299 INFO:tasks.workunit.client.1.vm06.stdout:5/197: dwrite d8/d9/f11 [4194304,4194304] 0 2026-03-10T06:22:13.311 INFO:tasks.workunit.client.1.vm06.stdout:5/198: sync 2026-03-10T06:22:13.311 INFO:tasks.workunit.client.1.vm06.stdout:8/215: creat d1/df/d11/f45 x:0 0 0 2026-03-10T06:22:13.312 INFO:tasks.workunit.client.1.vm06.stdout:8/216: truncate d1/f26 960456 0 2026-03-10T06:22:13.313 INFO:tasks.workunit.client.1.vm06.stdout:6/280: creat d6/dd/d25/d4e/f60 x:0 0 0 2026-03-10T06:22:13.313 INFO:tasks.workunit.client.1.vm06.stdout:6/281: readlink d6/df/l51 0 2026-03-10T06:22:13.315 INFO:tasks.workunit.client.1.vm06.stdout:1/269: mkdir d9/d1b/d20/d44 0 2026-03-10T06:22:13.316 INFO:tasks.workunit.client.1.vm06.stdout:6/282: dread d6/dd/d25/d2c/f32 [0,4194304] 0 2026-03-10T06:22:13.319 INFO:tasks.workunit.client.1.vm06.stdout:6/283: chown d6/dd/d25/d2c 654 1 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:22:13 vm06.local ceph-mon[58974]: pgmap v5: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:13 vm06.local ceph-mon[58974]: mgrmap e30: vm04.exdvdb(active, since 4s) 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.321 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.321 INFO:tasks.workunit.client.1.vm06.stdout:6/284: stat d6/dd/d25/d33/l57 0 2026-03-10T06:22:13.323 INFO:tasks.workunit.client.1.vm06.stdout:2/243: mknod da/d13/d1a/d39/d35/c4c 0 2026-03-10T06:22:13.323 INFO:tasks.workunit.client.1.vm06.stdout:7/278: mknod d19/c56 0 2026-03-10T06:22:13.340 INFO:tasks.workunit.client.1.vm06.stdout:5/199: dread d8/d20/d22/f31 [0,4194304] 0 2026-03-10T06:22:13.341 INFO:tasks.workunit.client.1.vm06.stdout:3/220: dread d6/d21/f31 [0,4194304] 0 2026-03-10T06:22:13.342 INFO:tasks.workunit.client.1.vm06.stdout:3/221: dread - d6/dc/f3f zero size 2026-03-10T06:22:13.351 INFO:tasks.workunit.client.1.vm06.stdout:6/285: mknod d6/dd/d25/d2c/c61 0 2026-03-10T06:22:13.351 INFO:tasks.workunit.client.1.vm06.stdout:6/286: chown d6/dd/d35/f3c 683 1 2026-03-10T06:22:13.352 INFO:tasks.workunit.client.1.vm06.stdout:7/279: symlink d19/d3b/l57 0 2026-03-10T06:22:13.359 INFO:tasks.workunit.client.1.vm06.stdout:4/204: link dd/f14 dd/d24/d2d/f3b 0 2026-03-10T06:22:13.359 INFO:tasks.workunit.client.1.vm06.stdout:4/205: read - dd/d33/f37 zero size 
2026-03-10T06:22:13.360 INFO:tasks.workunit.client.1.vm06.stdout:5/200: unlink f5 0 2026-03-10T06:22:13.384 INFO:tasks.workunit.client.1.vm06.stdout:9/221: link d21/f49 d21/d46/f4e 0 2026-03-10T06:22:13.390 INFO:tasks.workunit.client.1.vm06.stdout:1/270: rename d9/d1b/f40 to d9/d35/f45 0 2026-03-10T06:22:13.393 INFO:tasks.workunit.client.1.vm06.stdout:7/280: mknod d19/d3b/c58 0 2026-03-10T06:22:13.402 INFO:tasks.workunit.client.1.vm06.stdout:5/201: mknod d8/d20/c43 0 2026-03-10T06:22:13.403 INFO:tasks.workunit.client.1.vm06.stdout:5/202: stat d8/d9/d1e/f42 0 2026-03-10T06:22:13.405 INFO:tasks.workunit.client.1.vm06.stdout:3/222: creat d6/d21/d38/d39/f4c x:0 0 0 2026-03-10T06:22:13.410 INFO:tasks.workunit.client.1.vm06.stdout:9/222: readlink l7 0 2026-03-10T06:22:13.411 INFO:tasks.workunit.client.1.vm06.stdout:2/244: rename da/d13/c18 to da/d13/d1c/d1d/d44/d48/c4d 0 2026-03-10T06:22:13.412 INFO:tasks.workunit.client.1.vm06.stdout:0/241: getdents d0/dd/d1b/d3d/d50 0 2026-03-10T06:22:13.419 INFO:tasks.workunit.client.1.vm06.stdout:7/281: rmdir d19/d3b/d41/d42 39 2026-03-10T06:22:13.419 INFO:tasks.workunit.client.1.vm06.stdout:8/217: truncate d1/df/d20/d21/f2b 1076832 0 2026-03-10T06:22:13.420 INFO:tasks.workunit.client.1.vm06.stdout:8/218: chown d1/d3b/c40 3326 1 2026-03-10T06:22:13.420 INFO:tasks.workunit.client.1.vm06.stdout:5/203: creat d8/d20/d22/d39/f44 x:0 0 0 2026-03-10T06:22:13.422 INFO:tasks.workunit.client.1.vm06.stdout:3/223: rmdir d6/dc/d2a 39 2026-03-10T06:22:13.429 INFO:tasks.workunit.client.1.vm06.stdout:8/219: dwrite d1/f18 [0,4194304] 0 2026-03-10T06:22:13.437 INFO:tasks.workunit.client.1.vm06.stdout:1/271: write d9/f2f [143056,104286] 0 2026-03-10T06:22:13.439 INFO:tasks.workunit.client.1.vm06.stdout:0/242: readlink d0/dd/d2d/l4b 0 2026-03-10T06:22:13.440 INFO:tasks.workunit.client.1.vm06.stdout:2/245: dread da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:13.440 INFO:tasks.workunit.client.1.vm06.stdout:4/206: link dd/d24/c38 dd/d24/d2d/c3c 0 
2026-03-10T06:22:13.440 INFO:tasks.workunit.client.1.vm06.stdout:6/287: write d6/fc [1466017,100344] 0 2026-03-10T06:22:13.447 INFO:tasks.workunit.client.1.vm06.stdout:7/282: truncate f13 4703673 0 2026-03-10T06:22:13.448 INFO:tasks.workunit.client.1.vm06.stdout:5/204: dread - d8/d9/d1e/f36 zero size 2026-03-10T06:22:13.448 INFO:tasks.workunit.client.1.vm06.stdout:9/223: symlink d21/d46/l4f 0 2026-03-10T06:22:13.449 INFO:tasks.workunit.client.1.vm06.stdout:3/224: creat d6/d21/f4d x:0 0 0 2026-03-10T06:22:13.449 INFO:tasks.workunit.client.1.vm06.stdout:3/225: write d6/f44 [917601,104732] 0 2026-03-10T06:22:13.452 INFO:tasks.workunit.client.1.vm06.stdout:1/272: rmdir d9/d35/d36 39 2026-03-10T06:22:13.456 INFO:tasks.workunit.client.1.vm06.stdout:4/207: write f8 [7927411,24852] 0 2026-03-10T06:22:13.457 INFO:tasks.workunit.client.1.vm06.stdout:5/205: dread - d8/d9/d1e/f3b zero size 2026-03-10T06:22:13.463 INFO:tasks.workunit.client.1.vm06.stdout:9/224: unlink d21/f22 0 2026-03-10T06:22:13.464 INFO:tasks.workunit.client.1.vm06.stdout:2/246: fdatasync da/d13/d1a/f21 0 2026-03-10T06:22:13.465 INFO:tasks.workunit.client.1.vm06.stdout:4/208: creat dd/d24/f3d x:0 0 0 2026-03-10T06:22:13.465 INFO:tasks.workunit.client.1.vm06.stdout:4/209: chown f8 571426785 1 2026-03-10T06:22:13.470 INFO:tasks.workunit.client.1.vm06.stdout:4/210: dwrite dd/d33/f37 [0,4194304] 0 2026-03-10T06:22:13.473 INFO:tasks.workunit.client.1.vm06.stdout:5/206: unlink d8/d9/l3a 0 2026-03-10T06:22:13.474 INFO:tasks.workunit.client.1.vm06.stdout:2/247: mknod da/d13/d1a/d39/d35/c4e 0 2026-03-10T06:22:13.475 INFO:tasks.workunit.client.1.vm06.stdout:2/248: chown da/ff 16 1 2026-03-10T06:22:13.475 INFO:tasks.workunit.client.1.vm06.stdout:2/249: chown da/d13/d1a/d39/d35 7 1 2026-03-10T06:22:13.477 INFO:tasks.workunit.client.1.vm06.stdout:6/288: link d6/dd/d25/d4e/f60 d6/f62 0 2026-03-10T06:22:13.478 INFO:tasks.workunit.client.1.vm06.stdout:6/289: dread - d6/dd/d25/d4e/f60 zero size 2026-03-10T06:22:13.478 
INFO:tasks.workunit.client.1.vm06.stdout:5/207: creat d8/db/f45 x:0 0 0 2026-03-10T06:22:13.485 INFO:tasks.workunit.client.1.vm06.stdout:5/208: write d8/d9/d1e/f3e [1009253,127403] 0 2026-03-10T06:22:13.490 INFO:tasks.workunit.client.1.vm06.stdout:5/209: readlink d8/d9/l13 0 2026-03-10T06:22:13.491 INFO:tasks.workunit.client.1.vm06.stdout:5/210: dread d8/ff [0,4194304] 0 2026-03-10T06:22:13.491 INFO:tasks.workunit.client.1.vm06.stdout:2/250: write da/d13/d1c/f3b [1807773,124586] 0 2026-03-10T06:22:13.495 INFO:tasks.workunit.client.1.vm06.stdout:0/243: dread d0/dd/f49 [0,4194304] 0 2026-03-10T06:22:13.495 INFO:tasks.workunit.client.1.vm06.stdout:2/251: dwrite da/d13/d1c/d1d/f2a [0,4194304] 0 2026-03-10T06:22:13.496 INFO:tasks.workunit.client.1.vm06.stdout:2/252: chown da/d13/d1c/d1d/d44 895 1 2026-03-10T06:22:13.497 INFO:tasks.workunit.client.1.vm06.stdout:2/253: write da/d13/d1c/f41 [217807,83204] 0 2026-03-10T06:22:13.504 INFO:tasks.workunit.client.1.vm06.stdout:0/244: dwrite d0/dd/d14/d1d/f53 [0,4194304] 0 2026-03-10T06:22:13.509 INFO:tasks.workunit.client.1.vm06.stdout:8/220: write f0 [1990443,119552] 0 2026-03-10T06:22:13.512 INFO:tasks.workunit.client.1.vm06.stdout:1/273: dwrite d9/f1a [4194304,4194304] 0 2026-03-10T06:22:13.515 INFO:tasks.workunit.client.1.vm06.stdout:5/211: chown d8/l10 1129287119 1 2026-03-10T06:22:13.515 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: pgmap v5: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:13.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: mgrmap e30: vm04.exdvdb(active, since 4s) 2026-03-10T06:22:13.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 
2026-03-10T06:22:13.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.516 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:13.516 INFO:tasks.workunit.client.1.vm06.stdout:2/254: mknod da/d13/d1a/d39/c4f 0 2026-03-10T06:22:13.518 INFO:tasks.workunit.client.1.vm06.stdout:2/255: chown da/d13/d1a/d39/f2f 113852 1 2026-03-10T06:22:13.523 INFO:tasks.workunit.client.1.vm06.stdout:3/226: truncate d6/d8/fb 2601956 0 2026-03-10T06:22:13.524 INFO:tasks.workunit.client.1.vm06.stdout:9/225: truncate d21/f2a 3883240 0 2026-03-10T06:22:13.524 INFO:tasks.workunit.client.1.vm06.stdout:8/221: symlink d1/df/d20/d35/l46 0 2026-03-10T06:22:13.524 INFO:tasks.workunit.client.1.vm06.stdout:4/211: write f6 [1350787,34181] 0 2026-03-10T06:22:13.531 INFO:tasks.workunit.client.1.vm06.stdout:4/212: dwrite f8 [4194304,4194304] 0 2026-03-10T06:22:13.536 INFO:tasks.workunit.client.1.vm06.stdout:4/213: write dd/d24/d2d/f35 [608320,93272] 0 2026-03-10T06:22:13.539 INFO:tasks.workunit.client.1.vm06.stdout:1/274: write d9/d35/f45 [4312843,29339] 0 2026-03-10T06:22:13.540 INFO:tasks.workunit.client.1.vm06.stdout:5/212: fdatasync d8/d9/d1e/f37 0 2026-03-10T06:22:13.541 INFO:tasks.workunit.client.1.vm06.stdout:9/226: dwrite d21/d32/f3f [0,4194304] 0 2026-03-10T06:22:13.556 INFO:tasks.workunit.client.1.vm06.stdout:4/214: read dd/ff [3403000,130124] 0 2026-03-10T06:22:13.559 INFO:tasks.workunit.client.1.vm06.stdout:0/245: creat d0/dd/d14/d1d/d5d/f5f x:0 0 0 2026-03-10T06:22:13.563 INFO:tasks.workunit.client.1.vm06.stdout:0/246: truncate d0/dd/d14/f31 4859391 0 2026-03-10T06:22:13.568 INFO:tasks.workunit.client.1.vm06.stdout:7/283: dread f13 [0,4194304] 0 2026-03-10T06:22:13.577 INFO:tasks.workunit.client.1.vm06.stdout:5/213: rmdir d8/d20/d22/d39 39 2026-03-10T06:22:13.586 
INFO:tasks.workunit.client.1.vm06.stdout:9/227: stat l18 0 2026-03-10T06:22:13.586 INFO:tasks.workunit.client.1.vm06.stdout:0/247: fdatasync d0/dd/f32 0 2026-03-10T06:22:13.586 INFO:tasks.workunit.client.1.vm06.stdout:0/248: readlink d0/dd/d14/l17 0 2026-03-10T06:22:13.591 INFO:tasks.workunit.client.1.vm06.stdout:5/214: mkdir d8/d20/d46 0 2026-03-10T06:22:13.594 INFO:tasks.workunit.client.1.vm06.stdout:6/290: dread d6/df/f1e [0,4194304] 0 2026-03-10T06:22:13.597 INFO:tasks.workunit.client.1.vm06.stdout:7/284: mknod d19/c59 0 2026-03-10T06:22:13.597 INFO:tasks.workunit.client.1.vm06.stdout:0/249: creat d0/d3c/d42/f60 x:0 0 0 2026-03-10T06:22:13.597 INFO:tasks.workunit.client.1.vm06.stdout:0/250: chown d0/f9 5262789 1 2026-03-10T06:22:13.598 INFO:tasks.workunit.client.1.vm06.stdout:0/251: fsync d0/dd/f4c 0 2026-03-10T06:22:13.600 INFO:tasks.workunit.client.1.vm06.stdout:9/228: mkdir d21/d27/d50 0 2026-03-10T06:22:13.605 INFO:tasks.workunit.client.1.vm06.stdout:1/275: rename d9/d35/d36 to d9/d35/d46 0 2026-03-10T06:22:13.612 INFO:tasks.workunit.client.1.vm06.stdout:9/229: dwrite ff [4194304,4194304] 0 2026-03-10T06:22:13.615 INFO:tasks.workunit.client.1.vm06.stdout:9/230: stat d21/d46 0 2026-03-10T06:22:13.615 INFO:tasks.workunit.client.1.vm06.stdout:4/215: sync 2026-03-10T06:22:13.616 INFO:tasks.workunit.client.1.vm06.stdout:6/291: sync 2026-03-10T06:22:13.616 INFO:tasks.workunit.client.1.vm06.stdout:1/276: sync 2026-03-10T06:22:13.631 INFO:tasks.workunit.client.1.vm06.stdout:1/277: dwrite d9/df/f41 [0,4194304] 0 2026-03-10T06:22:13.636 INFO:tasks.workunit.client.1.vm06.stdout:6/292: dwrite d6/dd/d25/d33/f5d [0,4194304] 0 2026-03-10T06:22:13.636 INFO:tasks.workunit.client.1.vm06.stdout:6/293: chown d6/dd/d25/d33/d5a 356551 1 2026-03-10T06:22:13.637 INFO:tasks.workunit.client.1.vm06.stdout:6/294: dread - d6/f62 zero size 2026-03-10T06:22:13.651 INFO:tasks.workunit.client.1.vm06.stdout:5/215: creat d8/d9/f47 x:0 0 0 2026-03-10T06:22:13.655 
INFO:tasks.workunit.client.1.vm06.stdout:2/256: dread da/f19 [0,4194304] 0 2026-03-10T06:22:13.666 INFO:tasks.workunit.client.1.vm06.stdout:3/227: dwrite f0 [0,4194304] 0 2026-03-10T06:22:13.672 INFO:tasks.workunit.client.1.vm06.stdout:6/295: symlink d6/dd/d25/d2c/l63 0 2026-03-10T06:22:13.676 INFO:tasks.workunit.client.1.vm06.stdout:7/285: dwrite d19/f20 [0,4194304] 0 2026-03-10T06:22:13.682 INFO:tasks.workunit.client.1.vm06.stdout:0/252: creat d0/f61 x:0 0 0 2026-03-10T06:22:13.690 INFO:tasks.workunit.client.1.vm06.stdout:8/222: unlink d1/df/d20/d21/f2b 0 2026-03-10T06:22:13.690 INFO:tasks.workunit.client.1.vm06.stdout:7/286: dwrite d19/d3b/f3c [0,4194304] 0 2026-03-10T06:22:13.699 INFO:tasks.workunit.client.1.vm06.stdout:3/228: creat d6/dc/d13/d35/f4e x:0 0 0 2026-03-10T06:22:13.701 INFO:tasks.workunit.client.1.vm06.stdout:4/216: dread dd/f29 [0,4194304] 0 2026-03-10T06:22:13.704 INFO:tasks.workunit.client.1.vm06.stdout:6/296: symlink d6/d7/d37/l64 0 2026-03-10T06:22:13.704 INFO:tasks.workunit.client.1.vm06.stdout:6/297: fdatasync f3 0 2026-03-10T06:22:13.704 INFO:tasks.workunit.client.1.vm06.stdout:6/298: write d6/d7/f36 [1301528,112794] 0 2026-03-10T06:22:13.708 INFO:tasks.workunit.client.1.vm06.stdout:6/299: dwrite d6/dd/d25/d33/f5d [0,4194304] 0 2026-03-10T06:22:13.718 INFO:tasks.workunit.client.1.vm06.stdout:5/216: link d8/d9/d1e/f37 d8/db/f48 0 2026-03-10T06:22:13.719 INFO:tasks.workunit.client.1.vm06.stdout:8/223: creat d1/df/d11/f47 x:0 0 0 2026-03-10T06:22:13.720 INFO:tasks.workunit.client.1.vm06.stdout:2/257: mknod da/c50 0 2026-03-10T06:22:13.728 INFO:tasks.workunit.client.1.vm06.stdout:0/253: mknod d0/dd/d14/c62 0 2026-03-10T06:22:13.729 INFO:tasks.workunit.client.1.vm06.stdout:8/224: creat d1/df/d11/f48 x:0 0 0 2026-03-10T06:22:13.730 INFO:tasks.workunit.client.1.vm06.stdout:8/225: fsync d1/df/d20/fe 0 2026-03-10T06:22:13.730 INFO:tasks.workunit.client.1.vm06.stdout:7/287: mknod d19/d3b/d41/c5a 0 2026-03-10T06:22:13.732 
INFO:tasks.workunit.client.1.vm06.stdout:2/258: readlink l3 0
2026-03-10T06:22:13.732 INFO:tasks.workunit.client.1.vm06.stdout:2/259: stat da/d13/d1a/d39/d35/c4e 0
2026-03-10T06:22:13.736 INFO:tasks.workunit.client.1.vm06.stdout:3/229: mkdir d6/d4f 0
2026-03-10T06:22:13.736 INFO:tasks.workunit.client.1.vm06.stdout:4/217: mknod dd/d24/d2d/d2f/d39/c3e 0
2026-03-10T06:22:13.740 INFO:tasks.workunit.client.1.vm06.stdout:0/254: dwrite d0/dd/f4c [0,4194304] 0
2026-03-10T06:22:13.749 INFO:tasks.workunit.client.1.vm06.stdout:4/218: stat dd/f29 0
2026-03-10T06:22:13.749 INFO:tasks.workunit.client.1.vm06.stdout:7/288: mkdir d19/d3b/d5b 0
2026-03-10T06:22:13.749 INFO:tasks.workunit.client.1.vm06.stdout:8/226: creat d1/d3b/f49 x:0 0 0
2026-03-10T06:22:13.749 INFO:tasks.workunit.client.1.vm06.stdout:5/217: creat d8/f49 x:0 0 0
2026-03-10T06:22:13.749 INFO:tasks.workunit.client.1.vm06.stdout:3/230: creat d6/d21/d38/f50 x:0 0 0
2026-03-10T06:22:13.750 INFO:tasks.workunit.client.1.vm06.stdout:0/255: chown d0/fa 62374 1
2026-03-10T06:22:13.750 INFO:tasks.workunit.client.1.vm06.stdout:8/227: creat d1/df/d11/f4a x:0 0 0
2026-03-10T06:22:13.750 INFO:tasks.workunit.client.1.vm06.stdout:5/218: creat d8/d20/f4a x:0 0 0
2026-03-10T06:22:13.753 INFO:tasks.workunit.client.1.vm06.stdout:0/256: truncate d0/dd/d1b/f4e 722980 0
2026-03-10T06:22:13.754 INFO:tasks.workunit.client.1.vm06.stdout:7/289: mknod d19/d3b/d5b/c5c 0
2026-03-10T06:22:13.763 INFO:tasks.workunit.client.1.vm06.stdout:7/290: dwrite f9 [0,4194304] 0
2026-03-10T06:22:13.770 INFO:tasks.workunit.client.1.vm06.stdout:5/219: creat d8/d9/f4b x:0 0 0
2026-03-10T06:22:13.770 INFO:tasks.workunit.client.1.vm06.stdout:3/231: dwrite d6/d21/f2c [4194304,4194304] 0
2026-03-10T06:22:13.772 INFO:tasks.workunit.client.1.vm06.stdout:8/228: dwrite d1/f4 [0,4194304] 0
2026-03-10T06:22:13.775 INFO:tasks.workunit.client.1.vm06.stdout:0/257: link d0/dd/d1c/c2a d0/dd/d2d/d47/c63 0
2026-03-10T06:22:13.779 INFO:tasks.workunit.client.1.vm06.stdout:6/300: dread d6/d7/f20 [0,4194304] 0
2026-03-10T06:22:13.783 INFO:tasks.workunit.client.1.vm06.stdout:7/291: dwrite d19/f1d [0,4194304] 0
2026-03-10T06:22:13.785 INFO:tasks.workunit.client.1.vm06.stdout:3/232: mkdir d6/dc/d13/d51 0
2026-03-10T06:22:13.787 INFO:tasks.workunit.client.1.vm06.stdout:3/233: fsync d6/f1c 0
2026-03-10T06:22:13.790 INFO:tasks.workunit.client.1.vm06.stdout:6/301: creat d6/d7/d37/f65 x:0 0 0
2026-03-10T06:22:13.791 INFO:tasks.workunit.client.1.vm06.stdout:0/258: dwrite d0/ff [0,4194304] 0
2026-03-10T06:22:13.795 INFO:tasks.workunit.client.1.vm06.stdout:6/302: write d6/dd/d25/f3f [2127954,42423] 0
2026-03-10T06:22:13.796 INFO:tasks.workunit.client.1.vm06.stdout:5/220: link d8/d9/f4b d8/d9/d1e/f4c 0
2026-03-10T06:22:13.797 INFO:tasks.workunit.client.1.vm06.stdout:3/234: creat d6/d8/f52 x:0 0 0
2026-03-10T06:22:13.801 INFO:tasks.workunit.client.1.vm06.stdout:6/303: dread d6/dd/d25/d33/f5d [0,4194304] 0
2026-03-10T06:22:13.808 INFO:tasks.workunit.client.1.vm06.stdout:3/235: chown d6/dc/d13/c47 46888835 1
2026-03-10T06:22:13.819 INFO:tasks.workunit.client.1.vm06.stdout:5/221: sync
2026-03-10T06:22:13.821 INFO:tasks.workunit.client.1.vm06.stdout:0/259: read d0/dd/d14/d18/f22 [2730676,33816] 0
2026-03-10T06:22:13.825 INFO:tasks.workunit.client.1.vm06.stdout:7/292: link d19/l21 d19/d3b/d5b/l5d 0
2026-03-10T06:22:13.848 INFO:tasks.workunit.client.1.vm06.stdout:7/293: chown d19/d3b/d41/d42/l46 228 1
2026-03-10T06:22:13.851 INFO:tasks.workunit.client.1.vm06.stdout:7/294: symlink d19/d3b/l5e 0
2026-03-10T06:22:13.852 INFO:tasks.workunit.client.1.vm06.stdout:5/222: creat d8/d20/d22/f4d x:0 0 0
2026-03-10T06:22:13.854 INFO:tasks.workunit.client.1.vm06.stdout:7/295: chown d19/c3d 5702 1
2026-03-10T06:22:13.859 INFO:tasks.workunit.client.1.vm06.stdout:0/260: getdents d0/dd/d1b 0
2026-03-10T06:22:13.860 INFO:tasks.workunit.client.1.vm06.stdout:3/236: getdents d6/dc/d2a 0
2026-03-10T06:22:13.862 INFO:tasks.workunit.client.1.vm06.stdout:7/296: rmdir d19/d3b/d41/d42 39
2026-03-10T06:22:13.864 INFO:tasks.workunit.client.1.vm06.stdout:0/261: rename d0/dd/d2d/d47/d4d/l59 to d0/l64 0
2026-03-10T06:22:13.864 INFO:tasks.workunit.client.1.vm06.stdout:0/262: stat d0/d3c/d42/l4a 0
2026-03-10T06:22:13.865 INFO:tasks.workunit.client.1.vm06.stdout:7/297: chown d19/d3b/d41/d42/c4f 11417420 1
2026-03-10T06:22:13.865 INFO:tasks.workunit.client.1.vm06.stdout:0/263: fdatasync d0/dd/f48 0
2026-03-10T06:22:13.866 INFO:tasks.workunit.client.1.vm06.stdout:7/298: stat d19/d3b/d41/d42/c44 0
2026-03-10T06:22:13.868 INFO:tasks.workunit.client.1.vm06.stdout:7/299: rename d19/d3b/c58 to d19/d3b/c5f 0
2026-03-10T06:22:13.868 INFO:tasks.workunit.client.1.vm06.stdout:3/237: dwrite d6/d21/d38/f3d [0,4194304] 0
2026-03-10T06:22:13.872 INFO:tasks.workunit.client.1.vm06.stdout:7/300: dwrite d19/d3b/f43 [0,4194304] 0
2026-03-10T06:22:13.882 INFO:tasks.workunit.client.1.vm06.stdout:0/264: creat d0/dd/d14/f65 x:0 0 0
2026-03-10T06:22:13.883 INFO:tasks.workunit.client.1.vm06.stdout:0/265: chown d0/dd 2 1
2026-03-10T06:22:13.894 INFO:tasks.workunit.client.1.vm06.stdout:7/301: dwrite d19/f1a [0,4194304] 0
2026-03-10T06:22:13.906 INFO:tasks.workunit.client.1.vm06.stdout:7/302: symlink d19/l60 0
2026-03-10T06:22:13.912 INFO:tasks.workunit.client.1.vm06.stdout:1/278: truncate d9/f39 247838 0
2026-03-10T06:22:13.917 INFO:tasks.workunit.client.1.vm06.stdout:7/303: mkdir d19/d3b/d41/d61 0
2026-03-10T06:22:13.924 INFO:tasks.workunit.client.1.vm06.stdout:8/229: rmdir d1/df/d11 39
2026-03-10T06:22:13.924 INFO:tasks.workunit.client.1.vm06.stdout:8/230: chown d1/df/d20/d21/l2d 0 1
2026-03-10T06:22:13.924 INFO:tasks.workunit.client.1.vm06.stdout:2/260: dwrite da/fe [0,4194304] 0
2026-03-10T06:22:13.924 INFO:tasks.workunit.client.1.vm06.stdout:1/279: dwrite d9/d1b/f1d [0,4194304] 0
2026-03-10T06:22:13.924 INFO:tasks.workunit.client.1.vm06.stdout:1/280: chown d9/c1c 703029136 1
2026-03-10T06:22:13.926 INFO:tasks.workunit.client.1.vm06.stdout:4/219: truncate dd/d24/d2d/f21 1499610 0
2026-03-10T06:22:13.926 INFO:tasks.workunit.client.1.vm06.stdout:8/231: mknod d1/df/d20/d35/c4b 0
2026-03-10T06:22:13.926 INFO:tasks.workunit.client.1.vm06.stdout:9/231: truncate fe 4733784 0
2026-03-10T06:22:13.937 INFO:tasks.workunit.client.1.vm06.stdout:2/261: unlink da/d13/c1b 0
2026-03-10T06:22:13.937 INFO:tasks.workunit.client.1.vm06.stdout:7/304: rmdir d19/d3b/d41/d61 0
2026-03-10T06:22:13.938 INFO:tasks.workunit.client.1.vm06.stdout:8/232: write d1/df/d11/f1d [2093019,84784] 0
2026-03-10T06:22:13.942 INFO:tasks.workunit.client.1.vm06.stdout:9/232: mkdir d21/d32/d4d/d51 0
2026-03-10T06:22:13.943 INFO:tasks.workunit.client.1.vm06.stdout:1/281: dread d9/d1b/d20/f25 [0,4194304] 0
2026-03-10T06:22:13.943 INFO:tasks.workunit.client.1.vm06.stdout:8/233: mknod d1/d2c/c4c 0
2026-03-10T06:22:13.943 INFO:tasks.workunit.client.1.vm06.stdout:2/262: mkdir da/d13/d51 0
2026-03-10T06:22:13.949 INFO:tasks.workunit.client.1.vm06.stdout:1/282: chown d9/df/f2a 889750 1
2026-03-10T06:22:13.949 INFO:tasks.workunit.client.1.vm06.stdout:9/233: dwrite f11 [0,4194304] 0
2026-03-10T06:22:13.950 INFO:tasks.workunit.client.1.vm06.stdout:2/263: truncate da/d13/f1f 1246838 0
2026-03-10T06:22:13.950 INFO:tasks.workunit.client.1.vm06.stdout:1/283: write d9/f17 [2711188,3155] 0
2026-03-10T06:22:13.951 INFO:tasks.workunit.client.1.vm06.stdout:1/284: read d9/f1f [3988675,130053] 0
2026-03-10T06:22:13.952 INFO:tasks.workunit.client.1.vm06.stdout:1/285: chown d9/d1b/d20/c2d 0 1
2026-03-10T06:22:13.961 INFO:tasks.workunit.client.1.vm06.stdout:2/264: unlink da/d13/d1c/f3b 0
2026-03-10T06:22:13.962 INFO:tasks.workunit.client.1.vm06.stdout:2/265: dread - da/ff zero size
2026-03-10T06:22:13.963 INFO:tasks.workunit.client.1.vm06.stdout:9/234: rename f19 to d21/d32/f52 0
2026-03-10T06:22:13.963 INFO:tasks.workunit.client.1.vm06.stdout:2/266: truncate da/d13/d1a/f27 2805678 0
2026-03-10T06:22:13.968 INFO:tasks.workunit.client.1.vm06.stdout:1/286: dread d9/df/f2a [0,4194304] 0
2026-03-10T06:22:13.978 INFO:tasks.workunit.client.1.vm06.stdout:1/287: dread d9/d1b/d20/f24 [0,4194304] 0
2026-03-10T06:22:13.978 INFO:tasks.workunit.client.1.vm06.stdout:9/235: chown d21/d24/l25 9360680 1
2026-03-10T06:22:13.979 INFO:tasks.workunit.client.1.vm06.stdout:9/236: write f20 [150739,126750] 0
2026-03-10T06:22:13.983 INFO:tasks.workunit.client.1.vm06.stdout:9/237: read d21/d32/f52 [1582820,103861] 0
2026-03-10T06:22:13.986 INFO:tasks.workunit.client.1.vm06.stdout:9/238: dread d21/d32/f3f [0,4194304] 0
2026-03-10T06:22:13.986 INFO:tasks.workunit.client.1.vm06.stdout:9/239: chown d21/d32/d4d 612 1
2026-03-10T06:22:13.994 INFO:tasks.workunit.client.1.vm06.stdout:2/267: creat da/d13/f52 x:0 0 0
2026-03-10T06:22:13.997 INFO:tasks.workunit.client.1.vm06.stdout:9/240: rename d21/d27/l29 to d21/d24/l53 0
2026-03-10T06:22:14.001 INFO:tasks.workunit.client.1.vm06.stdout:9/241: mknod d21/c54 0
2026-03-10T06:22:14.001 INFO:tasks.workunit.client.1.vm06.stdout:9/242: write d21/d27/f39 [7627,106213] 0
2026-03-10T06:22:14.002 INFO:tasks.workunit.client.1.vm06.stdout:2/268: getdents da/d13/d1c/d1d 0
2026-03-10T06:22:14.018 INFO:tasks.workunit.client.1.vm06.stdout:9/243: dread fd [0,4194304] 0
2026-03-10T06:22:14.026 INFO:tasks.workunit.client.1.vm06.stdout:9/244: dread d21/f34 [0,4194304] 0
2026-03-10T06:22:14.027 INFO:tasks.workunit.client.1.vm06.stdout:9/245: write d21/d27/f39 [876520,111116] 0
2026-03-10T06:22:14.027 INFO:tasks.workunit.client.1.vm06.stdout:9/246: fsync f1b 0
2026-03-10T06:22:14.028 INFO:tasks.workunit.client.1.vm06.stdout:5/223: rmdir d8/d9 39
2026-03-10T06:22:14.031 INFO:tasks.workunit.client.1.vm06.stdout:6/304: write d6/d7/d37/f3d [1670680,19671] 0
2026-03-10T06:22:14.031 INFO:tasks.workunit.client.1.vm06.stdout:6/305: dwrite d6/dd/d35/f2d [4194304,4194304] 0
2026-03-10T06:22:14.046 INFO:tasks.workunit.client.1.vm06.stdout:6/306: mknod d6/df/d40/c66 0
2026-03-10T06:22:14.047 INFO:tasks.workunit.client.1.vm06.stdout:9/247: symlink d21/d27/d50/l55 0
2026-03-10T06:22:14.048 INFO:tasks.workunit.client.1.vm06.stdout:5/224: read d8/db/f18 [3015927,105593] 0
2026-03-10T06:22:14.048 INFO:tasks.workunit.client.1.vm06.stdout:3/238: dwrite d6/d21/f31 [0,4194304] 0
2026-03-10T06:22:14.051 INFO:tasks.workunit.client.1.vm06.stdout:6/307: unlink d6/dd/d25/d4e/c54 0
2026-03-10T06:22:14.055 INFO:tasks.workunit.client.1.vm06.stdout:9/248: mkdir d21/d27/d56 0
2026-03-10T06:22:14.056 INFO:tasks.workunit.client.1.vm06.stdout:6/308: mknod d6/df/d40/c67 0
2026-03-10T06:22:14.059 INFO:tasks.workunit.client.1.vm06.stdout:9/249: write f9 [1972696,90426] 0
2026-03-10T06:22:14.059 INFO:tasks.workunit.client.1.vm06.stdout:3/239: dwrite d6/dc/f3f [0,4194304] 0
2026-03-10T06:22:14.078 INFO:tasks.workunit.client.1.vm06.stdout:0/266: write d0/dd/f49 [855319,52323] 0
2026-03-10T06:22:14.078 INFO:tasks.workunit.client.1.vm06.stdout:0/267: chown d0/dd/c45 53152 1
2026-03-10T06:22:14.079 INFO:tasks.workunit.client.1.vm06.stdout:0/268: readlink d0/dd/d14/l17 0
2026-03-10T06:22:14.103 INFO:tasks.workunit.client.1.vm06.stdout:6/309: mknod d6/dd/c68 0
2026-03-10T06:22:14.117 INFO:tasks.workunit.client.1.vm06.stdout:4/220: write dd/ff [1954581,129770] 0
2026-03-10T06:22:14.118 INFO:tasks.workunit.client.1.vm06.stdout:9/250: mkdir d21/d27/d50/d57 0
2026-03-10T06:22:14.122 INFO:tasks.workunit.client.1.vm06.stdout:7/305: dwrite d19/d3b/d41/f49 [0,4194304] 0
2026-03-10T06:22:14.138 INFO:tasks.workunit.client.1.vm06.stdout:4/221: dread f8 [8388608,4194304] 0
2026-03-10T06:22:14.139 INFO:tasks.workunit.client.1.vm06.stdout:5/225: mknod d8/d9/d1e/c4e 0
2026-03-10T06:22:14.141 INFO:tasks.workunit.client.1.vm06.stdout:8/234: write d1/d7/fd [402475,98189] 0
2026-03-10T06:22:14.145 INFO:tasks.workunit.client.1.vm06.stdout:1/288: truncate d9/df/f2a 1113188 0
2026-03-10T06:22:14.152 INFO:tasks.workunit.client.1.vm06.stdout:2/269: dwrite da/d13/d1a/d39/f2f [0,4194304] 0
2026-03-10T06:22:14.157 INFO:tasks.workunit.client.1.vm06.stdout:7/306: mkdir d19/d3b/d41/d42/d62 0
2026-03-10T06:22:14.161 INFO:tasks.workunit.client.1.vm06.stdout:7/307: dwrite d19/f3f [0,4194304] 0
2026-03-10T06:22:14.169 INFO:tasks.workunit.client.1.vm06.stdout:7/308: dwrite f10 [4194304,4194304] 0
2026-03-10T06:22:14.170 INFO:tasks.workunit.client.1.vm06.stdout:7/309: write f10 [1308042,38485] 0
2026-03-10T06:22:14.180 INFO:tasks.workunit.client.1.vm06.stdout:2/270: dread da/fe [4194304,4194304] 0
2026-03-10T06:22:14.182 INFO:tasks.workunit.client.1.vm06.stdout:2/271: read da/d13/d1c/d1d/f2a [4135947,107583] 0
2026-03-10T06:22:14.182 INFO:tasks.workunit.client.1.vm06.stdout:3/240: creat d6/f53 x:0 0 0
2026-03-10T06:22:14.182 INFO:tasks.workunit.client.1.vm06.stdout:8/235: dread d1/d7/fd [0,4194304] 0
2026-03-10T06:22:14.183 INFO:tasks.workunit.client.1.vm06.stdout:8/236: write d1/df/d20/f43 [788287,91711] 0
2026-03-10T06:22:14.193 INFO:tasks.workunit.client.1.vm06.stdout:9/251: creat d21/d27/d50/d57/f58 x:0 0 0
2026-03-10T06:22:14.193 INFO:tasks.workunit.client.1.vm06.stdout:9/252: chown d21/d27/d50 9 1
2026-03-10T06:22:14.206 INFO:tasks.workunit.client.1.vm06.stdout:6/310: creat d6/dd/d25/f69 x:0 0 0
2026-03-10T06:22:14.224 INFO:tasks.workunit.client.1.vm06.stdout:1/289: rename d9/df/f2a to d9/df/f47 0
2026-03-10T06:22:14.225 INFO:tasks.workunit.client.1.vm06.stdout:0/269: truncate d0/dd/f10 1787937 0
2026-03-10T06:22:14.226 INFO:tasks.workunit.client.1.vm06.stdout:0/270: write d0/d3c/d42/f5c [489657,97395] 0
2026-03-10T06:22:14.226 INFO:tasks.workunit.client.1.vm06.stdout:0/271: dread - d0/dd/d14/f65 zero size
2026-03-10T06:22:14.227 INFO:tasks.workunit.client.1.vm06.stdout:7/310: rename d19/f2f to d19/d3b/d41/d42/d52/f63 0
2026-03-10T06:22:14.227 INFO:tasks.workunit.client.1.vm06.stdout:7/311: fdatasync d19/d3b/f43 0
2026-03-10T06:22:14.232 INFO:tasks.workunit.client.1.vm06.stdout:3/241: mkdir d6/dc/d2a/d54 0
2026-03-10T06:22:14.237 INFO:tasks.workunit.client.1.vm06.stdout:2/272: mkdir da/d13/d1c/d1d/d44/d53 0
2026-03-10T06:22:14.242 INFO:tasks.workunit.client.1.vm06.stdout:4/222: dwrite dd/d24/d2d/f3b [4194304,4194304] 0
2026-03-10T06:22:14.243 INFO:tasks.workunit.client.1.vm06.stdout:5/226: truncate d8/d9/d1e/f3e 911133 0
2026-03-10T06:22:14.248 INFO:tasks.workunit.client.1.vm06.stdout:4/223: dwrite dd/f12 [4194304,4194304] 0
2026-03-10T06:22:14.265 INFO:tasks.workunit.client.1.vm06.stdout:0/272: mkdir d0/dd/d14/d18/d66 0
2026-03-10T06:22:14.269 INFO:tasks.workunit.client.1.vm06.stdout:7/312: write d19/f25 [3719365,40991] 0
2026-03-10T06:22:14.269 INFO:tasks.workunit.client.1.vm06.stdout:7/313: read f13 [4285743,93586] 0
2026-03-10T06:22:14.276 INFO:tasks.workunit.client.1.vm06.stdout:3/242: dread d6/d8/fb [0,4194304] 0
2026-03-10T06:22:14.277 INFO:tasks.workunit.client.1.vm06.stdout:3/243: chown d6 0 1
2026-03-10T06:22:14.281 INFO:tasks.workunit.client.1.vm06.stdout:8/237: symlink d1/df/l4d 0
2026-03-10T06:22:14.294 INFO:tasks.workunit.client.1.vm06.stdout:3/244: creat d6/d21/f55 x:0 0 0
2026-03-10T06:22:14.294 INFO:tasks.workunit.client.1.vm06.stdout:8/238: mknod d1/d2c/c4e 0
2026-03-10T06:22:14.298 INFO:tasks.workunit.client.1.vm06.stdout:4/224: fsync f2 0
2026-03-10T06:22:14.300 INFO:tasks.workunit.client.1.vm06.stdout:6/311: creat d6/f6a x:0 0 0
2026-03-10T06:22:14.302 INFO:tasks.workunit.client.1.vm06.stdout:4/225: dwrite dd/d24/f3d [0,4194304] 0
2026-03-10T06:22:14.303 INFO:tasks.workunit.client.1.vm06.stdout:4/226: write dd/d18/f32 [977565,61512] 0
2026-03-10T06:22:14.303 INFO:tasks.workunit.client.1.vm06.stdout:4/227: stat dd/d24/d2d/f3b 0
2026-03-10T06:22:14.315 INFO:tasks.workunit.client.1.vm06.stdout:1/290: read d9/df/f47 [921134,34037] 0
2026-03-10T06:22:14.322 INFO:tasks.workunit.client.1.vm06.stdout:9/253: dwrite d21/d24/f2e [0,4194304] 0
2026-03-10T06:22:14.324 INFO:tasks.workunit.client.1.vm06.stdout:2/273: truncate da/d13/d1c/d1d/f2a 3796557 0
2026-03-10T06:22:14.324 INFO:tasks.workunit.client.1.vm06.stdout:7/314: unlink d19/c2a 0
2026-03-10T06:22:14.324 INFO:tasks.workunit.client.1.vm06.stdout:5/227: link d8/d9/c21 d8/d20/c4f 0
2026-03-10T06:22:14.324 INFO:tasks.workunit.client.1.vm06.stdout:3/245: unlink d6/d21/f4d 0
2026-03-10T06:22:14.326 INFO:tasks.workunit.client.1.vm06.stdout:2/274: dread da/d13/d1a/d39/f2f [0,4194304] 0
2026-03-10T06:22:14.326 INFO:tasks.workunit.client.1.vm06.stdout:6/312: rename d6/df/d40/c67 to d6/dd/c6b 0
2026-03-10T06:22:14.336 INFO:tasks.workunit.client.1.vm06.stdout:9/254: read f14 [3511341,103490] 0
2026-03-10T06:22:14.337 INFO:tasks.workunit.client.1.vm06.stdout:4/228: creat dd/d33/f3f x:0 0 0
2026-03-10T06:22:14.337 INFO:tasks.workunit.client.1.vm06.stdout:1/291: truncate d9/d1b/f31 934231 0
2026-03-10T06:22:14.338 INFO:tasks.workunit.client.1.vm06.stdout:4/229: stat dd/d18/f1d 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:4/230: chown dd/d24/d2d/d2f/d39/c3e 7 1
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:0/273: creat d0/dd/f67 x:0 0 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:7/315: dread - d19/f23 zero size
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:7/316: chown d19/d3b/d41/d42 244106 1
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:6/313: mknod d6/dd/d2b/c6c 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:1/292: unlink d9/d35/f45 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:9/255: dwrite d21/d32/f37 [0,4194304] 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:9/256: write f20 [884280,122528] 0
2026-03-10T06:22:14.357 INFO:tasks.workunit.client.1.vm06.stdout:0/274: symlink d0/d3c/d42/l68 0
2026-03-10T06:22:14.362 INFO:tasks.workunit.client.1.vm06.stdout:6/314: mknod d6/df/c6d 0
2026-03-10T06:22:14.364 INFO:tasks.workunit.client.1.vm06.stdout:4/231: mkdir dd/d24/d2d/d2f/d34/d40 0
2026-03-10T06:22:14.364 INFO:tasks.workunit.client.1.vm06.stdout:8/239: getdents d1/df/d20/d21 0
2026-03-10T06:22:14.366 INFO:tasks.workunit.client.1.vm06.stdout:1/293: mknod d9/d35/c48 0
2026-03-10T06:22:14.366 INFO:tasks.workunit.client.1.vm06.stdout:1/294: write d9/f34 [868157,126459] 0
2026-03-10T06:22:14.368 INFO:tasks.workunit.client.1.vm06.stdout:4/232: mkdir dd/d41 0
2026-03-10T06:22:14.368 INFO:tasks.workunit.client.1.vm06.stdout:8/240: creat d1/d7/f4f x:0 0 0
2026-03-10T06:22:14.368 INFO:tasks.workunit.client.1.vm06.stdout:5/228: link c1 d8/d20/d46/c50 0
2026-03-10T06:22:14.370 INFO:tasks.workunit.client.1.vm06.stdout:8/241: mkdir d1/d2c/d50 0
2026-03-10T06:22:14.371 INFO:tasks.workunit.client.1.vm06.stdout:5/229: rmdir d8/d9 39
2026-03-10T06:22:14.371 INFO:tasks.workunit.client.1.vm06.stdout:0/275: dread d0/dd/d1b/f4e [0,4194304] 0
2026-03-10T06:22:14.371 INFO:tasks.workunit.client.1.vm06.stdout:5/230: fdatasync f7 0
2026-03-10T06:22:14.372 INFO:tasks.workunit.client.1.vm06.stdout:8/242: chown d1/d7/c2e 32916312 1
2026-03-10T06:22:14.372 INFO:tasks.workunit.client.1.vm06.stdout:5/231: write d8/d20/d22/f4d [890270,111485] 0
2026-03-10T06:22:14.379 INFO:tasks.workunit.client.1.vm06.stdout:9/257: dread d21/f33 [0,4194304] 0
2026-03-10T06:22:14.388 INFO:tasks.workunit.client.1.vm06.stdout:6/315: dread d6/fc [0,4194304] 0
2026-03-10T06:22:14.389 INFO:tasks.workunit.client.1.vm06.stdout:0/276: mknod d0/dd/c69 0
2026-03-10T06:22:14.389 INFO:tasks.workunit.client.1.vm06.stdout:0/277: stat d0/dd/d14/l4f 0
2026-03-10T06:22:14.390 INFO:tasks.workunit.client.1.vm06.stdout:6/316: read f3 [864475,5986] 0
2026-03-10T06:22:14.391 INFO:tasks.workunit.client.1.vm06.stdout:9/258: symlink d21/d27/d50/l59 0
2026-03-10T06:22:14.397 INFO:tasks.workunit.client.1.vm06.stdout:9/259: stat fe 0
2026-03-10T06:22:14.398 INFO:tasks.workunit.client.1.vm06.stdout:0/278: symlink d0/d3c/d42/d5e/l6a 0
2026-03-10T06:22:14.398 INFO:tasks.workunit.client.1.vm06.stdout:0/279: dread - d0/dd/f32 zero size
2026-03-10T06:22:14.401 INFO:tasks.workunit.client.1.vm06.stdout:9/260: symlink d21/d27/l5a 0
2026-03-10T06:22:14.404 INFO:tasks.workunit.client.1.vm06.stdout:9/261: mknod d21/c5b 0
2026-03-10T06:22:14.404 INFO:tasks.workunit.client.1.vm06.stdout:9/262: stat l1e 0
2026-03-10T06:22:14.404 INFO:tasks.workunit.client.1.vm06.stdout:9/263: chown d21/d27/d50/d57 2385440 1
2026-03-10T06:22:14.406 INFO:tasks.workunit.client.1.vm06.stdout:0/280: getdents d0/dd/d14 0
2026-03-10T06:22:14.410 INFO:tasks.workunit.client.1.vm06.stdout:0/281: dwrite d0/d3c/d42/f12 [0,4194304] 0
2026-03-10T06:22:14.413 INFO:tasks.workunit.client.1.vm06.stdout:9/264: creat d21/d32/d4d/d51/f5c x:0 0 0
2026-03-10T06:22:14.418 INFO:tasks.workunit.client.1.vm06.stdout:9/265: fsync d21/d27/d50/d57/f58 0
2026-03-10T06:22:14.418 INFO:tasks.workunit.client.1.vm06.stdout:9/266: chown d21/d27/c38 418556 1
2026-03-10T06:22:14.418 INFO:tasks.workunit.client.1.vm06.stdout:6/317: dread f3 [0,4194304] 0
2026-03-10T06:22:14.418 INFO:tasks.workunit.client.1.vm06.stdout:9/267: mknod d21/d24/c5d 0
2026-03-10T06:22:14.420 INFO:tasks.workunit.client.1.vm06.stdout:6/318: dwrite d6/d7/f36 [0,4194304] 0
2026-03-10T06:22:14.431 INFO:tasks.workunit.client.1.vm06.stdout:6/319: mknod d6/df/c6e 0
2026-03-10T06:22:14.438 INFO:tasks.workunit.client.1.vm06.stdout:6/320: creat d6/df/f6f x:0 0 0
2026-03-10T06:22:14.438 INFO:tasks.workunit.client.1.vm06.stdout:6/321: read d6/d7/d37/f3d [1778795,71941] 0
2026-03-10T06:22:14.439 INFO:tasks.workunit.client.1.vm06.stdout:6/322: stat d6/dd/d25/d2c 0
2026-03-10T06:22:14.440 INFO:tasks.workunit.client.1.vm06.stdout:6/323: mkdir d6/df/d70 0
2026-03-10T06:22:14.444 INFO:tasks.workunit.client.1.vm06.stdout:6/324: dwrite d6/d7/d37/f3d [0,4194304] 0
2026-03-10T06:22:14.504 INFO:tasks.workunit.client.1.vm06.stdout:6/325: sync
2026-03-10T06:22:14.506 INFO:tasks.workunit.client.1.vm06.stdout:6/326: dread d6/dd/d35/f3c [0,4194304] 0
2026-03-10T06:22:14.515 INFO:tasks.workunit.client.1.vm06.stdout:6/327: link d6/df/c6d d6/dd/d35/c71 0
2026-03-10T06:22:14.517 INFO:tasks.workunit.client.1.vm06.stdout:6/328: mknod d6/dd/d25/d4e/c72 0
2026-03-10T06:22:14.526 INFO:tasks.workunit.client.1.vm06.stdout:0/282: getdents d0/d3c/d42 0
2026-03-10T06:22:14.529 INFO:tasks.workunit.client.1.vm06.stdout:0/283: mkdir d0/dd/d14/d6b 0
2026-03-10T06:22:14.530 INFO:tasks.workunit.client.1.vm06.stdout:6/329: dread d6/d7/f16 [0,4194304] 0
2026-03-10T06:22:14.530 INFO:tasks.workunit.client.1.vm06.stdout:3/246: truncate d6/d8/f45 775319 0
2026-03-10T06:22:14.532 INFO:tasks.workunit.client.1.vm06.stdout:2/275: write f7 [1601711,120330] 0
2026-03-10T06:22:14.533 INFO:tasks.workunit.client.1.vm06.stdout:2/276: chown da/d13/d1a/d39/d35 36786033 1
2026-03-10T06:22:14.534 INFO:tasks.workunit.client.1.vm06.stdout:2/277: write da/d13/d1c/d1d/d44/f45 [692656,70362] 0
2026-03-10T06:22:14.539 INFO:tasks.workunit.client.1.vm06.stdout:3/247: dwrite d6/f25 [0,4194304] 0
2026-03-10T06:22:14.544 INFO:tasks.workunit.client.1.vm06.stdout:3/248: chown d6/d8/l14 882363807 1
2026-03-10T06:22:14.548 INFO:tasks.workunit.client.1.vm06.stdout:6/330: dread d6/dd/d25/d2c/f32 [0,4194304] 0
2026-03-10T06:22:14.549 INFO:tasks.workunit.client.1.vm06.stdout:3/249: creat d6/d21/d38/f56 x:0 0 0
2026-03-10T06:22:14.553 INFO:tasks.workunit.client.1.vm06.stdout:7/317: dwrite d19/f23 [0,4194304] 0
2026-03-10T06:22:14.554 INFO:tasks.workunit.client.1.vm06.stdout:3/250: creat d6/dc/d2a/f57 x:0 0 0
2026-03-10T06:22:14.556 INFO:tasks.workunit.client.1.vm06.stdout:0/284: sync
2026-03-10T06:22:14.556 INFO:tasks.workunit.client.1.vm06.stdout:6/331: symlink d6/dd/d25/d33/d4d/l73 0
2026-03-10T06:22:14.561 INFO:tasks.workunit.client.1.vm06.stdout:7/318: dwrite d19/f3f [0,4194304] 0
2026-03-10T06:22:14.565 INFO:tasks.workunit.client.1.vm06.stdout:0/285: dwrite d0/dd/d2d/d35/f3a [0,4194304] 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:3/251: rename d6/d21/f26 to d6/d21/f58 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:3/252: unlink d6/d21/c3a 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:7/319: unlink d19/c36 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:3/253: rename d6/d21/c2e to d6/d21/d38/d39/c59 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:3/254: fsync d6/dc/d13/d35/f3b 0
2026-03-10T06:22:14.595 INFO:tasks.workunit.client.1.vm06.stdout:7/320: rename d19/d3b/d41/d42/d52/f63 to d19/d3b/d41/d42/d52/f64 0
2026-03-10T06:22:14.599 INFO:tasks.workunit.client.1.vm06.stdout:7/321: dwrite d19/d3b/f3c [0,4194304] 0
2026-03-10T06:22:14.630 INFO:tasks.workunit.client.1.vm06.stdout:7/322: creat d19/d3b/d41/f65 x:0 0 0
2026-03-10T06:22:14.637 INFO:tasks.workunit.client.1.vm06.stdout:4/233: unlink dd/d18/l25 0
2026-03-10T06:22:14.637 INFO:tasks.workunit.client.1.vm06.stdout:1/295: write d9/df/f47 [1365152,92095] 0
2026-03-10T06:22:14.637 INFO:tasks.workunit.client.1.vm06.stdout:1/296: chown d9/df/c13 380291839 1
2026-03-10T06:22:14.639 INFO:tasks.workunit.client.1.vm06.stdout:4/234: creat dd/d24/d2d/d2f/f42 x:0 0 0
2026-03-10T06:22:14.643 INFO:tasks.workunit.client.1.vm06.stdout:8/243: truncate d1/df/d20/d21/f37 1515413 0
2026-03-10T06:22:14.643 INFO:tasks.workunit.client.1.vm06.stdout:5/232: dwrite d8/d9/d1e/f37 [0,4194304] 0
2026-03-10T06:22:14.646 INFO:tasks.workunit.client.1.vm06.stdout:9/268: getdents d21/d27 0
2026-03-10T06:22:14.647 INFO:tasks.workunit.client.1.vm06.stdout:8/244: creat d1/df/d20/f51 x:0 0 0
2026-03-10T06:22:14.648 INFO:tasks.workunit.client.1.vm06.stdout:5/233: rename d8/d20/f35 to d8/d20/d22/d39/f51 0
2026-03-10T06:22:14.649 INFO:tasks.workunit.client.1.vm06.stdout:4/235: dread f8 [8388608,4194304] 0
2026-03-10T06:22:14.650 INFO:tasks.workunit.client.1.vm06.stdout:8/245: mknod d1/df/d20/d21/c52 0
2026-03-10T06:22:14.651 INFO:tasks.workunit.client.1.vm06.stdout:5/234: rename d8/d9/d1e/f23 to d8/d20/d22/d39/f52 0
2026-03-10T06:22:14.656 INFO:tasks.workunit.client.1.vm06.stdout:8/246: unlink d1/f2 0
2026-03-10T06:22:14.656 INFO:tasks.workunit.client.1.vm06.stdout:5/235: creat d8/d20/d22/f53 x:0 0 0
2026-03-10T06:22:14.657 INFO:tasks.workunit.client.1.vm06.stdout:4/236: dwrite dd/d24/d2d/f28 [0,4194304] 0
2026-03-10T06:22:14.671 INFO:tasks.workunit.client.1.vm06.stdout:5/236: mkdir d8/db/d54 0
2026-03-10T06:22:14.674 INFO:tasks.workunit.client.1.vm06.stdout:4/237: unlink f6 0
2026-03-10T06:22:14.674 INFO:tasks.workunit.client.1.vm06.stdout:5/237: dwrite d8/d20/d22/f4d [0,4194304] 0
2026-03-10T06:22:14.675 INFO:tasks.workunit.client.1.vm06.stdout:5/238: readlink d8/ld 0
2026-03-10T06:22:14.678 INFO:tasks.workunit.client.1.vm06.stdout:4/238: write fc [3201152,10992] 0
2026-03-10T06:22:14.682 INFO:tasks.workunit.client.1.vm06.stdout:4/239: creat dd/f43 x:0 0 0
2026-03-10T06:22:14.691 INFO:tasks.workunit.client.1.vm06.stdout:4/240: rename dd/d24/d2d/d2f/l31 to dd/d24/d2d/d2f/d39/l44 0
2026-03-10T06:22:14.776 INFO:tasks.workunit.client.1.vm06.stdout:7/323: sync
2026-03-10T06:22:14.779 INFO:tasks.workunit.client.1.vm06.stdout:7/324: dwrite d19/f25 [0,4194304] 0
2026-03-10T06:22:14.781 INFO:tasks.workunit.client.1.vm06.stdout:7/325: fsync d19/f30 0
2026-03-10T06:22:14.784 INFO:tasks.workunit.client.1.vm06.stdout:9/269: sync
2026-03-10T06:22:14.784 INFO:tasks.workunit.client.1.vm06.stdout:1/297: sync
2026-03-10T06:22:14.784 INFO:tasks.workunit.client.1.vm06.stdout:4/241: sync
2026-03-10T06:22:14.787 INFO:tasks.workunit.client.1.vm06.stdout:7/326: dread d19/f1d [0,4194304] 0
2026-03-10T06:22:14.793 INFO:tasks.workunit.client.1.vm06.stdout:1/298: dwrite d9/f2f [0,4194304] 0
2026-03-10T06:22:14.811 INFO:tasks.workunit.client.1.vm06.stdout:1/299: write d9/d1b/d20/f25 [5026557,27356] 0
2026-03-10T06:22:14.811 INFO:tasks.workunit.client.1.vm06.stdout:7/327: dread f9 [0,4194304] 0
2026-03-10T06:22:14.811 INFO:tasks.workunit.client.1.vm06.stdout:1/300: creat d9/df/f49 x:0 0 0
2026-03-10T06:22:14.833 INFO:tasks.workunit.client.1.vm06.stdout:4/242: dread f2 [0,4194304] 0
2026-03-10T06:22:14.932 INFO:tasks.workunit.client.1.vm06.stdout:6/332: rmdir d6/dd/d25/d4e 39
2026-03-10T06:22:14.933 INFO:tasks.workunit.client.1.vm06.stdout:6/333: truncate d6/d7/d37/d43/f59 4608953 0
2026-03-10T06:22:14.933 INFO:tasks.workunit.client.1.vm06.stdout:6/334: dread - d6/d7/f44 zero size
2026-03-10T06:22:14.934 INFO:tasks.workunit.client.1.vm06.stdout:2/278: write da/f28 [5162795,58188] 0
2026-03-10T06:22:14.934 INFO:tasks.workunit.client.1.vm06.stdout:6/335: chown d6/dd/f5b 49848 1
2026-03-10T06:22:14.941 INFO:tasks.workunit.client.1.vm06.stdout:3/255: dwrite d6/d1a/f1f [0,4194304] 0
2026-03-10T06:22:14.942 INFO:tasks.workunit.client.1.vm06.stdout:3/256: truncate d6/f29 1768604 0
2026-03-10T06:22:14.942 INFO:tasks.workunit.client.1.vm06.stdout:0/286: read - d0/dd/d14/f65 zero size
2026-03-10T06:22:14.942 INFO:tasks.workunit.client.1.vm06.stdout:6/336: dwrite d6/d7/f36 [0,4194304] 0
2026-03-10T06:22:14.944 INFO:tasks.workunit.client.1.vm06.stdout:0/287: write d0/dd/f5b [626563,125806] 0
2026-03-10T06:22:14.944 INFO:tasks.workunit.client.1.vm06.stdout:3/257: truncate d6/f53 263514 0
2026-03-10T06:22:14.945 INFO:tasks.workunit.client.1.vm06.stdout:6/337: write f3 [737282,14852] 0
2026-03-10T06:22:14.946 INFO:tasks.workunit.client.1.vm06.stdout:6/338: chown d6/dd/f5b 0 1
2026-03-10T06:22:14.951 INFO:tasks.workunit.client.1.vm06.stdout:0/288: dwrite d0/dd/d1b/d3d/d50/f55 [0,4194304] 0
2026-03-10T06:22:14.957 INFO:tasks.workunit.client.1.vm06.stdout:2/279: rename f5 to da/d13/d1c/f54 0
2026-03-10T06:22:14.957 INFO:tasks.workunit.client.1.vm06.stdout:6/339: mknod d6/d7/c74 0
2026-03-10T06:22:14.957 INFO:tasks.workunit.client.1.vm06.stdout:3/258: rename d6/d21/f2c to d6/dc/d13/d35/f5a 0
2026-03-10T06:22:14.957 INFO:tasks.workunit.client.1.vm06.stdout:3/259: stat c5 0
2026-03-10T06:22:14.957 INFO:tasks.workunit.client.1.vm06.stdout:3/260: stat d6/d8/f52 0
2026-03-10T06:22:14.958 INFO:tasks.workunit.client.1.vm06.stdout:3/261: mkdir d6/d1a/d5b 0
2026-03-10T06:22:14.958 INFO:tasks.workunit.client.1.vm06.stdout:7/328: dread d19/d3b/d41/f48 [0,4194304] 0
2026-03-10T06:22:14.960 INFO:tasks.workunit.client.1.vm06.stdout:3/262: unlink d6/dc/d13/c27 0
2026-03-10T06:22:14.961 INFO:tasks.workunit.client.1.vm06.stdout:7/329: dread d19/f3f [0,4194304] 0
2026-03-10T06:22:14.961 INFO:tasks.workunit.client.1.vm06.stdout:3/263: write d6/d1a/f1f [1838342,56080] 0
2026-03-10T06:22:14.964 INFO:tasks.workunit.client.1.vm06.stdout:3/264: creat d6/f5c x:0 0 0
2026-03-10T06:22:14.965 INFO:tasks.workunit.client.1.vm06.stdout:7/330: rename d19/f23 to d19/d3b/d41/f66 0
2026-03-10T06:22:14.967 INFO:tasks.workunit.client.1.vm06.stdout:3/265: rename d6/f42 to d6/dc/d13/f5d 0
2026-03-10T06:22:14.968 INFO:tasks.workunit.client.1.vm06.stdout:7/331: rename d19/d3b/d41/d4c/l51 to d19/d3b/d41/d4c/l67 0
2026-03-10T06:22:14.968 INFO:tasks.workunit.client.1.vm06.stdout:3/266: creat d6/dc/d13/f5e x:0 0 0
2026-03-10T06:22:14.968 INFO:tasks.workunit.client.1.vm06.stdout:3/267: dread - d6/d21/d38/f56 zero size
2026-03-10T06:22:14.969 INFO:tasks.workunit.client.1.vm06.stdout:7/332: creat d19/d3b/f68 x:0 0 0
2026-03-10T06:22:14.973 INFO:tasks.workunit.client.1.vm06.stdout:7/333: getdents d19/d3b/d41/d42 0
2026-03-10T06:22:14.976 INFO:tasks.workunit.client.1.vm06.stdout:7/334: creat d19/d3b/d5b/f69 x:0 0 0
2026-03-10T06:22:14.977 INFO:tasks.workunit.client.1.vm06.stdout:3/268: dread d6/f1c [0,4194304] 0
2026-03-10T06:22:14.979 INFO:tasks.workunit.client.1.vm06.stdout:3/269: rename d6/d21/c43 to d6/dc/d13/d35/c5f 0
2026-03-10T06:22:14.980 INFO:tasks.workunit.client.1.vm06.stdout:3/270: mknod d6/dc/d13/d51/c60 0
2026-03-10T06:22:14.980 INFO:tasks.workunit.client.1.vm06.stdout:3/271: chown c4 71 1
2026-03-10T06:22:14.984 INFO:tasks.workunit.client.1.vm06.stdout:3/272: dwrite d6/d21/f30 [0,4194304] 0
2026-03-10T06:22:14.986 INFO:tasks.workunit.client.1.vm06.stdout:3/273: write d6/d21/f31 [1777042,77808] 0
2026-03-10T06:22:14.987 INFO:tasks.workunit.client.1.vm06.stdout:3/274: write d6/dc/d13/d35/f4e [84415,10520] 0
2026-03-10T06:22:14.988 INFO:tasks.workunit.client.1.vm06.stdout:3/275: dread - d6/d21/f55 zero size
2026-03-10T06:22:14.990 INFO:tasks.workunit.client.1.vm06.stdout:3/276: stat d6/dc/d13/f1e 0
2026-03-10T06:22:15.046 INFO:tasks.workunit.client.1.vm06.stdout:8/247: dwrite d1/f3c [0,4194304] 0
2026-03-10T06:22:15.052 INFO:tasks.workunit.client.1.vm06.stdout:8/248: fdatasync d1/f4 0
2026-03-10T06:22:15.052 INFO:tasks.workunit.client.1.vm06.stdout:8/249: fsync d1/df/d20/f51 0
2026-03-10T06:22:15.053 INFO:tasks.workunit.client.1.vm06.stdout:8/250: readlink d1/df/d20/d35/l46 0
2026-03-10T06:22:15.055 INFO:tasks.workunit.client.1.vm06.stdout:8/251: creat d1/df/d11/f53 x:0 0 0
2026-03-10T06:22:15.056 INFO:tasks.workunit.client.1.vm06.stdout:8/252: symlink d1/df/d20/l54 0
2026-03-10T06:22:15.057 INFO:tasks.workunit.client.1.vm06.stdout:8/253: truncate d1/d7/f24 5046009 0
2026-03-10T06:22:15.057 INFO:tasks.workunit.client.1.vm06.stdout:8/254: truncate d1/df/d20/f19 4437434 0
2026-03-10T06:22:15.058 INFO:tasks.workunit.client.1.vm06.stdout:8/255: mknod d1/d2c/d50/c55 0
2026-03-10T06:22:15.060 INFO:tasks.workunit.client.1.vm06.stdout:8/256: creat d1/df/f56 x:0 0 0
2026-03-10T06:22:15.060 INFO:tasks.workunit.client.1.vm06.stdout:8/257: chown d1/df/d20/f51 1982270 1
2026-03-10T06:22:15.061 INFO:tasks.workunit.client.1.vm06.stdout:8/258: dread - d1/df/d11/f53 zero size
2026-03-10T06:22:15.061 INFO:tasks.workunit.client.1.vm06.stdout:8/259: write d1/f3c [1008561,38377] 0
2026-03-10T06:22:15.062 INFO:tasks.workunit.client.1.vm06.stdout:8/260: fsync d1/df/d20/fe 0
2026-03-10T06:22:15.064 INFO:tasks.workunit.client.1.vm06.stdout:8/261: dwrite d1/f5 [0,4194304] 0
2026-03-10T06:22:15.066 INFO:tasks.workunit.client.1.vm06.stdout:8/262: rename d1 to d1/d2c/d50/d57 22
2026-03-10T06:22:15.066 INFO:tasks.workunit.client.1.vm06.stdout:8/263: chown d1/d2c/c4e 327 1
2026-03-10T06:22:15.128 INFO:tasks.workunit.client.1.vm06.stdout:1/301: sync
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/302: truncate d9/df/f41 4433122 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/303: dread d9/f34 [0,4194304] 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/304: mknod d9/df/c4a 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/305: symlink d9/d1b/d20/d44/l4b 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/306: dwrite d9/d1b/d20/f43 [0,4194304] 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/307: rmdir d9/d1b 39
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/308: write d9/f34 [631174,56277] 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/309: symlink d9/d1b/d20/d44/l4c 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/310: dwrite d9/f2f [0,4194304] 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/311: dread d9/d1b/d20/f25 [0,4194304] 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/312: creat d9/df/f4d x:0 0 0
2026-03-10T06:22:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/313: chown d9/c1c 362797342 1
2026-03-10T06:22:15.171 INFO:tasks.workunit.client.1.vm06.stdout:1/314: unlink d9/df/f41 0
2026-03-10T06:22:15.171 INFO:tasks.workunit.client.1.vm06.stdout:1/315: write d9/df/f49 [746805,39102] 0
2026-03-10T06:22:15.183 INFO:tasks.workunit.client.1.vm06.stdout:4/243: sync
2026-03-10T06:22:15.184 INFO:tasks.workunit.client.1.vm06.stdout:4/244: chown dd 3 1
2026-03-10T06:22:15.184 INFO:tasks.workunit.client.1.vm06.stdout:0/289: sync
2026-03-10T06:22:15.184 INFO:tasks.workunit.client.1.vm06.stdout:4/245: chown fc 1365391040 1
2026-03-10T06:22:15.184 INFO:tasks.workunit.client.1.vm06.stdout:0/290: write d0/dd/d1c/f5a [749249,75613] 0
2026-03-10T06:22:15.188 INFO:tasks.workunit.client.1.vm06.stdout:1/316: symlink d9/d35/l4e 0
2026-03-10T06:22:15.189 INFO:tasks.workunit.client.1.vm06.stdout:1/317: read d9/d1b/f1d [1974838,109287] 0
2026-03-10T06:22:15.190 INFO:tasks.workunit.client.1.vm06.stdout:1/318: chown d9/d1b/d20/f24 0 1
2026-03-10T06:22:15.191 INFO:tasks.workunit.client.1.vm06.stdout:4/246: getdents dd/d24 0
2026-03-10T06:22:15.192 INFO:tasks.workunit.client.1.vm06.stdout:1/319: creat d9/df/f4f x:0 0 0
2026-03-10T06:22:15.195 INFO:tasks.workunit.client.1.vm06.stdout:4/247: dread dd/d24/d2d/f28 [0,4194304] 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/320: dread d9/f2f [0,4194304] 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/321: creat d9/d35/d46/f50 x:0 0 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/322: readlink d9/d35/l4e 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/323: write d9/d1b/d20/f42 [379874,94680] 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/324: write d9/f17 [305149,48932] 0
2026-03-10T06:22:15.241 INFO:tasks.workunit.client.1.vm06.stdout:1/325: fsync d9/df/f14 0
2026-03-10T06:22:15.248 INFO:tasks.workunit.client.1.vm06.stdout:4/248: dread f0 [0,4194304] 0
2026-03-10T06:22:15.249 INFO:tasks.workunit.client.1.vm06.stdout:4/249: creat dd/d24/f45 x:0 0 0
2026-03-10T06:22:15.250 INFO:tasks.workunit.client.1.vm06.stdout:4/250: chown dd/d24/d2d/c3c 520329337 1
2026-03-10T06:22:15.251 INFO:tasks.workunit.client.1.vm06.stdout:4/251: getdents dd/d24/d2d/d2f/d34 0
2026-03-10T06:22:15.252 INFO:tasks.workunit.client.1.vm06.stdout:4/252: unlink dd/d24/d2d/f35 0
2026-03-10T06:22:15.253 INFO:tasks.workunit.client.1.vm06.stdout:4/253: stat dd/d33/d36 0
2026-03-10T06:22:15.254 INFO:tasks.workunit.client.1.vm06.stdout:4/254: symlink dd/d24/d2d/d2f/d34/d40/l46 0
2026-03-10T06:22:15.255 INFO:tasks.workunit.client.1.vm06.stdout:4/255: mkdir dd/d33/d47 0
2026-03-10T06:22:15.255 INFO:tasks.workunit.client.1.vm06.stdout:4/256: chown dd/d24/c38 8163748 1
2026-03-10T06:22:15.457 INFO:tasks.workunit.client.1.vm06.stdout:9/270: write f14 [2704663,121972]
0 2026-03-10T06:22:15.463 INFO:tasks.workunit.client.1.vm06.stdout:9/271: mknod d21/d27/c5e 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:6/340: truncate d6/d7/f36 3777160 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:2/280: dread f8 [0,4194304] 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:6/341: dread - d6/dd/d25/d4e/f60 zero size 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:9/272: getdents d21/d27/d50/d57 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:9/273: read d21/d46/f4e [2825323,9369] 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:9/274: fdatasync f20 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:2/281: dread da/d13/d1c/f2d [0,4194304] 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:2/282: chown da/l10 3 1 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:2/283: link da/d13/d1a/f21 da/d13/d1c/d1d/f55 0 2026-03-10T06:22:15.488 INFO:tasks.workunit.client.1.vm06.stdout:2/284: mkdir da/d13/d1c/d1d/d44/d48/d56 0 2026-03-10T06:22:15.500 INFO:tasks.workunit.client.1.vm06.stdout:5/239: read d8/d20/d22/d39/f52 [2886578,96186] 0 2026-03-10T06:22:15.507 INFO:tasks.workunit.client.1.vm06.stdout:5/240: unlink d8/d9/d1e/f3e 0 2026-03-10T06:22:15.509 INFO:tasks.workunit.client.1.vm06.stdout:3/277: getdents d6/dc/d13/d51 0 2026-03-10T06:22:15.510 INFO:tasks.workunit.client.1.vm06.stdout:3/278: chown d6/dc/d13/d35/f3b 237694 1 2026-03-10T06:22:15.514 INFO:tasks.workunit.client.1.vm06.stdout:3/279: dwrite d6/d21/d38/f50 [0,4194304] 0 2026-03-10T06:22:15.517 INFO:tasks.workunit.client.1.vm06.stdout:8/264: rename d1/d2c/d50 to d1/df/d58 0 2026-03-10T06:22:15.520 INFO:tasks.workunit.client.1.vm06.stdout:8/265: dread d1/f26 [0,4194304] 0 2026-03-10T06:22:15.531 INFO:tasks.workunit.client.1.vm06.stdout:6/342: rename d6/dd/d25/d33/c48 to d6/dd/d25/d4e/c75 0 2026-03-10T06:22:15.531 
INFO:tasks.workunit.client.1.vm06.stdout:7/335: dread fa [0,4194304] 0 2026-03-10T06:22:15.531 INFO:tasks.workunit.client.1.vm06.stdout:7/336: link d19/c37 d19/d3b/d41/d42/d62/c6a 0 2026-03-10T06:22:15.531 INFO:tasks.workunit.client.1.vm06.stdout:7/337: stat le 0 2026-03-10T06:22:15.532 INFO:tasks.workunit.client.1.vm06.stdout:7/338: write d19/f1d [2602925,652] 0 2026-03-10T06:22:15.533 INFO:tasks.workunit.client.1.vm06.stdout:7/339: stat d19/c38 0 2026-03-10T06:22:15.537 INFO:tasks.workunit.client.1.vm06.stdout:7/340: dwrite d19/d3b/f68 [0,4194304] 0 2026-03-10T06:22:15.550 INFO:tasks.workunit.client.1.vm06.stdout:5/241: dread d8/ff [0,4194304] 0 2026-03-10T06:22:15.557 INFO:tasks.workunit.client.1.vm06.stdout:5/242: chown d8/d9/d1e/f4c 14 1 2026-03-10T06:22:15.557 INFO:tasks.workunit.client.1.vm06.stdout:3/280: sync 2026-03-10T06:22:15.560 INFO:tasks.workunit.client.1.vm06.stdout:3/281: write d6/dc/d13/f5d [814260,94780] 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:5/243: getdents d8/db 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:7/341: dread d19/d3b/f43 [0,4194304] 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:3/282: creat d6/dc/d41/f61 x:0 0 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:3/283: readlink d6/d21/l24 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:3/284: stat d6/dc/d13/l32 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:7/342: write d19/d3b/d41/d4c/f4e [4971116,43642] 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:5/244: getdents d8/d20/d22 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:1/326: rmdir d9/d35/d46 39 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:1/327: dwrite d9/df/f47 [0,4194304] 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:4/257: truncate f0 7371830 0 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:1/328: chown 
d9/df/c4a 381 1 2026-03-10T06:22:15.591 INFO:tasks.workunit.client.1.vm06.stdout:9/275: truncate ff 5305009 0 2026-03-10T06:22:15.593 INFO:tasks.workunit.client.1.vm06.stdout:9/276: getdents d21/d27 0 2026-03-10T06:22:15.594 INFO:tasks.workunit.client.1.vm06.stdout:9/277: write d21/d27/f39 [1059347,107086] 0 2026-03-10T06:22:15.594 INFO:tasks.workunit.client.1.vm06.stdout:9/278: write d21/d27/d50/d57/f58 [821773,97718] 0 2026-03-10T06:22:15.597 INFO:tasks.workunit.client.1.vm06.stdout:9/279: getdents d21/d27/d3a 0 2026-03-10T06:22:15.598 INFO:tasks.workunit.client.1.vm06.stdout:9/280: getdents d21/d32/d4d/d51 0 2026-03-10T06:22:15.609 INFO:tasks.workunit.client.1.vm06.stdout:5/245: dread d8/d9/f14 [4194304,4194304] 0 2026-03-10T06:22:15.610 INFO:tasks.workunit.client.1.vm06.stdout:5/246: mkdir d8/db/d54/d55 0 2026-03-10T06:22:15.611 INFO:tasks.workunit.client.1.vm06.stdout:5/247: chown d8/d9/d1e 172751941 1 2026-03-10T06:22:15.612 INFO:tasks.workunit.client.1.vm06.stdout:5/248: symlink d8/d20/d22/l56 0 2026-03-10T06:22:15.613 INFO:tasks.workunit.client.1.vm06.stdout:5/249: readlink d8/l2b 0 2026-03-10T06:22:15.616 INFO:tasks.workunit.client.1.vm06.stdout:5/250: dwrite d8/db/f40 [0,4194304] 0 2026-03-10T06:22:15.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:15.639 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:15 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:15.643 INFO:tasks.workunit.client.1.vm06.stdout:3/285: sync 2026-03-10T06:22:15.645 INFO:tasks.workunit.client.1.vm06.stdout:2/285: write da/d13/d1a/d39/d35/f4a [490223,85247] 0 2026-03-10T06:22:15.645 INFO:tasks.workunit.client.1.vm06.stdout:3/286: creat d6/d8/f62 x:0 0 0 2026-03-10T06:22:15.647 INFO:tasks.workunit.client.1.vm06.stdout:3/287: dread d6/d8/fb [0,4194304] 0 2026-03-10T06:22:15.649 INFO:tasks.workunit.client.1.vm06.stdout:3/288: creat d6/f63 x:0 0 0 2026-03-10T06:22:15.649 INFO:tasks.workunit.client.1.vm06.stdout:3/289: dread - d6/d8/f49 zero size 2026-03-10T06:22:15.651 INFO:tasks.workunit.client.1.vm06.stdout:3/290: mknod d6/dc/d2a/d54/c64 0 2026-03-10T06:22:15.651 INFO:tasks.workunit.client.1.vm06.stdout:3/291: write d6/d21/f30 [1762224,76399] 0 2026-03-10T06:22:15.653 INFO:tasks.workunit.client.1.vm06.stdout:3/292: readlink d6/d21/l24 0 2026-03-10T06:22:15.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.674 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.674 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.674 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:15.674 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:15.674 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:15 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:15.690 INFO:tasks.workunit.client.1.vm06.stdout:5/251: fdatasync d8/db/f40 0 2026-03-10T06:22:15.702 INFO:tasks.workunit.client.1.vm06.stdout:2/286: dread da/d13/d1a/d39/d35/f4a [0,4194304] 0 2026-03-10T06:22:15.703 INFO:tasks.workunit.client.1.vm06.stdout:2/287: readlink da/d13/l14 0 2026-03-10T06:22:15.704 INFO:tasks.workunit.client.1.vm06.stdout:2/288: creat da/d13/d1c/d1d/d44/d48/f57 x:0 0 0 2026-03-10T06:22:15.707 INFO:tasks.workunit.client.1.vm06.stdout:2/289: getdents da/d13/d1c/d1d/d44/d48/d56 0 2026-03-10T06:22:15.730 INFO:tasks.workunit.client.1.vm06.stdout:0/291: symlink d0/dd/l6c 0 2026-03-10T06:22:15.730 INFO:tasks.workunit.client.1.vm06.stdout:7/343: write d19/d3b/f47 [3169403,92230] 0 2026-03-10T06:22:15.732 INFO:tasks.workunit.client.1.vm06.stdout:7/344: creat d19/d3b/f6b x:0 0 0 2026-03-10T06:22:15.735 INFO:tasks.workunit.client.1.vm06.stdout:7/345: dwrite d19/d3b/f47 [0,4194304] 0 2026-03-10T06:22:15.736 INFO:tasks.workunit.client.1.vm06.stdout:7/346: truncate d19/d3b/f53 789155 0 2026-03-10T06:22:15.740 INFO:tasks.workunit.client.1.vm06.stdout:7/347: dwrite d19/f33 [0,4194304] 0 2026-03-10T06:22:15.741 INFO:tasks.workunit.client.1.vm06.stdout:6/343: mknod d6/c76 0 2026-03-10T06:22:15.741 INFO:tasks.workunit.client.1.vm06.stdout:2/290: sync 2026-03-10T06:22:15.748 INFO:tasks.workunit.client.1.vm06.stdout:9/281: dwrite fe [4194304,4194304] 0 2026-03-10T06:22:15.750 INFO:tasks.workunit.client.1.vm06.stdout:3/293: truncate d6/f29 1313986 0 2026-03-10T06:22:15.751 INFO:tasks.workunit.client.1.vm06.stdout:6/344: dwrite d6/dd/d25/f69 [0,4194304] 0 
2026-03-10T06:22:15.751 INFO:tasks.workunit.client.1.vm06.stdout:3/294: chown d6 2 1 2026-03-10T06:22:15.752 INFO:tasks.workunit.client.1.vm06.stdout:7/348: creat d19/d3b/d41/d42/d62/f6c x:0 0 0 2026-03-10T06:22:15.755 INFO:tasks.workunit.client.1.vm06.stdout:7/349: dread d19/d3b/d41/f48 [0,4194304] 0 2026-03-10T06:22:15.760 INFO:tasks.workunit.client.1.vm06.stdout:9/282: dread d21/f3e [0,4194304] 0 2026-03-10T06:22:15.761 INFO:tasks.workunit.client.1.vm06.stdout:7/350: chown d19/d3b/f6b 17397334 1 2026-03-10T06:22:15.761 INFO:tasks.workunit.client.1.vm06.stdout:7/351: chown d19/d3b/l5e 3 1 2026-03-10T06:22:15.763 INFO:tasks.workunit.client.1.vm06.stdout:2/291: creat da/d13/d1c/d1d/d44/d48/d56/f58 x:0 0 0 2026-03-10T06:22:15.763 INFO:tasks.workunit.client.1.vm06.stdout:9/283: unlink d21/d32/f37 0 2026-03-10T06:22:15.769 INFO:tasks.workunit.client.1.vm06.stdout:6/345: dwrite d6/fc [0,4194304] 0 2026-03-10T06:22:15.771 INFO:tasks.workunit.client.1.vm06.stdout:2/292: getdents da/d13/d1a/d39 0 2026-03-10T06:22:15.772 INFO:tasks.workunit.client.1.vm06.stdout:6/346: unlink d6/dd/d25/f49 0 2026-03-10T06:22:15.772 INFO:tasks.workunit.client.1.vm06.stdout:9/284: link d21/d24/c5d d21/d46/c5f 0 2026-03-10T06:22:15.773 INFO:tasks.workunit.client.1.vm06.stdout:9/285: readlink l1a 0 2026-03-10T06:22:15.773 INFO:tasks.workunit.client.1.vm06.stdout:6/347: chown d6/d7/d37/l64 49560 1 2026-03-10T06:22:15.775 INFO:tasks.workunit.client.1.vm06.stdout:9/286: write d21/d32/d4d/d51/f5c [184349,100398] 0 2026-03-10T06:22:15.777 INFO:tasks.workunit.client.1.vm06.stdout:2/293: creat da/d13/d1c/d1d/d44/d48/f59 x:0 0 0 2026-03-10T06:22:15.783 INFO:tasks.workunit.client.1.vm06.stdout:0/292: dread d0/dd/f24 [4194304,4194304] 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:9/287: dwrite f9 [0,4194304] 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:6/348: dwrite d6/d7/f2a [0,4194304] 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:6/349: 
stat d6/df/l22 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:6/350: creat d6/d7/d37/d43/f77 x:0 0 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:2/294: dwrite da/d13/d1a/f27 [0,4194304] 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:6/351: fdatasync d6/d7/f16 0 2026-03-10T06:22:15.801 INFO:tasks.workunit.client.1.vm06.stdout:2/295: write da/d13/d1c/d1d/d44/d48/f57 [145411,128021] 0 2026-03-10T06:22:15.827 INFO:tasks.workunit.client.1.vm06.stdout:2/296: dread da/f28 [0,4194304] 0 2026-03-10T06:22:15.837 INFO:tasks.workunit.client.1.vm06.stdout:9/288: sync 2026-03-10T06:22:15.839 INFO:tasks.workunit.client.1.vm06.stdout:9/289: unlink f12 0 2026-03-10T06:22:15.839 INFO:tasks.workunit.client.1.vm06.stdout:0/293: read d0/dd/f4c [811913,106711] 0 2026-03-10T06:22:15.840 INFO:tasks.workunit.client.1.vm06.stdout:9/290: mknod d21/d32/d4d/d51/c60 0 2026-03-10T06:22:15.841 INFO:tasks.workunit.client.1.vm06.stdout:0/294: stat d0/dd/d2d/d47/d4d/c58 0 2026-03-10T06:22:15.845 INFO:tasks.workunit.client.1.vm06.stdout:0/295: dwrite d0/f46 [0,4194304] 0 2026-03-10T06:22:15.847 INFO:tasks.workunit.client.1.vm06.stdout:0/296: mknod d0/d3c/c6d 0 2026-03-10T06:22:15.847 INFO:tasks.workunit.client.1.vm06.stdout:0/297: readlink d0/d3c/d42/lc 0 2026-03-10T06:22:15.849 INFO:tasks.workunit.client.1.vm06.stdout:0/298: dread d0/f46 [0,4194304] 0 2026-03-10T06:22:15.851 INFO:tasks.workunit.client.1.vm06.stdout:0/299: symlink d0/dd/d1b/d3d/l6e 0 2026-03-10T06:22:15.854 INFO:tasks.workunit.client.1.vm06.stdout:0/300: symlink d0/dd/d14/d1d/d5d/l6f 0 2026-03-10T06:22:15.858 INFO:tasks.workunit.client.1.vm06.stdout:0/301: dread d0/fa [0,4194304] 0 2026-03-10T06:22:15.863 INFO:tasks.workunit.client.1.vm06.stdout:0/302: dwrite d0/dd/d1c/f5a [0,4194304] 0 2026-03-10T06:22:15.866 INFO:tasks.workunit.client.1.vm06.stdout:0/303: write d0/ff [1413355,27751] 0 2026-03-10T06:22:15.868 INFO:tasks.workunit.client.1.vm06.stdout:0/304: creat 
d0/dd/d14/f70 x:0 0 0 2026-03-10T06:22:15.870 INFO:tasks.workunit.client.1.vm06.stdout:0/305: mknod d0/dd/d1c/c71 0 2026-03-10T06:22:15.871 INFO:tasks.workunit.client.1.vm06.stdout:0/306: chown d0/dd/d2d/d47/d4d/f57 10746733 1 2026-03-10T06:22:15.871 INFO:tasks.workunit.client.1.vm06.stdout:0/307: write d0/ff [5042575,117967] 0 2026-03-10T06:22:15.872 INFO:tasks.workunit.client.1.vm06.stdout:0/308: truncate d0/dd/f5b 1580102 0 2026-03-10T06:22:15.962 INFO:tasks.workunit.client.1.vm06.stdout:8/266: rmdir d1/df/d20 39 2026-03-10T06:22:15.963 INFO:tasks.workunit.client.1.vm06.stdout:8/267: creat d1/df/d11/f59 x:0 0 0 2026-03-10T06:22:15.964 INFO:tasks.workunit.client.1.vm06.stdout:8/268: write f0 [769436,101247] 0 2026-03-10T06:22:15.966 INFO:tasks.workunit.client.1.vm06.stdout:8/269: unlink d1/f3c 0 2026-03-10T06:22:15.967 INFO:tasks.workunit.client.1.vm06.stdout:8/270: mknod d1/d3b/c5a 0 2026-03-10T06:22:15.968 INFO:tasks.workunit.client.1.vm06.stdout:8/271: dread - d1/df/d11/f59 zero size 2026-03-10T06:22:16.011 INFO:tasks.workunit.client.1.vm06.stdout:8/272: read f0 [651300,41165] 0 2026-03-10T06:22:16.011 INFO:tasks.workunit.client.1.vm06.stdout:1/329: rename d9/df/f47 to d9/d1b/f51 0 2026-03-10T06:22:16.019 INFO:tasks.workunit.client.1.vm06.stdout:3/295: rename d6/dc/d41/f61 to d6/d1a/f65 0 2026-03-10T06:22:16.025 INFO:tasks.workunit.client.1.vm06.stdout:3/296: symlink d6/dc/d2a/d54/l66 0 2026-03-10T06:22:16.029 INFO:tasks.workunit.client.1.vm06.stdout:3/297: dwrite d6/dc/d13/f1e [0,4194304] 0 2026-03-10T06:22:16.050 INFO:tasks.workunit.client.1.vm06.stdout:2/297: rmdir da/d13/d1c/d1d/d44/d48/d56 39 2026-03-10T06:22:16.050 INFO:tasks.workunit.client.1.vm06.stdout:2/298: chown f8 11442129 1 2026-03-10T06:22:16.054 INFO:tasks.workunit.client.1.vm06.stdout:7/352: dwrite f4 [4194304,4194304] 0 2026-03-10T06:22:16.064 INFO:tasks.workunit.client.1.vm06.stdout:5/252: truncate d8/d20/d22/d39/f52 2781879 0 2026-03-10T06:22:16.073 
INFO:tasks.workunit.client.1.vm06.stdout:2/299: mknod da/d13/d1c/d1d/d44/c5a 0 2026-03-10T06:22:16.074 INFO:tasks.workunit.client.1.vm06.stdout:1/330: getdents d9/d1b/d20/d44 0 2026-03-10T06:22:16.084 INFO:tasks.workunit.client.1.vm06.stdout:7/353: rmdir d19/d3b/d5b 39 2026-03-10T06:22:16.088 INFO:tasks.workunit.client.1.vm06.stdout:8/273: getdents d1/df/d20 0 2026-03-10T06:22:16.091 INFO:tasks.workunit.client.1.vm06.stdout:6/352: truncate d6/d7/f2a 2465438 0 2026-03-10T06:22:16.093 INFO:tasks.workunit.client.1.vm06.stdout:7/354: write d19/f25 [843278,103263] 0 2026-03-10T06:22:16.094 INFO:tasks.workunit.client.1.vm06.stdout:6/353: chown d6/df/d40/l4f 2676822 1 2026-03-10T06:22:16.100 INFO:tasks.workunit.client.1.vm06.stdout:8/274: mkdir d1/d2c/d5b 0 2026-03-10T06:22:16.101 INFO:tasks.workunit.client.1.vm06.stdout:8/275: readlink d1/df/l4d 0 2026-03-10T06:22:16.103 INFO:tasks.workunit.client.1.vm06.stdout:8/276: chown d1/df/d20/d21/l2d 1412 1 2026-03-10T06:22:16.104 INFO:tasks.workunit.client.1.vm06.stdout:4/258: truncate f0 213901 0 2026-03-10T06:22:16.105 INFO:tasks.workunit.client.1.vm06.stdout:6/354: dwrite d6/dd/d35/f2d [4194304,4194304] 0 2026-03-10T06:22:16.107 INFO:tasks.workunit.client.1.vm06.stdout:0/309: dwrite d0/dd/f10 [0,4194304] 0 2026-03-10T06:22:16.122 INFO:tasks.workunit.client.1.vm06.stdout:6/355: dwrite d6/dd/f5b [0,4194304] 0 2026-03-10T06:22:16.127 INFO:tasks.workunit.client.1.vm06.stdout:1/331: getdents d9/d35/d46/d38 0 2026-03-10T06:22:16.131 INFO:tasks.workunit.client.1.vm06.stdout:1/332: dwrite d9/d1b/d20/f30 [0,4194304] 0 2026-03-10T06:22:16.133 INFO:tasks.workunit.client.1.vm06.stdout:8/277: mkdir d1/d3b/d5c 0 2026-03-10T06:22:16.137 INFO:tasks.workunit.client.1.vm06.stdout:2/300: rename da/d13/d1c/f29 to da/d13/f5b 0 2026-03-10T06:22:16.144 INFO:tasks.workunit.client.1.vm06.stdout:1/333: dread d9/d1b/f51 [0,4194304] 0 2026-03-10T06:22:16.155 INFO:tasks.workunit.client.1.vm06.stdout:3/298: truncate d6/f25 3683715 0 
2026-03-10T06:22:16.155 INFO:tasks.workunit.client.1.vm06.stdout:3/299: fdatasync d6/d1a/f1f 0 2026-03-10T06:22:16.156 INFO:tasks.workunit.client.1.vm06.stdout:0/310: creat d0/dd/d1b/f72 x:0 0 0 2026-03-10T06:22:16.159 INFO:tasks.workunit.client.1.vm06.stdout:9/291: dwrite ff [0,4194304] 0 2026-03-10T06:22:16.165 INFO:tasks.workunit.client.1.vm06.stdout:8/278: symlink d1/d7/l5d 0 2026-03-10T06:22:16.169 INFO:tasks.workunit.client.1.vm06.stdout:1/334: symlink d9/d1b/d20/d44/l52 0 2026-03-10T06:22:16.173 INFO:tasks.workunit.client.1.vm06.stdout:0/311: unlink d0/dd/d1b/d3d/d50/f55 0 2026-03-10T06:22:16.173 INFO:tasks.workunit.client.1.vm06.stdout:3/300: chown d6/dc/d13/d35/c5f 6559398 1 2026-03-10T06:22:16.176 INFO:tasks.workunit.client.1.vm06.stdout:6/356: mkdir d6/dd/d25/d33/d5a/d78 0 2026-03-10T06:22:16.177 INFO:tasks.workunit.client.1.vm06.stdout:3/301: rmdir d6/d1a 39 2026-03-10T06:22:16.180 INFO:tasks.workunit.client.1.vm06.stdout:3/302: mknod d6/dc/d13/d35/c67 0 2026-03-10T06:22:16.180 INFO:tasks.workunit.client.1.vm06.stdout:2/301: getdents da/d13/d1a 0 2026-03-10T06:22:16.185 INFO:tasks.workunit.client.1.vm06.stdout:1/335: creat d9/d35/f53 x:0 0 0 2026-03-10T06:22:16.193 INFO:tasks.workunit.client.1.vm06.stdout:0/312: dread d0/d3c/d42/f12 [0,4194304] 0 2026-03-10T06:22:16.193 INFO:tasks.workunit.client.1.vm06.stdout:1/336: write d9/d1b/f31 [390559,107335] 0 2026-03-10T06:22:16.194 INFO:tasks.workunit.client.1.vm06.stdout:0/313: chown d0/d3c/d42/f5c 151376532 1 2026-03-10T06:22:16.194 INFO:tasks.workunit.client.1.vm06.stdout:8/279: getdents d1/df 0 2026-03-10T06:22:16.194 INFO:tasks.workunit.client.1.vm06.stdout:2/302: symlink da/d13/d1a/l5c 0 2026-03-10T06:22:16.194 INFO:tasks.workunit.client.1.vm06.stdout:1/337: dwrite d9/d1b/d20/f42 [0,4194304] 0 2026-03-10T06:22:16.197 INFO:tasks.workunit.client.1.vm06.stdout:8/280: mkdir d1/df/d20/d21/d5e 0 2026-03-10T06:22:16.204 INFO:tasks.workunit.client.1.vm06.stdout:2/303: chown da/fe 5462 1 2026-03-10T06:22:16.207 
INFO:tasks.workunit.client.1.vm06.stdout:8/281: rename d1/d7/l5d to d1/df/d20/d21/l5f 0 2026-03-10T06:22:16.210 INFO:tasks.workunit.client.1.vm06.stdout:8/282: write d1/d7/f24 [4436953,82631] 0 2026-03-10T06:22:16.210 INFO:tasks.workunit.client.1.vm06.stdout:2/304: link da/d13/d1a/l34 da/d13/d1c/d1d/l5d 0 2026-03-10T06:22:16.211 INFO:tasks.workunit.client.1.vm06.stdout:6/357: sync 2026-03-10T06:22:16.212 INFO:tasks.workunit.client.1.vm06.stdout:6/358: fsync f3 0 2026-03-10T06:22:16.217 INFO:tasks.workunit.client.1.vm06.stdout:5/253: truncate d8/d20/d22/d39/f3d 49354 0 2026-03-10T06:22:16.218 INFO:tasks.workunit.client.1.vm06.stdout:2/305: dwrite da/d13/d1c/f41 [0,4194304] 0 2026-03-10T06:22:16.226 INFO:tasks.workunit.client.1.vm06.stdout:5/254: mkdir d8/db/d57 0 2026-03-10T06:22:16.228 INFO:tasks.workunit.client.1.vm06.stdout:5/255: dread - d8/d9/f47 zero size 2026-03-10T06:22:16.230 INFO:tasks.workunit.client.1.vm06.stdout:6/359: mkdir d6/d79 0 2026-03-10T06:22:16.232 INFO:tasks.workunit.client.1.vm06.stdout:0/314: dread d0/dd/d14/d18/f22 [0,4194304] 0 2026-03-10T06:22:16.235 INFO:tasks.workunit.client.1.vm06.stdout:2/306: fsync da/fe 0 2026-03-10T06:22:16.246 INFO:tasks.workunit.client.1.vm06.stdout:6/360: rename d6/d7/l8 to d6/dd/d2b/l7a 0 2026-03-10T06:22:16.250 INFO:tasks.workunit.client.1.vm06.stdout:0/315: chown d0/dd/d14/f31 598 1 2026-03-10T06:22:16.251 INFO:tasks.workunit.client.1.vm06.stdout:8/283: dread d1/f4 [0,4194304] 0 2026-03-10T06:22:16.252 INFO:tasks.workunit.client.1.vm06.stdout:0/316: fsync d0/d3c/d42/f54 0 2026-03-10T06:22:16.254 INFO:tasks.workunit.client.1.vm06.stdout:0/317: truncate d0/d3c/d42/f60 379802 0 2026-03-10T06:22:16.256 INFO:tasks.workunit.client.1.vm06.stdout:2/307: unlink da/d13/d1a/l36 0 2026-03-10T06:22:16.256 INFO:tasks.workunit.client.1.vm06.stdout:7/355: truncate d19/d3b/d41/f66 1080746 0 2026-03-10T06:22:16.258 INFO:tasks.workunit.client.1.vm06.stdout:4/259: write f8 [7956113,107657] 0 2026-03-10T06:22:16.264 
INFO:tasks.workunit.client.1.vm06.stdout:4/260: chown dd/d24/d2d/c27 2 1 2026-03-10T06:22:16.267 INFO:tasks.workunit.client.1.vm06.stdout:7/356: dwrite d19/f1d [0,4194304] 0 2026-03-10T06:22:16.270 INFO:tasks.workunit.client.1.vm06.stdout:1/338: rmdir d9/d35 39 2026-03-10T06:22:16.270 INFO:tasks.workunit.client.1.vm06.stdout:9/292: dwrite fd [0,4194304] 0 2026-03-10T06:22:16.274 INFO:tasks.workunit.client.1.vm06.stdout:6/361: symlink d6/d7/d37/l7b 0 2026-03-10T06:22:16.275 INFO:tasks.workunit.client.1.vm06.stdout:1/339: chown d9/d1b/d20/d44/l4b 172094 1 2026-03-10T06:22:16.275 INFO:tasks.workunit.client.1.vm06.stdout:3/303: truncate d6/dc/d13/d35/f4e 75118 0 2026-03-10T06:22:16.276 INFO:tasks.workunit.client.1.vm06.stdout:3/304: fsync d6/dc/f1d 0 2026-03-10T06:22:16.276 INFO:tasks.workunit.client.1.vm06.stdout:3/305: readlink d6/d21/d38/l3e 0 2026-03-10T06:22:16.281 INFO:tasks.workunit.client.1.vm06.stdout:9/293: symlink d21/d32/l61 0 2026-03-10T06:22:16.281 INFO:tasks.workunit.client.1.vm06.stdout:7/357: creat d19/d3b/d41/d42/f6d x:0 0 0 2026-03-10T06:22:16.282 INFO:tasks.workunit.client.1.vm06.stdout:1/340: dread d9/d1b/d20/f25 [0,4194304] 0 2026-03-10T06:22:16.283 INFO:tasks.workunit.client.1.vm06.stdout:0/318: dwrite d0/d3c/d42/f12 [0,4194304] 0 2026-03-10T06:22:16.283 INFO:tasks.workunit.client.1.vm06.stdout:8/284: mknod d1/c60 0 2026-03-10T06:22:16.287 INFO:tasks.workunit.client.1.vm06.stdout:6/362: fsync d6/dd/d35/f3c 0 2026-03-10T06:22:16.292 INFO:tasks.workunit.client.1.vm06.stdout:5/256: dwrite d8/d9/f4b [0,4194304] 0 2026-03-10T06:22:16.293 INFO:tasks.workunit.client.1.vm06.stdout:5/257: write f7 [35028,20301] 0 2026-03-10T06:22:16.296 INFO:tasks.workunit.client.1.vm06.stdout:0/319: dwrite d0/dd/f49 [0,4194304] 0 2026-03-10T06:22:16.298 INFO:tasks.workunit.client.1.vm06.stdout:0/320: fdatasync d0/dd/d2d/d35/f3a 0 2026-03-10T06:22:16.313 INFO:tasks.workunit.client.1.vm06.stdout:2/308: mkdir da/d13/d5e 0 2026-03-10T06:22:16.314 
INFO:tasks.workunit.client.1.vm06.stdout:5/258: dwrite d8/d20/d22/d39/f44 [0,4194304] 0 2026-03-10T06:22:16.320 INFO:tasks.workunit.client.1.vm06.stdout:0/321: dread d0/dd/d1c/f5a [0,4194304] 0 2026-03-10T06:22:16.320 INFO:tasks.workunit.client.1.vm06.stdout:3/306: dwrite d6/d1a/f1f [4194304,4194304] 0 2026-03-10T06:22:16.325 INFO:tasks.workunit.client.1.vm06.stdout:3/307: readlink d6/dc/d13/l3c 0 2026-03-10T06:22:16.326 INFO:tasks.workunit.client.1.vm06.stdout:7/358: creat d19/d3b/d41/d4c/f6e x:0 0 0 2026-03-10T06:22:16.329 INFO:tasks.workunit.client.1.vm06.stdout:9/294: fdatasync d21/d46/f4e 0 2026-03-10T06:22:16.345 INFO:tasks.workunit.client.1.vm06.stdout:1/341: rmdir d9/d1b/d20/d44 39 2026-03-10T06:22:16.345 INFO:tasks.workunit.client.1.vm06.stdout:6/363: rmdir d6/df 39 2026-03-10T06:22:16.346 INFO:tasks.workunit.client.1.vm06.stdout:5/259: mknod d8/db/c58 0 2026-03-10T06:22:16.347 INFO:tasks.workunit.client.1.vm06.stdout:7/359: symlink d19/d3b/d41/d42/d62/l6f 0 2026-03-10T06:22:16.348 INFO:tasks.workunit.client.1.vm06.stdout:7/360: chown d19/d3b/c40 48 1 2026-03-10T06:22:16.358 INFO:tasks.workunit.client.1.vm06.stdout:0/322: mkdir d0/dd/d14/d1d/d73 0 2026-03-10T06:22:16.359 INFO:tasks.workunit.client.1.vm06.stdout:3/308: symlink d6/d1a/d5b/l68 0 2026-03-10T06:22:16.360 INFO:tasks.workunit.client.1.vm06.stdout:0/323: write d0/dd/d2d/d47/d4d/f57 [652250,1899] 0 2026-03-10T06:22:16.361 INFO:tasks.workunit.client.1.vm06.stdout:1/342: fdatasync d9/f1f 0 2026-03-10T06:22:16.361 INFO:tasks.workunit.client.1.vm06.stdout:0/324: write d0/d3c/d42/f41 [1006133,22200] 0 2026-03-10T06:22:16.369 INFO:tasks.workunit.client.1.vm06.stdout:6/364: mknod d6/d79/c7c 0 2026-03-10T06:22:16.374 INFO:tasks.workunit.client.1.vm06.stdout:1/343: dread d9/d1b/d20/f43 [0,4194304] 0 2026-03-10T06:22:16.378 INFO:tasks.workunit.client.1.vm06.stdout:0/325: dwrite d0/f61 [0,4194304] 0 2026-03-10T06:22:16.378 INFO:tasks.workunit.client.1.vm06.stdout:7/361: mknod d19/d3b/d41/c70 0 
2026-03-10T06:22:16.379 INFO:tasks.workunit.client.1.vm06.stdout:8/285: getdents d1/df/d58 0 2026-03-10T06:22:16.379 INFO:tasks.workunit.client.1.vm06.stdout:9/295: link d21/d24/l28 d21/d46/l62 0 2026-03-10T06:22:16.379 INFO:tasks.workunit.client.1.vm06.stdout:9/296: fdatasync f11 0 2026-03-10T06:22:16.380 INFO:tasks.workunit.client.1.vm06.stdout:8/286: chown d1/d7/f24 509004 1 2026-03-10T06:22:16.381 INFO:tasks.workunit.client.1.vm06.stdout:0/326: mkdir d0/dd/d2d/d35/d74 0 2026-03-10T06:22:16.383 INFO:tasks.workunit.client.1.vm06.stdout:7/362: write d19/d3b/d41/f48 [2577544,27405] 0 2026-03-10T06:22:16.389 INFO:tasks.workunit.client.1.vm06.stdout:3/309: creat d6/dc/f69 x:0 0 0 2026-03-10T06:22:16.390 INFO:tasks.workunit.client.1.vm06.stdout:1/344: creat d9/d1b/d20/d44/f54 x:0 0 0 2026-03-10T06:22:16.390 INFO:tasks.workunit.client.1.vm06.stdout:0/327: mknod d0/dd/d1b/d3d/d50/c75 0 2026-03-10T06:22:16.392 INFO:tasks.workunit.client.1.vm06.stdout:9/297: dread fb [0,4194304] 0 2026-03-10T06:22:16.393 INFO:tasks.workunit.client.1.vm06.stdout:0/328: chown d0/d3c/d42/l25 13947 1 2026-03-10T06:22:16.393 INFO:tasks.workunit.client.1.vm06.stdout:1/345: creat d9/d1b/d20/f55 x:0 0 0 2026-03-10T06:22:16.394 INFO:tasks.workunit.client.1.vm06.stdout:9/298: mknod d21/d27/d50/c63 0 2026-03-10T06:22:16.395 INFO:tasks.workunit.client.1.vm06.stdout:3/310: dwrite d6/d1a/f1f [4194304,4194304] 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:9/299: readlink d21/l43 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:9/300: rename d21/d27/f31 to d21/d32/d4d/f64 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:3/311: truncate d6/f1c 2313836 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:1/346: creat d9/d35/f56 x:0 0 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:9/301: creat d21/d27/f65 x:0 0 0 2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:9/302: chown d21/d32 111 1 
2026-03-10T06:22:16.407 INFO:tasks.workunit.client.1.vm06.stdout:3/312: rename d6/d8/l23 to d6/d4f/l6a 0 2026-03-10T06:22:16.408 INFO:tasks.workunit.client.1.vm06.stdout:1/347: readlink d9/d1b/l23 0 2026-03-10T06:22:16.408 INFO:tasks.workunit.client.1.vm06.stdout:9/303: symlink d21/d32/d4d/l66 0 2026-03-10T06:22:16.412 INFO:tasks.workunit.client.1.vm06.stdout:3/313: rename d6/dc/d2a to d6/dc/d13/d35/d6b 0 2026-03-10T06:22:16.414 INFO:tasks.workunit.client.1.vm06.stdout:9/304: mkdir d21/d32/d4d/d51/d67 0 2026-03-10T06:22:16.425 INFO:tasks.workunit.client.1.vm06.stdout:3/314: creat d6/d21/d38/f6c x:0 0 0 2026-03-10T06:22:16.431 INFO:tasks.workunit.client.1.vm06.stdout:1/348: link d9/d1b/d20/f24 d9/d35/f57 0 2026-03-10T06:22:16.436 INFO:tasks.workunit.client.1.vm06.stdout:3/315: mkdir d6/dc/d41/d6d 0 2026-03-10T06:22:16.441 INFO:tasks.workunit.client.1.vm06.stdout:9/305: link d21/d27/c48 d21/d32/d4d/d51/c68 0 2026-03-10T06:22:16.442 INFO:tasks.workunit.client.1.vm06.stdout:1/349: creat d9/f58 x:0 0 0 2026-03-10T06:22:16.445 INFO:tasks.workunit.client.1.vm06.stdout:1/350: mknod d9/d35/c59 0 2026-03-10T06:22:16.446 INFO:tasks.workunit.client.1.vm06.stdout:1/351: chown d9/d1b/l23 5 1 2026-03-10T06:22:16.446 INFO:tasks.workunit.client.1.vm06.stdout:1/352: truncate d9/f58 148626 0 2026-03-10T06:22:16.448 INFO:tasks.workunit.client.1.vm06.stdout:9/306: rename d21/d24/c5d to d21/d32/d4d/d51/d67/c69 0 2026-03-10T06:22:16.448 INFO:tasks.workunit.client.1.vm06.stdout:9/307: chown l1e 54865 1 2026-03-10T06:22:16.450 INFO:tasks.workunit.client.1.vm06.stdout:1/353: symlink d9/d1b/d20/d44/l5a 0 2026-03-10T06:22:16.450 INFO:tasks.workunit.client.1.vm06.stdout:1/354: stat d9/df/f4d 0 2026-03-10T06:22:16.453 INFO:tasks.workunit.client.1.vm06.stdout:1/355: symlink d9/d35/d46/l5b 0 2026-03-10T06:22:16.456 INFO:tasks.workunit.client.1.vm06.stdout:6/365: sync 2026-03-10T06:22:16.456 INFO:tasks.workunit.client.1.vm06.stdout:7/363: sync 2026-03-10T06:22:16.457 
INFO:tasks.workunit.client.1.vm06.stdout:6/366: chown d6/dd/d25/d2c 55 1 2026-03-10T06:22:16.458 INFO:tasks.workunit.client.1.vm06.stdout:6/367: read d6/dd/f5b [2579256,13470] 0 2026-03-10T06:22:16.463 INFO:tasks.workunit.client.1.vm06.stdout:7/364: readlink d19/l31 0 2026-03-10T06:22:16.463 INFO:tasks.workunit.client.1.vm06.stdout:7/365: stat d19/d3b/d41/f49 0 2026-03-10T06:22:16.466 INFO:tasks.workunit.client.1.vm06.stdout:6/368: dwrite d6/d7/f20 [4194304,4194304] 0 2026-03-10T06:22:16.478 INFO:tasks.workunit.client.1.vm06.stdout:7/366: symlink d19/l71 0 2026-03-10T06:22:16.479 INFO:tasks.workunit.client.1.vm06.stdout:6/369: creat d6/d7/d37/f7d x:0 0 0 2026-03-10T06:22:16.482 INFO:tasks.workunit.client.1.vm06.stdout:6/370: dread d6/dd/f18 [0,4194304] 0 2026-03-10T06:22:16.488 INFO:tasks.workunit.client.1.vm06.stdout:7/367: dwrite d19/d3b/d41/d42/d62/f6c [0,4194304] 0 2026-03-10T06:22:16.494 INFO:tasks.workunit.client.1.vm06.stdout:7/368: chown d19/d3b/d41/f49 1309789 1 2026-03-10T06:22:16.500 INFO:tasks.workunit.client.1.vm06.stdout:9/308: write d21/d46/f4e [2413827,112571] 0 2026-03-10T06:22:16.500 INFO:tasks.workunit.client.1.vm06.stdout:4/261: write dd/d24/d2d/f21 [1063108,80098] 0 2026-03-10T06:22:16.500 INFO:tasks.workunit.client.1.vm06.stdout:9/309: write d21/d27/d50/d57/f58 [1589268,21862] 0 2026-03-10T06:22:16.506 INFO:tasks.workunit.client.1.vm06.stdout:4/262: dwrite dd/d24/f3d [4194304,4194304] 0 2026-03-10T06:22:16.509 INFO:tasks.workunit.client.1.vm06.stdout:0/329: read d0/dd/d2d/d47/d4d/f57 [641774,123095] 0 2026-03-10T06:22:16.513 INFO:tasks.workunit.client.1.vm06.stdout:2/309: write da/fe [2795041,128885] 0 2026-03-10T06:22:16.516 INFO:tasks.workunit.client.1.vm06.stdout:6/371: dwrite d6/dd/f5b [0,4194304] 0 2026-03-10T06:22:16.520 INFO:tasks.workunit.client.1.vm06.stdout:2/310: readlink da/d13/d1c/d1d/l25 0 2026-03-10T06:22:16.530 INFO:tasks.workunit.client.1.vm06.stdout:0/330: dwrite d0/dd/d2d/d35/f3a [0,4194304] 0 2026-03-10T06:22:16.536 
INFO:tasks.workunit.client.1.vm06.stdout:5/260: dwrite d8/d20/d22/f31 [0,4194304] 0 2026-03-10T06:22:16.541 INFO:tasks.workunit.client.1.vm06.stdout:4/263: rename dd/d24/d2d/c3a to dd/d24/d2d/d2f/d34/c48 0 2026-03-10T06:22:16.542 INFO:tasks.workunit.client.1.vm06.stdout:6/372: creat d6/dd/d35/f7e x:0 0 0 2026-03-10T06:22:16.542 INFO:tasks.workunit.client.1.vm06.stdout:7/369: mkdir d19/d3b/d41/d72 0 2026-03-10T06:22:16.543 INFO:tasks.workunit.client.1.vm06.stdout:7/370: truncate d19/f20 5035027 0 2026-03-10T06:22:16.548 INFO:tasks.workunit.client.1.vm06.stdout:4/264: rename dd/d24/d2d/f21 to dd/d24/d2d/d2f/d34/f49 0 2026-03-10T06:22:16.551 INFO:tasks.workunit.client.1.vm06.stdout:0/331: dwrite d0/f9 [0,4194304] 0 2026-03-10T06:22:16.561 INFO:tasks.workunit.client.1.vm06.stdout:5/261: dwrite d8/d20/d22/f4d [0,4194304] 0 2026-03-10T06:22:16.562 INFO:tasks.workunit.client.1.vm06.stdout:4/265: creat dd/d24/d2d/d2f/d39/f4a x:0 0 0 2026-03-10T06:22:16.565 INFO:tasks.workunit.client.1.vm06.stdout:0/332: dwrite d0/dd/d1b/f2f [0,4194304] 0 2026-03-10T06:22:16.572 INFO:tasks.workunit.client.1.vm06.stdout:7/371: rename d19/c37 to d19/d3b/d41/d42/c73 0 2026-03-10T06:22:16.574 INFO:tasks.workunit.client.1.vm06.stdout:5/262: mknod d8/c59 0 2026-03-10T06:22:16.578 INFO:tasks.workunit.client.1.vm06.stdout:5/263: chown d8/d20/d22/f53 10 1 2026-03-10T06:22:16.588 INFO:tasks.workunit.client.1.vm06.stdout:7/372: dwrite d19/d3b/d41/d42/d52/f64 [0,4194304] 0 2026-03-10T06:22:16.589 INFO:tasks.workunit.client.1.vm06.stdout:2/311: sync 2026-03-10T06:22:16.590 INFO:tasks.workunit.client.1.vm06.stdout:0/333: dread d0/f5 [0,4194304] 0 2026-03-10T06:22:16.595 INFO:tasks.workunit.client.1.vm06.stdout:7/373: creat d19/d3b/d41/d42/f74 x:0 0 0 2026-03-10T06:22:16.624 INFO:tasks.workunit.client.1.vm06.stdout:0/334: dread d0/dd/f5b [0,4194304] 0 2026-03-10T06:22:16.630 INFO:tasks.workunit.client.1.vm06.stdout:2/312: dread da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:16.631 
INFO:tasks.workunit.client.1.vm06.stdout:0/335: truncate d0/dd/f44 2925297 0 2026-03-10T06:22:16.636 INFO:tasks.workunit.client.1.vm06.stdout:4/266: sync 2026-03-10T06:22:16.636 INFO:tasks.workunit.client.1.vm06.stdout:6/373: sync 2026-03-10T06:22:16.637 INFO:tasks.workunit.client.1.vm06.stdout:7/374: sync 2026-03-10T06:22:16.638 INFO:tasks.workunit.client.1.vm06.stdout:6/374: truncate d6/dd/d35/f7e 602771 0 2026-03-10T06:22:16.640 INFO:tasks.workunit.client.1.vm06.stdout:7/375: truncate d19/d3b/d41/f49 5223867 0 2026-03-10T06:22:16.640 INFO:tasks.workunit.client.1.vm06.stdout:7/376: readlink l8 0 2026-03-10T06:22:16.641 INFO:tasks.workunit.client.1.vm06.stdout:7/377: write d19/d3b/d41/f54 [3261066,31997] 0 2026-03-10T06:22:16.644 INFO:tasks.workunit.client.1.vm06.stdout:4/267: fsync dd/ff 0 2026-03-10T06:22:16.649 INFO:tasks.workunit.client.1.vm06.stdout:0/336: dwrite d0/dd/f24 [4194304,4194304] 0 2026-03-10T06:22:16.656 INFO:tasks.workunit.client.1.vm06.stdout:6/375: dwrite d6/dd/f18 [0,4194304] 0 2026-03-10T06:22:16.669 INFO:tasks.workunit.client.1.vm06.stdout:6/376: dread - d6/d7/d37/d43/f77 zero size 2026-03-10T06:22:16.669 INFO:tasks.workunit.client.1.vm06.stdout:8/287: write d1/f26 [799157,115847] 0 2026-03-10T06:22:16.669 INFO:tasks.workunit.client.1.vm06.stdout:8/288: fsync d1/df/d11/f59 0 2026-03-10T06:22:16.669 INFO:tasks.workunit.client.1.vm06.stdout:3/316: chown d6/d4f/l6a 14109 1 2026-03-10T06:22:16.669 INFO:tasks.workunit.client.1.vm06.stdout:0/337: dwrite d0/f46 [0,4194304] 0 2026-03-10T06:22:16.670 INFO:tasks.workunit.client.1.vm06.stdout:0/338: stat d0/d3c/d42/l25 0 2026-03-10T06:22:16.670 INFO:tasks.workunit.client.1.vm06.stdout:2/313: getdents da/d13/d1c/d1d/d44 0 2026-03-10T06:22:16.672 INFO:tasks.workunit.client.1.vm06.stdout:2/314: write da/d13/d1a/f27 [2874321,66021] 0 2026-03-10T06:22:16.685 INFO:tasks.workunit.client.1.vm06.stdout:9/310: getdents d21/d32/d4d/d51/d67 0 2026-03-10T06:22:16.687 INFO:tasks.workunit.client.1.vm06.stdout:8/289: 
unlink d1/df/d11/c3e 0 2026-03-10T06:22:16.687 INFO:tasks.workunit.client.1.vm06.stdout:8/290: chown d1/f3a 356421088 1 2026-03-10T06:22:16.689 INFO:tasks.workunit.client.1.vm06.stdout:3/317: write d6/d21/f58 [183885,67361] 0 2026-03-10T06:22:16.695 INFO:tasks.workunit.client.1.vm06.stdout:1/356: truncate d9/f1a 7894322 0 2026-03-10T06:22:16.697 INFO:tasks.workunit.client.1.vm06.stdout:0/339: creat d0/d3c/d42/d5e/f76 x:0 0 0 2026-03-10T06:22:16.697 INFO:tasks.workunit.client.1.vm06.stdout:3/318: dread - d6/d8/f49 zero size 2026-03-10T06:22:16.697 INFO:tasks.workunit.client.1.vm06.stdout:9/311: creat d21/d32/d4d/d51/d67/f6a x:0 0 0 2026-03-10T06:22:16.698 INFO:tasks.workunit.client.1.vm06.stdout:2/315: symlink da/d13/d51/l5f 0 2026-03-10T06:22:16.701 INFO:tasks.workunit.client.1.vm06.stdout:6/377: dwrite d6/df/f1e [0,4194304] 0 2026-03-10T06:22:16.702 INFO:tasks.workunit.client.1.vm06.stdout:3/319: getdents d6/dc/d41/d6d 0 2026-03-10T06:22:16.702 INFO:tasks.workunit.client.1.vm06.stdout:9/312: write fe [4141031,40149] 0 2026-03-10T06:22:16.705 INFO:tasks.workunit.client.1.vm06.stdout:2/316: write da/fe [4906417,82090] 0 2026-03-10T06:22:16.706 INFO:tasks.workunit.client.1.vm06.stdout:9/313: write f20 [4399108,2673] 0 2026-03-10T06:22:16.713 INFO:tasks.workunit.client.1.vm06.stdout:1/357: dread d9/f34 [0,4194304] 0 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:9/314: write d21/d32/d4d/d51/f5c [588133,48727] 0 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:6/378: unlink d6/dd/d25/d4e/c45 0 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:6/379: stat d6/d7/d37/f65 0 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:6/380: chown d6/dd/d25 82353127 1 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:2/317: dread da/f28 [0,4194304] 0 2026-03-10T06:22:16.722 INFO:tasks.workunit.client.1.vm06.stdout:3/320: creat d6/dc/d13/f6e x:0 0 0 2026-03-10T06:22:16.727 
INFO:tasks.workunit.client.1.vm06.stdout:1/358: dwrite d9/d1b/f51 [4194304,4194304] 0 2026-03-10T06:22:16.728 INFO:tasks.workunit.client.1.vm06.stdout:6/381: rename d6/dd/d2b/c6c to d6/dd/d25/d33/d5a/d78/c7f 0 2026-03-10T06:22:16.728 INFO:tasks.workunit.client.1.vm06.stdout:9/315: creat d21/d32/d4d/f6b x:0 0 0 2026-03-10T06:22:16.730 INFO:tasks.workunit.client.1.vm06.stdout:3/321: unlink d6/d1a/f65 0 2026-03-10T06:22:16.739 INFO:tasks.workunit.client.1.vm06.stdout:6/382: mkdir d6/d79/d80 0 2026-03-10T06:22:16.747 INFO:tasks.workunit.client.1.vm06.stdout:9/316: symlink d21/d27/d3a/l6c 0 2026-03-10T06:22:16.747 INFO:tasks.workunit.client.1.vm06.stdout:9/317: chown d21 0 1 2026-03-10T06:22:16.747 INFO:tasks.workunit.client.1.vm06.stdout:1/359: unlink d9/f39 0 2026-03-10T06:22:16.747 INFO:tasks.workunit.client.1.vm06.stdout:3/322: mknod d6/d21/d38/d39/c6f 0 2026-03-10T06:22:16.748 INFO:tasks.workunit.client.1.vm06.stdout:3/323: stat d6/dc/d13/d35/d6b/d54/l66 0 2026-03-10T06:22:16.749 INFO:tasks.workunit.client.1.vm06.stdout:3/324: fdatasync d6/d21/d38/f3d 0 2026-03-10T06:22:16.755 INFO:tasks.workunit.client.1.vm06.stdout:8/291: dread f0 [0,4194304] 0 2026-03-10T06:22:16.755 INFO:tasks.workunit.client.1.vm06.stdout:5/264: dread d8/d9/f11 [0,4194304] 0 2026-03-10T06:22:16.755 INFO:tasks.workunit.client.1.vm06.stdout:1/360: getdents d9/d35/d46/d38 0 2026-03-10T06:22:16.757 INFO:tasks.workunit.client.1.vm06.stdout:1/361: write d9/d1b/d20/f55 [212273,72008] 0 2026-03-10T06:22:16.764 INFO:tasks.workunit.client.1.vm06.stdout:6/383: dwrite d6/dd/d25/d4e/f5f [0,4194304] 0 2026-03-10T06:22:16.766 INFO:tasks.workunit.client.1.vm06.stdout:9/318: link d21/l2d d21/d27/d50/d57/l6d 0 2026-03-10T06:22:16.767 INFO:tasks.workunit.client.1.vm06.stdout:9/319: write d21/d24/f2e [3329521,72432] 0 2026-03-10T06:22:16.769 INFO:tasks.workunit.client.1.vm06.stdout:1/362: dwrite d9/df/f14 [0,4194304] 0 2026-03-10T06:22:16.791 INFO:tasks.workunit.client.1.vm06.stdout:9/320: rename d21/d24 to 
d21/d32/d6e 0 2026-03-10T06:22:16.794 INFO:tasks.workunit.client.1.vm06.stdout:6/384: truncate d6/f62 709994 0 2026-03-10T06:22:16.805 INFO:tasks.workunit.client.1.vm06.stdout:6/385: chown d6/dd/c68 7644511 1 2026-03-10T06:22:16.805 INFO:tasks.workunit.client.1.vm06.stdout:8/292: link d1/df/d11/f12 d1/df/f61 0 2026-03-10T06:22:16.805 INFO:tasks.workunit.client.1.vm06.stdout:8/293: dwrite d1/f1c [0,4194304] 0 2026-03-10T06:22:16.805 INFO:tasks.workunit.client.1.vm06.stdout:8/294: write d1/df/d11/f4a [497871,107756] 0 2026-03-10T06:22:16.810 INFO:tasks.workunit.client.1.vm06.stdout:6/386: symlink d6/dd/d25/d2c/l81 0 2026-03-10T06:22:16.814 INFO:tasks.workunit.client.1.vm06.stdout:8/295: read d1/df/d11/f1d [100833,43019] 0 2026-03-10T06:22:16.815 INFO:tasks.workunit.client.1.vm06.stdout:5/265: creat d8/d9/d1e/f5a x:0 0 0 2026-03-10T06:22:16.816 INFO:tasks.workunit.client.1.vm06.stdout:5/266: write f7 [744266,50761] 0 2026-03-10T06:22:16.818 INFO:tasks.workunit.client.1.vm06.stdout:5/267: fsync d8/db/f48 0 2026-03-10T06:22:16.818 INFO:tasks.workunit.client.1.vm06.stdout:1/363: creat d9/d35/f5c x:0 0 0 2026-03-10T06:22:16.818 INFO:tasks.workunit.client.1.vm06.stdout:4/268: rmdir dd/d24/d2d 39 2026-03-10T06:22:16.820 INFO:tasks.workunit.client.1.vm06.stdout:6/387: creat d6/df/f82 x:0 0 0 2026-03-10T06:22:16.822 INFO:tasks.workunit.client.1.vm06.stdout:1/364: dread d9/d1b/d20/f55 [0,4194304] 0 2026-03-10T06:22:16.825 INFO:tasks.workunit.client.1.vm06.stdout:8/296: dwrite d1/df/d20/fe [0,4194304] 0 2026-03-10T06:22:16.832 INFO:tasks.workunit.client.1.vm06.stdout:6/388: creat d6/dd/d25/d4e/f83 x:0 0 0 2026-03-10T06:22:16.835 INFO:tasks.workunit.client.1.vm06.stdout:9/321: link d21/d32/d6e/l53 d21/d27/d50/l6f 0 2026-03-10T06:22:16.835 INFO:tasks.workunit.client.1.vm06.stdout:6/389: stat d6/d7/f1a 0 2026-03-10T06:22:16.835 INFO:tasks.workunit.client.1.vm06.stdout:4/269: mknod dd/d24/d2d/d2f/d34/c4b 0 2026-03-10T06:22:16.840 INFO:tasks.workunit.client.1.vm06.stdout:8/297: 
dread d1/df/d11/f1d [0,4194304] 0 2026-03-10T06:22:16.842 INFO:tasks.workunit.client.1.vm06.stdout:6/390: symlink d6/l84 0 2026-03-10T06:22:16.845 INFO:tasks.workunit.client.1.vm06.stdout:9/322: symlink d21/d27/d56/l70 0 2026-03-10T06:22:16.846 INFO:tasks.workunit.client.1.vm06.stdout:9/323: fdatasync d21/d32/d6e/f2e 0 2026-03-10T06:22:16.846 INFO:tasks.workunit.client.1.vm06.stdout:4/270: unlink dd/l2a 0 2026-03-10T06:22:16.847 INFO:tasks.workunit.client.1.vm06.stdout:8/298: fsync d1/fa 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:6/391: creat d6/dd/d25/d2c/f85 x:0 0 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:9/324: creat d21/d32/f71 x:0 0 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:6/392: write d6/fc [996565,10369] 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:4/271: mknod dd/d24/c4c 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:8/299: creat d1/d3b/d5c/f62 x:0 0 0 2026-03-10T06:22:16.859 INFO:tasks.workunit.client.1.vm06.stdout:6/393: fdatasync d6/f62 0 2026-03-10T06:22:16.862 INFO:tasks.workunit.client.1.vm06.stdout:8/300: chown d1/d7/f30 0 1 2026-03-10T06:22:16.862 INFO:tasks.workunit.client.1.vm06.stdout:4/272: mknod dd/d41/c4d 0 2026-03-10T06:22:16.862 INFO:tasks.workunit.client.1.vm06.stdout:6/394: read - d6/dd/d25/f5c zero size 2026-03-10T06:22:16.863 INFO:tasks.workunit.client.1.vm06.stdout:9/325: getdents d21/d32/d4d/d51 0 2026-03-10T06:22:16.863 INFO:tasks.workunit.client.1.vm06.stdout:6/395: write d6/dd/d25/f5c [216018,83610] 0 2026-03-10T06:22:16.864 INFO:tasks.workunit.client.1.vm06.stdout:8/301: rename d1/df/d20/d21/f36 to d1/df/d20/f63 0 2026-03-10T06:22:16.864 INFO:tasks.workunit.client.1.vm06.stdout:9/326: symlink d21/d46/l72 0 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: pgmap v6: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 
2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: mgrmap e31: vm04.exdvdb(active, since 8s), standbys: vm06.wwotdr 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:16.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:16 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:22:16.868 INFO:tasks.workunit.client.1.vm06.stdout:9/327: unlink d21/l2d 0 2026-03-10T06:22:16.870 INFO:tasks.workunit.client.1.vm06.stdout:9/328: creat d21/d32/f73 x:0 0 0 2026-03-10T06:22:16.872 INFO:tasks.workunit.client.1.vm06.stdout:9/329: dread - d21/d32/f73 zero size 2026-03-10T06:22:16.872 INFO:tasks.workunit.client.1.vm06.stdout:8/302: dwrite d1/df/d11/f59 [0,4194304] 0 2026-03-10T06:22:16.884 INFO:tasks.workunit.client.1.vm06.stdout:9/330: creat d21/d27/d56/f74 x:0 0 0 2026-03-10T06:22:16.884 INFO:tasks.workunit.client.1.vm06.stdout:7/378: getdents d19/d3b/d41/d42 0 2026-03-10T06:22:16.885 INFO:tasks.workunit.client.1.vm06.stdout:8/303: creat d1/df/d20/f64 x:0 0 0 2026-03-10T06:22:16.886 INFO:tasks.workunit.client.1.vm06.stdout:4/273: sync 2026-03-10T06:22:16.894 INFO:tasks.workunit.client.1.vm06.stdout:7/379: rename d19/d3b/c5f to d19/c75 0 2026-03-10T06:22:16.894 INFO:tasks.workunit.client.1.vm06.stdout:4/274: dread dd/f14 [0,4194304] 0 2026-03-10T06:22:16.894 INFO:tasks.workunit.client.1.vm06.stdout:9/331: dread d21/d27/f39 
[0,4194304] 0 2026-03-10T06:22:16.897 INFO:tasks.workunit.client.1.vm06.stdout:9/332: dread d21/f3e [0,4194304] 0 2026-03-10T06:22:16.897 INFO:tasks.workunit.client.1.vm06.stdout:7/380: symlink d19/d3b/d5b/l76 0 2026-03-10T06:22:16.898 INFO:tasks.workunit.client.1.vm06.stdout:4/275: creat dd/d24/d2d/d2f/d34/d40/f4e x:0 0 0 2026-03-10T06:22:16.899 INFO:tasks.workunit.client.1.vm06.stdout:9/333: chown d21/d32/d4d/d51/c68 4082184 1 2026-03-10T06:22:16.900 INFO:tasks.workunit.client.1.vm06.stdout:4/276: symlink dd/d18/l4f 0 2026-03-10T06:22:16.900 INFO:tasks.workunit.client.1.vm06.stdout:8/304: dwrite d1/df/d20/f19 [4194304,4194304] 0 2026-03-10T06:22:16.901 INFO:tasks.workunit.client.1.vm06.stdout:4/277: fsync f2 0 2026-03-10T06:22:16.901 INFO:tasks.workunit.client.1.vm06.stdout:7/381: creat d19/d3b/d41/f77 x:0 0 0 2026-03-10T06:22:16.901 INFO:tasks.workunit.client.1.vm06.stdout:7/382: fdatasync d19/f35 0 2026-03-10T06:22:16.903 INFO:tasks.workunit.client.1.vm06.stdout:9/334: mknod d21/d27/d50/d57/c75 0 2026-03-10T06:22:16.909 INFO:tasks.workunit.client.1.vm06.stdout:8/305: creat d1/df/d20/d21/d5e/f65 x:0 0 0 2026-03-10T06:22:16.909 INFO:tasks.workunit.client.1.vm06.stdout:7/383: creat d19/d3b/d41/d42/f78 x:0 0 0 2026-03-10T06:22:16.910 INFO:tasks.workunit.client.1.vm06.stdout:9/335: dread d21/f33 [0,4194304] 0 2026-03-10T06:22:16.911 INFO:tasks.workunit.client.1.vm06.stdout:9/336: chown d21/d27/f39 13875 1 2026-03-10T06:22:16.911 INFO:tasks.workunit.client.1.vm06.stdout:7/384: mknod d19/d3b/d41/d4c/c79 0 2026-03-10T06:22:16.911 INFO:tasks.workunit.client.1.vm06.stdout:8/306: mknod d1/df/d20/d21/d5e/c66 0 2026-03-10T06:22:16.912 INFO:tasks.workunit.client.1.vm06.stdout:8/307: creat d1/d2c/f67 x:0 0 0 2026-03-10T06:22:16.917 INFO:tasks.workunit.client.1.vm06.stdout:9/337: write d21/f33 [1268306,24739] 0 2026-03-10T06:22:16.918 INFO:tasks.workunit.client.1.vm06.stdout:9/338: stat d21/d46/l72 0 2026-03-10T06:22:16.923 INFO:tasks.workunit.client.1.vm06.stdout:7/385: mknod 
d19/d3b/c7a 0 2026-03-10T06:22:16.923 INFO:tasks.workunit.client.1.vm06.stdout:0/340: truncate d0/dd/d1b/d3d/f40 817506 0 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: pgmap v6: 65 pgs: 65 active+clean; 467 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.? 
192.168.123.106:0/2271212863' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: mgrmap e31: vm04.exdvdb(active, since 8s), standbys: vm06.wwotdr 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:16 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T06:22:16.931 INFO:tasks.workunit.client.1.vm06.stdout:9/339: mkdir d21/d27/d50/d76 0 2026-03-10T06:22:16.932 INFO:tasks.workunit.client.1.vm06.stdout:8/308: symlink d1/d2c/d5b/l68 0 2026-03-10T06:22:16.932 INFO:tasks.workunit.client.1.vm06.stdout:0/341: symlink d0/dd/d2d/d35/d74/l77 0 2026-03-10T06:22:16.933 INFO:tasks.workunit.client.1.vm06.stdout:4/278: dread fc [0,4194304] 0 2026-03-10T06:22:16.933 INFO:tasks.workunit.client.1.vm06.stdout:4/279: chown dd/d33/f37 27 1 2026-03-10T06:22:16.937 INFO:tasks.workunit.client.1.vm06.stdout:7/386: rename d19/d3b/d41/d42/d62/f6c to d19/d3b/f7b 0 2026-03-10T06:22:16.939 INFO:tasks.workunit.client.1.vm06.stdout:7/387: dread - d19/d3b/f6b zero size 2026-03-10T06:22:16.941 INFO:tasks.workunit.client.1.vm06.stdout:8/309: creat d1/df/d20/d21/f69 x:0 0 0 2026-03-10T06:22:16.942 INFO:tasks.workunit.client.1.vm06.stdout:0/342: symlink d0/dd/d1c/l78 0 2026-03-10T06:22:16.943 INFO:tasks.workunit.client.1.vm06.stdout:4/280: mknod dd/d33/c50 0 2026-03-10T06:22:16.944 INFO:tasks.workunit.client.1.vm06.stdout:9/340: rmdir d21/d27/d50/d76 0 2026-03-10T06:22:16.944 INFO:tasks.workunit.client.1.vm06.stdout:8/310: dread - d1/df/d20/f63 zero size 2026-03-10T06:22:16.945 INFO:tasks.workunit.client.1.vm06.stdout:7/388: creat d19/d3b/d41/d42/d62/f7c x:0 0 0 2026-03-10T06:22:16.954 INFO:tasks.workunit.client.1.vm06.stdout:4/281: write dd/ff [912634,130988] 0 2026-03-10T06:22:16.954 INFO:tasks.workunit.client.1.vm06.stdout:2/318: truncate da/f19 5548669 0 2026-03-10T06:22:16.954 INFO:tasks.workunit.client.1.vm06.stdout:3/325: truncate d6/f44 790241 0 2026-03-10T06:22:16.955 INFO:tasks.workunit.client.1.vm06.stdout:4/282: fdatasync f2 0 2026-03-10T06:22:16.957 INFO:tasks.workunit.client.1.vm06.stdout:3/326: truncate d6/dc/d13/d35/f3b 676306 0 2026-03-10T06:22:16.962 INFO:tasks.workunit.client.1.vm06.stdout:8/311: creat d1/df/d58/f6a x:0 0 0 2026-03-10T06:22:16.964 
INFO:tasks.workunit.client.1.vm06.stdout:8/312: readlink d1/d2c/d5b/l68 0 2026-03-10T06:22:16.964 INFO:tasks.workunit.client.1.vm06.stdout:2/319: dwrite da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:16.966 INFO:tasks.workunit.client.1.vm06.stdout:0/343: getdents d0/dd/d1b/d3d 0 2026-03-10T06:22:16.966 INFO:tasks.workunit.client.1.vm06.stdout:9/341: dwrite f20 [0,4194304] 0 2026-03-10T06:22:16.967 INFO:tasks.workunit.client.1.vm06.stdout:3/327: link d6/d21/d38/f6c d6/dc/d41/d6d/f70 0 2026-03-10T06:22:16.968 INFO:tasks.workunit.client.1.vm06.stdout:9/342: chown d21/d46 5 1 2026-03-10T06:22:16.978 INFO:tasks.workunit.client.1.vm06.stdout:8/313: rename d1/d7/f30 to d1/df/f6b 0 2026-03-10T06:22:16.983 INFO:tasks.workunit.client.1.vm06.stdout:5/268: dwrite d8/d9/d1e/f29 [4194304,4194304] 0 2026-03-10T06:22:16.985 INFO:tasks.workunit.client.1.vm06.stdout:5/269: dwrite d8/d20/d22/f53 [0,4194304] 0 2026-03-10T06:22:16.989 INFO:tasks.workunit.client.1.vm06.stdout:6/396: dwrite d6/d7/f1a [0,4194304] 0 2026-03-10T06:22:16.997 INFO:tasks.workunit.client.1.vm06.stdout:2/320: dread da/d13/f1f [0,4194304] 0 2026-03-10T06:22:17.001 INFO:tasks.workunit.client.1.vm06.stdout:3/328: symlink d6/dc/d13/d35/l71 0 2026-03-10T06:22:17.007 INFO:tasks.workunit.client.1.vm06.stdout:9/343: mknod d21/d32/c77 0 2026-03-10T06:22:17.007 INFO:tasks.workunit.client.1.vm06.stdout:9/344: write d21/d32/d6e/f2f [1491536,130504] 0 2026-03-10T06:22:17.007 INFO:tasks.workunit.client.1.vm06.stdout:9/345: chown d21/d46/l4f 13306 1 2026-03-10T06:22:17.007 INFO:tasks.workunit.client.1.vm06.stdout:9/346: readlink l1c 0 2026-03-10T06:22:17.009 INFO:tasks.workunit.client.1.vm06.stdout:8/314: mknod d1/d2c/d5b/c6c 0 2026-03-10T06:22:17.010 INFO:tasks.workunit.client.1.vm06.stdout:8/315: write d1/df/d58/f6a [908244,130225] 0 2026-03-10T06:22:17.016 INFO:tasks.workunit.client.1.vm06.stdout:5/270: mknod d8/d20/c5b 0 2026-03-10T06:22:17.022 INFO:tasks.workunit.client.1.vm06.stdout:7/389: rmdir d19/d3b 39 
2026-03-10T06:22:17.022 INFO:tasks.workunit.client.1.vm06.stdout:9/347: dwrite d21/f2a [4194304,4194304] 0 2026-03-10T06:22:17.022 INFO:tasks.workunit.client.1.vm06.stdout:1/365: truncate d9/f1a 3615614 0 2026-03-10T06:22:17.022 INFO:tasks.workunit.client.1.vm06.stdout:5/271: write d8/d20/d22/d39/f41 [313103,76688] 0 2026-03-10T06:22:17.023 INFO:tasks.workunit.client.1.vm06.stdout:2/321: mknod da/d13/d1c/d1d/d44/d53/c60 0 2026-03-10T06:22:17.023 INFO:tasks.workunit.client.1.vm06.stdout:0/344: link d0/dd/d2d/d35/d74/l77 d0/dd/d14/d18/d66/l79 0 2026-03-10T06:22:17.025 INFO:tasks.workunit.client.1.vm06.stdout:0/345: write d0/d3c/d42/f60 [208703,26089] 0 2026-03-10T06:22:17.025 INFO:tasks.workunit.client.1.vm06.stdout:7/390: read d19/d3b/f68 [2703871,11150] 0 2026-03-10T06:22:17.029 INFO:tasks.workunit.client.1.vm06.stdout:9/348: mknod d21/d32/d4d/d51/d67/c78 0 2026-03-10T06:22:17.034 INFO:tasks.workunit.client.1.vm06.stdout:8/316: dwrite d1/f4 [4194304,4194304] 0 2026-03-10T06:22:17.034 INFO:tasks.workunit.client.1.vm06.stdout:8/317: fdatasync d1/df/d11/f59 0 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:2/322: fsync da/d13/f5b 0 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:5/272: creat d8/db/d54/f5c x:0 0 0 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:0/346: rmdir d0/dd/d14/d1d 39 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:3/329: getdents d6/d21 0 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:9/349: mkdir d21/d46/d79 0 2026-03-10T06:22:17.035 INFO:tasks.workunit.client.1.vm06.stdout:2/323: mkdir da/d13/d1c/d1d/d44/d53/d61 0 2026-03-10T06:22:17.036 INFO:tasks.workunit.client.1.vm06.stdout:7/391: creat d19/d3b/d41/d42/f7d x:0 0 0 2026-03-10T06:22:17.037 INFO:tasks.workunit.client.1.vm06.stdout:1/366: symlink d9/d35/d46/d38/l5d 0 2026-03-10T06:22:17.037 INFO:tasks.workunit.client.1.vm06.stdout:6/397: link d6/dd/c6b d6/df/c86 0 2026-03-10T06:22:17.037 
INFO:tasks.workunit.client.1.vm06.stdout:0/347: rmdir d0 39 2026-03-10T06:22:17.038 INFO:tasks.workunit.client.1.vm06.stdout:5/273: creat d8/db/f5d x:0 0 0 2026-03-10T06:22:17.039 INFO:tasks.workunit.client.1.vm06.stdout:7/392: write d19/d3b/d41/f48 [1296346,105154] 0 2026-03-10T06:22:17.040 INFO:tasks.workunit.client.1.vm06.stdout:9/350: creat d21/d32/f7a x:0 0 0 2026-03-10T06:22:17.040 INFO:tasks.workunit.client.1.vm06.stdout:2/324: creat da/d13/d1a/d39/d35/f62 x:0 0 0 2026-03-10T06:22:17.041 INFO:tasks.workunit.client.1.vm06.stdout:2/325: read da/f28 [539991,125710] 0 2026-03-10T06:22:17.044 INFO:tasks.workunit.client.1.vm06.stdout:6/398: dwrite d6/dd/d25/f3f [0,4194304] 0 2026-03-10T06:22:17.049 INFO:tasks.workunit.client.1.vm06.stdout:5/274: write d8/db/f5d [17770,45393] 0 2026-03-10T06:22:17.051 INFO:tasks.workunit.client.1.vm06.stdout:7/393: dwrite d19/d3b/d41/d42/f78 [0,4194304] 0 2026-03-10T06:22:17.051 INFO:tasks.workunit.client.1.vm06.stdout:3/330: mkdir d6/dc/d72 0 2026-03-10T06:22:17.052 INFO:tasks.workunit.client.1.vm06.stdout:8/318: dwrite d1/fa [4194304,4194304] 0 2026-03-10T06:22:17.053 INFO:tasks.workunit.client.1.vm06.stdout:1/367: unlink d9/d1b/l2b 0 2026-03-10T06:22:17.059 INFO:tasks.workunit.client.1.vm06.stdout:0/348: rmdir d0/dd/d1b 39 2026-03-10T06:22:17.063 INFO:tasks.workunit.client.1.vm06.stdout:3/331: mkdir d6/dc/d13/d73 0 2026-03-10T06:22:17.064 INFO:tasks.workunit.client.1.vm06.stdout:0/349: symlink d0/dd/d2d/d35/d74/l7a 0 2026-03-10T06:22:17.064 INFO:tasks.workunit.client.1.vm06.stdout:5/275: mkdir d8/d20/d22/d5e 0 2026-03-10T06:22:17.064 INFO:tasks.workunit.client.1.vm06.stdout:9/351: link l1e d21/d32/l7b 0 2026-03-10T06:22:17.065 INFO:tasks.workunit.client.1.vm06.stdout:0/350: write d0/dd/d14/f31 [2639724,87273] 0 2026-03-10T06:22:17.066 INFO:tasks.workunit.client.1.vm06.stdout:7/394: read f9 [127654,55869] 0 2026-03-10T06:22:17.071 INFO:tasks.workunit.client.1.vm06.stdout:1/368: getdents d9/d1b/d20 0 2026-03-10T06:22:17.071 
INFO:tasks.workunit.client.1.vm06.stdout:8/319: dwrite d1/f1c [0,4194304] 0 2026-03-10T06:22:17.071 INFO:tasks.workunit.client.1.vm06.stdout:0/351: mknod d0/dd/d14/d18/d66/c7b 0 2026-03-10T06:22:17.073 INFO:tasks.workunit.client.1.vm06.stdout:5/276: unlink d8/db/f5d 0 2026-03-10T06:22:17.074 INFO:tasks.workunit.client.1.vm06.stdout:8/320: creat d1/df/f6d x:0 0 0 2026-03-10T06:22:17.076 INFO:tasks.workunit.client.1.vm06.stdout:1/369: mknod d9/d35/d46/c5e 0 2026-03-10T06:22:17.077 INFO:tasks.workunit.client.1.vm06.stdout:1/370: chown d9/d1b/d20/d44/l4c 1369713876 1 2026-03-10T06:22:17.077 INFO:tasks.workunit.client.1.vm06.stdout:1/371: dread - d9/d35/d46/f50 zero size 2026-03-10T06:22:17.078 INFO:tasks.workunit.client.1.vm06.stdout:8/321: creat d1/df/d58/f6e x:0 0 0 2026-03-10T06:22:17.079 INFO:tasks.workunit.client.1.vm06.stdout:0/352: getdents d0 0 2026-03-10T06:22:17.079 INFO:tasks.workunit.client.1.vm06.stdout:1/372: chown d9/f1a 9621 1 2026-03-10T06:22:17.083 INFO:tasks.workunit.client.1.vm06.stdout:0/353: rename d0/d3c/d42/f12 to d0/dd/d14/d18/f7c 0 2026-03-10T06:22:17.084 INFO:tasks.workunit.client.1.vm06.stdout:0/354: truncate d0/d3c/d42/f41 1927826 0 2026-03-10T06:22:17.093 INFO:tasks.workunit.client.1.vm06.stdout:0/355: mkdir d0/dd/d1b/d7d 0 2026-03-10T06:22:17.096 INFO:tasks.workunit.client.1.vm06.stdout:0/356: mkdir d0/dd/d14/d18/d7e 0 2026-03-10T06:22:17.097 INFO:tasks.workunit.client.1.vm06.stdout:0/357: fsync d0/d3c/d42/f5c 0 2026-03-10T06:22:17.098 INFO:tasks.workunit.client.1.vm06.stdout:0/358: creat d0/dd/d2d/d35/f7f x:0 0 0 2026-03-10T06:22:17.098 INFO:tasks.workunit.client.1.vm06.stdout:0/359: readlink d0/l37 0 2026-03-10T06:22:17.100 INFO:tasks.workunit.client.1.vm06.stdout:0/360: chown d0/d3c/d42 25 1 2026-03-10T06:22:17.105 INFO:tasks.workunit.client.1.vm06.stdout:3/332: dread d6/dc/d13/d35/f4e [0,4194304] 0 2026-03-10T06:22:17.106 INFO:tasks.workunit.client.1.vm06.stdout:3/333: write d6/f5c [702793,20176] 0 2026-03-10T06:22:17.106 
INFO:tasks.workunit.client.1.vm06.stdout:3/334: readlink d6/d21/l24 0 2026-03-10T06:22:17.107 INFO:tasks.workunit.client.1.vm06.stdout:3/335: readlink d6/l28 0 2026-03-10T06:22:17.107 INFO:tasks.workunit.client.1.vm06.stdout:3/336: readlink d6/d21/l24 0 2026-03-10T06:22:17.107 INFO:tasks.workunit.client.1.vm06.stdout:3/337: stat d6/d21/d38/d39/c6f 0 2026-03-10T06:22:17.109 INFO:tasks.workunit.client.1.vm06.stdout:3/338: write d6/d21/d38/f56 [779027,18505] 0 2026-03-10T06:22:17.111 INFO:tasks.workunit.client.1.vm06.stdout:1/373: dread d9/d1b/d20/f24 [0,4194304] 0 2026-03-10T06:22:17.112 INFO:tasks.workunit.client.1.vm06.stdout:3/339: dread d6/d1a/f1f [4194304,4194304] 0 2026-03-10T06:22:17.114 INFO:tasks.workunit.client.1.vm06.stdout:1/374: truncate d9/d1b/d20/f55 203956 0 2026-03-10T06:22:17.115 INFO:tasks.workunit.client.1.vm06.stdout:3/340: symlink d6/d8/l74 0 2026-03-10T06:22:17.115 INFO:tasks.workunit.client.1.vm06.stdout:3/341: write d6/f53 [70842,100401] 0 2026-03-10T06:22:17.117 INFO:tasks.workunit.client.1.vm06.stdout:1/375: symlink d9/d35/d46/l5f 0 2026-03-10T06:22:17.117 INFO:tasks.workunit.client.1.vm06.stdout:3/342: symlink d6/d21/d38/d39/l75 0 2026-03-10T06:22:17.117 INFO:tasks.workunit.client.1.vm06.stdout:1/376: dread - d9/d35/f5c zero size 2026-03-10T06:22:17.118 INFO:tasks.workunit.client.1.vm06.stdout:6/399: sync 2026-03-10T06:22:17.118 INFO:tasks.workunit.client.1.vm06.stdout:7/395: sync 2026-03-10T06:22:17.119 INFO:tasks.workunit.client.1.vm06.stdout:1/377: write d9/df/f4f [284491,32547] 0 2026-03-10T06:22:17.122 INFO:tasks.workunit.client.1.vm06.stdout:8/322: sync 2026-03-10T06:22:17.122 INFO:tasks.workunit.client.1.vm06.stdout:6/400: write d6/d7/f20 [8294115,12904] 0 2026-03-10T06:22:17.123 INFO:tasks.workunit.client.1.vm06.stdout:6/401: write d6/dd/d25/d4e/f60 [431735,6651] 0 2026-03-10T06:22:17.125 INFO:tasks.workunit.client.1.vm06.stdout:1/378: truncate d9/d1b/d20/f25 5112948 0 2026-03-10T06:22:17.125 
INFO:tasks.workunit.client.1.vm06.stdout:8/323: rmdir d1/d7 39 2026-03-10T06:22:17.125 INFO:tasks.workunit.client.1.vm06.stdout:1/379: chown d9/df/c13 25119452 1 2026-03-10T06:22:17.127 INFO:tasks.workunit.client.1.vm06.stdout:8/324: creat d1/d3b/d5c/f6f x:0 0 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:1/380: creat d9/d1b/d20/f60 x:0 0 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:6/402: getdents d6/dd/d25/d33/d4d 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:6/403: creat d6/d7/f87 x:0 0 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:7/396: dwrite d19/d3b/f47 [0,4194304] 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:6/404: fdatasync d6/dd/f18 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:8/325: creat d1/df/d20/d21/d5e/f70 x:0 0 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:6/405: dread - d6/f6a zero size 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:3/343: dwrite d6/dc/f3f [0,4194304] 0 2026-03-10T06:22:17.133 INFO:tasks.workunit.client.1.vm06.stdout:6/406: fdatasync d6/dd/d35/f2d 0 2026-03-10T06:22:17.139 INFO:tasks.workunit.client.1.vm06.stdout:8/326: creat d1/df/f71 x:0 0 0 2026-03-10T06:22:17.143 INFO:tasks.workunit.client.1.vm06.stdout:8/327: rename d1/c60 to d1/d2c/c72 0 2026-03-10T06:22:17.145 INFO:tasks.workunit.client.1.vm06.stdout:3/344: mknod d6/c76 0 2026-03-10T06:22:17.156 INFO:tasks.workunit.client.1.vm06.stdout:0/361: read d0/dd/f10 [4041195,26206] 0 2026-03-10T06:22:17.159 INFO:tasks.workunit.client.1.vm06.stdout:9/352: dread f1b [4194304,4194304] 0 2026-03-10T06:22:17.169 INFO:tasks.workunit.client.1.vm06.stdout:9/353: read fd [2343261,50464] 0 2026-03-10T06:22:17.169 INFO:tasks.workunit.client.1.vm06.stdout:3/345: symlink d6/dc/d41/d6d/l77 0 2026-03-10T06:22:17.169 INFO:tasks.workunit.client.1.vm06.stdout:7/397: dread f9 [0,4194304] 0 2026-03-10T06:22:17.170 
INFO:tasks.workunit.client.1.vm06.stdout:9/354: write d21/f34 [301408,35245] 0 2026-03-10T06:22:17.170 INFO:tasks.workunit.client.1.vm06.stdout:8/328: dread d1/df/d20/fe [0,4194304] 0 2026-03-10T06:22:17.170 INFO:tasks.workunit.client.1.vm06.stdout:7/398: mknod d19/d3b/c7e 0 2026-03-10T06:22:17.170 INFO:tasks.workunit.client.1.vm06.stdout:9/355: creat d21/d32/d4d/d51/d67/f7c x:0 0 0 2026-03-10T06:22:17.170 INFO:tasks.workunit.client.1.vm06.stdout:4/283: dwrite dd/d18/f1f [0,4194304] 0 2026-03-10T06:22:17.170 INFO:tasks.workunit.client.1.vm06.stdout:0/362: dwrite d0/dd/d2d/d35/f3a [0,4194304] 0 2026-03-10T06:22:17.172 INFO:tasks.workunit.client.1.vm06.stdout:0/363: truncate d0/d3c/d42/f60 1150920 0 2026-03-10T06:22:17.174 INFO:tasks.workunit.client.1.vm06.stdout:9/356: dread d21/f3e [0,4194304] 0 2026-03-10T06:22:17.174 INFO:tasks.workunit.client.1.vm06.stdout:9/357: write d21/d32/d4d/f6b [64807,113025] 0 2026-03-10T06:22:17.181 INFO:tasks.workunit.client.1.vm06.stdout:7/399: sync 2026-03-10T06:22:17.181 INFO:tasks.workunit.client.1.vm06.stdout:9/358: truncate d21/d32/d4d/f6b 694048 0 2026-03-10T06:22:17.184 INFO:tasks.workunit.client.1.vm06.stdout:8/329: link d1/df/d20/f63 d1/df/d20/d21/d5e/f73 0 2026-03-10T06:22:17.184 INFO:tasks.workunit.client.1.vm06.stdout:4/284: mknod dd/d24/d2d/d2f/d39/c51 0 2026-03-10T06:22:17.184 INFO:tasks.workunit.client.1.vm06.stdout:9/359: readlink d21/d27/d3a/l4c 0 2026-03-10T06:22:17.184 INFO:tasks.workunit.client.1.vm06.stdout:7/400: creat d19/d3b/d5b/f7f x:0 0 0 2026-03-10T06:22:17.194 INFO:tasks.workunit.client.1.vm06.stdout:3/346: rename d6/f44 to d6/dc/d13/d35/d6b/f78 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:4/285: dread dd/f12 [0,4194304] 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:3/347: link d6/dc/d13/c47 d6/dc/d72/c79 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:9/360: dwrite f14 [4194304,4194304] 0 2026-03-10T06:22:17.211 
INFO:tasks.workunit.client.1.vm06.stdout:9/361: dread - d21/d32/f71 zero size 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:4/286: readlink dd/d24/d2d/d2f/d39/l44 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:4/287: chown dd/f43 2048969 1 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:3/348: link c4 d6/d1a/d5b/c7a 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:9/362: rmdir d21/d32/d4d/d51 39 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:9/363: stat f1b 0 2026-03-10T06:22:17.211 INFO:tasks.workunit.client.1.vm06.stdout:3/349: dread - d6/d21/d38/f6c zero size 2026-03-10T06:22:17.220 INFO:tasks.workunit.client.1.vm06.stdout:4/288: dread dd/d33/f37 [0,4194304] 0 2026-03-10T06:22:17.222 INFO:tasks.workunit.client.1.vm06.stdout:3/350: creat d6/d21/f7b x:0 0 0 2026-03-10T06:22:17.222 INFO:tasks.workunit.client.1.vm06.stdout:9/364: mkdir d21/d7d 0 2026-03-10T06:22:17.222 INFO:tasks.workunit.client.1.vm06.stdout:9/365: chown d21/d27/d56 117090 1 2026-03-10T06:22:17.222 INFO:tasks.workunit.client.1.vm06.stdout:9/366: chown ff 16 1 2026-03-10T06:22:17.224 INFO:tasks.workunit.client.1.vm06.stdout:7/401: sync 2026-03-10T06:22:17.224 INFO:tasks.workunit.client.1.vm06.stdout:4/289: link dd/d24/d2d/f28 dd/d41/f52 0 2026-03-10T06:22:17.226 INFO:tasks.workunit.client.1.vm06.stdout:4/290: rename dd/d18/f23 to dd/d33/f53 0 2026-03-10T06:22:17.227 INFO:tasks.workunit.client.1.vm06.stdout:4/291: chown dd/d18/f1f 902 1 2026-03-10T06:22:17.227 INFO:tasks.workunit.client.1.vm06.stdout:7/402: mkdir d19/d3b/d41/d42/d62/d80 0 2026-03-10T06:22:17.228 INFO:tasks.workunit.client.1.vm06.stdout:4/292: symlink dd/d18/l54 0 2026-03-10T06:22:17.235 INFO:tasks.workunit.client.1.vm06.stdout:9/367: dwrite d21/f3e [0,4194304] 0 2026-03-10T06:22:17.235 INFO:tasks.workunit.client.1.vm06.stdout:4/293: creat dd/d18/f55 x:0 0 0 2026-03-10T06:22:17.244 INFO:tasks.workunit.client.1.vm06.stdout:4/294: dread - 
dd/d24/d2d/d2f/d34/d40/f4e zero size 2026-03-10T06:22:17.244 INFO:tasks.workunit.client.1.vm06.stdout:9/368: dwrite d21/d46/f4e [4194304,4194304] 0 2026-03-10T06:22:17.257 INFO:tasks.workunit.client.1.vm06.stdout:9/369: mknod d21/d32/d4d/d51/d67/c7e 0 2026-03-10T06:22:17.259 INFO:tasks.workunit.client.1.vm06.stdout:9/370: dread - d21/d32/d4d/d51/d67/f7c zero size 2026-03-10T06:22:17.260 INFO:tasks.workunit.client.1.vm06.stdout:5/277: getdents d8/db/d54 0 2026-03-10T06:22:17.261 INFO:tasks.workunit.client.1.vm06.stdout:9/371: mkdir d21/d46/d79/d7f 0 2026-03-10T06:22:17.262 INFO:tasks.workunit.client.1.vm06.stdout:5/278: creat d8/db/d57/f5f x:0 0 0 2026-03-10T06:22:17.262 INFO:tasks.workunit.client.1.vm06.stdout:5/279: stat d8/db/d57/f5f 0 2026-03-10T06:22:17.262 INFO:tasks.workunit.client.1.vm06.stdout:9/372: mkdir d21/d46/d79/d80 0 2026-03-10T06:22:17.267 INFO:tasks.workunit.client.1.vm06.stdout:5/280: dread d8/d20/d22/f4d [0,4194304] 0 2026-03-10T06:22:17.267 INFO:tasks.workunit.client.1.vm06.stdout:4/295: dwrite fc [0,4194304] 0 2026-03-10T06:22:17.268 INFO:tasks.workunit.client.1.vm06.stdout:5/281: creat d8/db/d54/d55/f60 x:0 0 0 2026-03-10T06:22:17.269 INFO:tasks.workunit.client.1.vm06.stdout:5/282: write d8/db/d57/f5f [50190,83136] 0 2026-03-10T06:22:17.269 INFO:tasks.workunit.client.1.vm06.stdout:4/296: chown dd/f29 2 1 2026-03-10T06:22:17.269 INFO:tasks.workunit.client.1.vm06.stdout:4/297: chown dd 115 1 2026-03-10T06:22:17.318 INFO:tasks.workunit.client.1.vm06.stdout:2/326: truncate da/f28 4784079 0 2026-03-10T06:22:17.334 INFO:tasks.workunit.client.1.vm06.stdout:2/327: sync 2026-03-10T06:22:17.336 INFO:tasks.workunit.client.1.vm06.stdout:2/328: symlink da/d13/d1c/d1d/d44/d53/l63 0 2026-03-10T06:22:17.336 INFO:tasks.workunit.client.1.vm06.stdout:2/329: readlink da/ld 0 2026-03-10T06:22:17.339 INFO:tasks.workunit.client.1.vm06.stdout:2/330: truncate da/d13/d1c/d1d/d44/d48/d56/f58 883262 0 2026-03-10T06:22:17.349 
INFO:tasks.workunit.client.1.vm06.stdout:2/331: write da/d13/d1c/d1d/d44/d48/f59 [710021,126613] 0 2026-03-10T06:22:17.349 INFO:tasks.workunit.client.1.vm06.stdout:2/332: dwrite da/d13/d1c/f41 [0,4194304] 0 2026-03-10T06:22:17.358 INFO:tasks.workunit.client.1.vm06.stdout:2/333: creat da/d13/d5e/f64 x:0 0 0 2026-03-10T06:22:17.363 INFO:tasks.workunit.client.1.vm06.stdout:6/407: fsync d6/d7/d37/d43/f59 0 2026-03-10T06:22:17.370 INFO:tasks.workunit.client.1.vm06.stdout:0/364: dread d0/dd/d1b/f4e [0,4194304] 0 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:2/334: creat da/d13/d1c/d1d/d44/d53/f65 x:0 0 0 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:6/408: rmdir d6/dd 39 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:2/335: chown da/d13/d1a 237596 1 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:6/409: dread d6/dd/d35/f7e [0,4194304] 0 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:6/410: stat d6/df/c6e 0 2026-03-10T06:22:17.371 INFO:tasks.workunit.client.1.vm06.stdout:0/365: link d0/f5 d0/dd/d2d/f80 0 2026-03-10T06:22:17.373 INFO:tasks.workunit.client.1.vm06.stdout:0/366: fdatasync d0/d3c/d42/f41 0 2026-03-10T06:22:17.375 INFO:tasks.workunit.client.1.vm06.stdout:6/411: rmdir d6/d79/d80 0 2026-03-10T06:22:17.376 INFO:tasks.workunit.client.1.vm06.stdout:6/412: truncate d6/f6a 641485 0 2026-03-10T06:22:17.383 INFO:tasks.workunit.client.1.vm06.stdout:6/413: symlink d6/d7/d37/d43/l88 0 2026-03-10T06:22:17.384 INFO:tasks.workunit.client.1.vm06.stdout:8/330: rmdir d1/df 39 2026-03-10T06:22:17.384 INFO:tasks.workunit.client.1.vm06.stdout:0/367: dwrite d0/dd/d14/f31 [0,4194304] 0 2026-03-10T06:22:17.386 INFO:tasks.workunit.client.1.vm06.stdout:7/403: rmdir d19/d3b/d5b 39 2026-03-10T06:22:17.388 INFO:tasks.workunit.client.1.vm06.stdout:1/381: dread d9/d1b/d20/f42 [0,4194304] 0 2026-03-10T06:22:17.396 INFO:tasks.workunit.client.1.vm06.stdout:6/414: creat d6/dd/d25/d33/d5a/d78/f89 x:0 0 0 
2026-03-10T06:22:17.396 INFO:tasks.workunit.client.1.vm06.stdout:0/368: write d0/dd/f4c [1270687,13669] 0 2026-03-10T06:22:17.400 INFO:tasks.workunit.client.1.vm06.stdout:3/351: truncate d6/dc/d41/d6d/f70 1037149 0 2026-03-10T06:22:17.400 INFO:tasks.workunit.client.1.vm06.stdout:1/382: dwrite d9/df/f14 [0,4194304] 0 2026-03-10T06:22:17.401 INFO:tasks.workunit.client.1.vm06.stdout:3/352: chown d6/l28 2040 1 2026-03-10T06:22:17.401 INFO:tasks.workunit.client.1.vm06.stdout:2/336: dread da/d13/d1c/d1d/f55 [0,4194304] 0 2026-03-10T06:22:17.402 INFO:tasks.workunit.client.1.vm06.stdout:4/298: getdents dd/d33 0 2026-03-10T06:22:17.403 INFO:tasks.workunit.client.1.vm06.stdout:6/415: creat d6/dd/d25/d4e/f8a x:0 0 0 2026-03-10T06:22:17.406 INFO:tasks.workunit.client.1.vm06.stdout:0/369: creat d0/dd/d2d/d47/d4d/f81 x:0 0 0 2026-03-10T06:22:17.407 INFO:tasks.workunit.client.1.vm06.stdout:7/404: rename f9 to d19/d3b/d5b/f81 0 2026-03-10T06:22:17.413 INFO:tasks.workunit.client.1.vm06.stdout:7/405: stat d19/l31 0 2026-03-10T06:22:17.413 INFO:tasks.workunit.client.1.vm06.stdout:9/373: dwrite d21/f3e [4194304,4194304] 0 2026-03-10T06:22:17.413 INFO:tasks.workunit.client.1.vm06.stdout:1/383: write d9/f2f [2017136,4954] 0 2026-03-10T06:22:17.414 INFO:tasks.workunit.client.1.vm06.stdout:5/283: write d8/ff [2032699,73990] 0 2026-03-10T06:22:17.420 INFO:tasks.workunit.client.1.vm06.stdout:0/370: creat d0/dd/d1b/d3d/f82 x:0 0 0 2026-03-10T06:22:17.420 INFO:tasks.workunit.client.1.vm06.stdout:7/406: unlink le 0 2026-03-10T06:22:17.421 INFO:tasks.workunit.client.1.vm06.stdout:3/353: mknod d6/dc/d72/c7c 0 2026-03-10T06:22:17.422 INFO:tasks.workunit.client.1.vm06.stdout:9/374: creat d21/d32/d4d/d51/d67/f81 x:0 0 0 2026-03-10T06:22:17.426 INFO:tasks.workunit.client.1.vm06.stdout:6/416: dwrite d6/d7/f87 [0,4194304] 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:8/331: dread d1/d7/fd [0,4194304] 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:5/284: creat 
d8/db/d54/d55/f61 x:0 0 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:5/285: stat d8/ld 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:5/286: write d8/db/d54/d55/f61 [212774,99160] 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:5/287: chown d8/l1b 130079226 1 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:1/384: symlink d9/d35/d46/d38/l61 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:2/337: dwrite f8 [0,4194304] 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:0/371: mknod d0/dd/d1b/d3d/c83 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:3/354: rename f0 to d6/dc/d13/d73/f7d 0 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:0/372: chown d0/dd/f49 3251 1 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:7/407: chown d19/c75 176664 1 2026-03-10T06:22:17.438 INFO:tasks.workunit.client.1.vm06.stdout:1/385: chown d9/d1b/d20/c27 37682 1 2026-03-10T06:22:17.443 INFO:tasks.workunit.client.1.vm06.stdout:1/386: dread d9/d1b/f51 [4194304,4194304] 0 2026-03-10T06:22:17.444 INFO:tasks.workunit.client.1.vm06.stdout:7/408: fsync d19/d3b/f43 0 2026-03-10T06:22:17.445 INFO:tasks.workunit.client.1.vm06.stdout:9/375: mknod d21/d46/d79/d80/c82 0 2026-03-10T06:22:17.446 INFO:tasks.workunit.client.1.vm06.stdout:6/417: mknod d6/dd/d25/c8b 0 2026-03-10T06:22:17.447 INFO:tasks.workunit.client.1.vm06.stdout:7/409: stat d19/l1e 0 2026-03-10T06:22:17.448 INFO:tasks.workunit.client.1.vm06.stdout:9/376: creat d21/d27/d3a/f83 x:0 0 0 2026-03-10T06:22:17.448 INFO:tasks.workunit.client.1.vm06.stdout:6/418: creat d6/dd/d25/d33/d4d/f8c x:0 0 0 2026-03-10T06:22:17.449 INFO:tasks.workunit.client.1.vm06.stdout:6/419: write d6/dd/d25/d4e/f5f [3262256,8161] 0 2026-03-10T06:22:17.450 INFO:tasks.workunit.client.1.vm06.stdout:6/420: read d6/d7/f1a [3987971,24016] 0 2026-03-10T06:22:17.450 INFO:tasks.workunit.client.1.vm06.stdout:5/288: creat 
d8/d9/d1e/f62 x:0 0 0 2026-03-10T06:22:17.451 INFO:tasks.workunit.client.1.vm06.stdout:8/332: link d1/f18 d1/df/d11/f74 0 2026-03-10T06:22:17.452 INFO:tasks.workunit.client.1.vm06.stdout:1/387: mkdir d9/d62 0 2026-03-10T06:22:17.453 INFO:tasks.workunit.client.1.vm06.stdout:9/377: mknod d21/d32/c84 0 2026-03-10T06:22:17.455 INFO:tasks.workunit.client.1.vm06.stdout:6/421: unlink d6/d7/d37/f7d 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:1/388: mkdir d9/d35/d46/d38/d63 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:9/378: symlink d21/d27/d3a/l85 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:8/333: dread - d1/df/d20/f51 zero size 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:2/338: dread da/f11 [0,4194304] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:7/410: mkdir d19/d3b/d41/d42/d62/d80/d82 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:5/289: creat d8/d20/d22/d39/f63 x:0 0 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:9/379: rmdir d21/d32/d4d/d51 39 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:5/290: write d8/db/d54/d55/f61 [1231360,60855] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:9/380: write d21/d27/f4b [485412,23095] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:2/339: mknod da/d13/d1c/d1d/d44/d53/d61/c66 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:8/334: creat d1/f75 x:0 0 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:7/411: dwrite d19/f30 [0,4194304] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:8/335: write d1/d7/f4f [977529,34849] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:5/291: write d8/d9/f14 [2603105,14693] 0 2026-03-10T06:22:17.468 INFO:tasks.workunit.client.1.vm06.stdout:9/381: rmdir d21/d46 39 2026-03-10T06:22:17.469 
INFO:tasks.workunit.client.1.vm06.stdout:5/292: fsync d8/d20/d22/d39/f41 0 2026-03-10T06:22:17.482 INFO:tasks.workunit.client.1.vm06.stdout:9/382: dwrite d21/d32/d6e/f2f [0,4194304] 0 2026-03-10T06:22:17.485 INFO:tasks.workunit.client.1.vm06.stdout:2/340: dwrite da/d13/d1a/f3a [0,4194304] 0 2026-03-10T06:22:17.486 INFO:tasks.workunit.client.1.vm06.stdout:2/341: truncate da/d13/d5e/f64 493500 0 2026-03-10T06:22:17.497 INFO:tasks.workunit.client.1.vm06.stdout:2/342: dread da/f11 [0,4194304] 0 2026-03-10T06:22:17.497 INFO:tasks.workunit.client.1.vm06.stdout:8/336: mknod d1/d2c/d5b/c76 0 2026-03-10T06:22:17.505 INFO:tasks.workunit.client.1.vm06.stdout:7/412: mkdir d19/d3b/d41/d42/d52/d83 0 2026-03-10T06:22:17.506 INFO:tasks.workunit.client.1.vm06.stdout:2/343: creat da/d13/d1c/d1d/d44/d53/f67 x:0 0 0 2026-03-10T06:22:17.507 INFO:tasks.workunit.client.1.vm06.stdout:5/293: link d8/d9/d1e/f17 d8/d20/d46/f64 0 2026-03-10T06:22:17.510 INFO:tasks.workunit.client.1.vm06.stdout:1/389: dread d9/f1a [0,4194304] 0 2026-03-10T06:22:17.511 INFO:tasks.workunit.client.1.vm06.stdout:8/337: truncate d1/df/d11/f1d 1513282 0 2026-03-10T06:22:17.515 INFO:tasks.workunit.client.1.vm06.stdout:2/344: dwrite da/d13/d1a/d39/d35/f62 [0,4194304] 0 2026-03-10T06:22:17.520 INFO:tasks.workunit.client.1.vm06.stdout:2/345: dwrite da/d13/f5b [0,4194304] 0 2026-03-10T06:22:17.527 INFO:tasks.workunit.client.1.vm06.stdout:6/422: fdatasync d6/dd/d25/d4e/f5f 0 2026-03-10T06:22:17.527 INFO:tasks.workunit.client.1.vm06.stdout:5/294: unlink d8/d20/f4a 0 2026-03-10T06:22:17.527 INFO:tasks.workunit.client.1.vm06.stdout:5/295: readlink l3 0 2026-03-10T06:22:17.527 INFO:tasks.workunit.client.1.vm06.stdout:4/299: dwrite fa [0,4194304] 0 2026-03-10T06:22:17.534 INFO:tasks.workunit.client.1.vm06.stdout:2/346: mkdir da/d13/d1c/d1d/d44/d53/d61/d68 0 2026-03-10T06:22:17.535 INFO:tasks.workunit.client.1.vm06.stdout:6/423: write d6/dd/d25/d33/f5d [619455,121938] 0 2026-03-10T06:22:17.535 
INFO:tasks.workunit.client.1.vm06.stdout:7/413: rename d19/l21 to d19/d3b/d41/d42/d52/l84 0 2026-03-10T06:22:17.536 INFO:tasks.workunit.client.1.vm06.stdout:1/390: mknod d9/d1b/d20/d44/c64 0 2026-03-10T06:22:17.538 INFO:tasks.workunit.client.1.vm06.stdout:7/414: creat d19/d3b/d41/d4c/f85 x:0 0 0 2026-03-10T06:22:17.539 INFO:tasks.workunit.client.1.vm06.stdout:5/296: link d8/c59 d8/db/d57/c65 0 2026-03-10T06:22:17.540 INFO:tasks.workunit.client.1.vm06.stdout:7/415: read d19/f30 [1033294,81931] 0 2026-03-10T06:22:17.541 INFO:tasks.workunit.client.1.vm06.stdout:8/338: getdents d1/d3b/d5c 0 2026-03-10T06:22:17.542 INFO:tasks.workunit.client.1.vm06.stdout:8/339: fdatasync d1/df/d20/d21/d5e/f65 0 2026-03-10T06:22:17.542 INFO:tasks.workunit.client.1.vm06.stdout:7/416: write d19/d3b/d5b/f7f [720781,32898] 0 2026-03-10T06:22:17.543 INFO:tasks.workunit.client.1.vm06.stdout:7/417: chown d19/c38 2071091 1 2026-03-10T06:22:17.544 INFO:tasks.workunit.client.1.vm06.stdout:8/340: creat d1/df/f77 x:0 0 0 2026-03-10T06:22:17.546 INFO:tasks.workunit.client.1.vm06.stdout:5/297: dwrite d8/db/d57/f5f [0,4194304] 0 2026-03-10T06:22:17.549 INFO:tasks.workunit.client.1.vm06.stdout:2/347: dread da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:17.550 INFO:tasks.workunit.client.1.vm06.stdout:5/298: write d8/ff [2144675,101885] 0 2026-03-10T06:22:17.552 INFO:tasks.workunit.client.1.vm06.stdout:7/418: dwrite d19/f33 [0,4194304] 0 2026-03-10T06:22:17.553 INFO:tasks.workunit.client.1.vm06.stdout:5/299: write d8/db/f45 [552022,100280] 0 2026-03-10T06:22:17.554 INFO:tasks.workunit.client.1.vm06.stdout:8/341: dwrite d1/df/d11/f4a [0,4194304] 0 2026-03-10T06:22:17.558 INFO:tasks.workunit.client.1.vm06.stdout:1/391: sync 2026-03-10T06:22:17.564 INFO:tasks.workunit.client.1.vm06.stdout:9/383: fsync d21/d27/f4b 0 2026-03-10T06:22:17.564 INFO:tasks.workunit.client.1.vm06.stdout:7/419: read d19/f1d [2427938,109488] 0 2026-03-10T06:22:17.565 INFO:tasks.workunit.client.1.vm06.stdout:8/342: rename 
d1/df/d20/d21/l27 to d1/df/d11/l78 0 2026-03-10T06:22:17.565 INFO:tasks.workunit.client.1.vm06.stdout:7/420: chown d19/c75 3698 1 2026-03-10T06:22:17.565 INFO:tasks.workunit.client.1.vm06.stdout:8/343: truncate d1/d7/fd 1248810 0 2026-03-10T06:22:17.568 INFO:tasks.workunit.client.1.vm06.stdout:2/348: mknod da/d13/d1c/d1d/d44/d53/d61/d68/c69 0 2026-03-10T06:22:17.568 INFO:tasks.workunit.client.1.vm06.stdout:9/384: fdatasync fd 0 2026-03-10T06:22:17.569 INFO:tasks.workunit.client.1.vm06.stdout:1/392: symlink d9/d35/d46/d38/d63/l65 0 2026-03-10T06:22:17.575 INFO:tasks.workunit.client.1.vm06.stdout:9/385: creat d21/d46/d79/d80/f86 x:0 0 0 2026-03-10T06:22:17.576 INFO:tasks.workunit.client.1.vm06.stdout:9/386: fdatasync d21/d32/d4d/f64 0 2026-03-10T06:22:17.579 INFO:tasks.workunit.client.1.vm06.stdout:1/393: mknod d9/df/c66 0 2026-03-10T06:22:17.579 INFO:tasks.workunit.client.1.vm06.stdout:4/300: dread dd/fe [0,4194304] 0 2026-03-10T06:22:17.585 INFO:tasks.workunit.client.1.vm06.stdout:4/301: truncate f0 711950 0 2026-03-10T06:22:17.585 INFO:tasks.workunit.client.1.vm06.stdout:2/349: link da/d13/d1c/d1d/l5d da/d13/d1c/d1d/d44/l6a 0 2026-03-10T06:22:17.585 INFO:tasks.workunit.client.1.vm06.stdout:4/302: chown dd/f29 363094744 1 2026-03-10T06:22:17.586 INFO:tasks.workunit.client.1.vm06.stdout:2/350: write f8 [3473989,2033] 0 2026-03-10T06:22:17.587 INFO:tasks.workunit.client.1.vm06.stdout:4/303: dread - dd/d24/d2d/d2f/d39/f4a zero size 2026-03-10T06:22:17.593 INFO:tasks.workunit.client.1.vm06.stdout:2/351: unlink da/fe 0 2026-03-10T06:22:17.597 INFO:tasks.workunit.client.1.vm06.stdout:4/304: rename dd/d24/d2d/d2f/d34/f49 to dd/d33/f56 0 2026-03-10T06:22:17.598 INFO:tasks.workunit.client.1.vm06.stdout:3/355: write d6/dc/d13/d35/f4e [712322,100370] 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:6/424: rmdir d6/dd/d25/d33/d4d 39 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:2/352: rename da/d13/d1c/d1d/d44/d48/f59 to 
da/d13/d1c/d1d/d44/d53/d61/d68/f6b 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:4/305: mknod dd/d41/c57 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:0/373: truncate d0/dd/d14/d1d/f53 421751 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:2/353: write da/d13/d1c/d1d/d44/d53/f67 [222684,65364] 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:0/374: stat d0/dd/d2d/d35/d74/l7a 0 2026-03-10T06:22:17.610 INFO:tasks.workunit.client.1.vm06.stdout:4/306: creat dd/d33/f58 x:0 0 0 2026-03-10T06:22:17.616 INFO:tasks.workunit.client.1.vm06.stdout:3/356: creat d6/dc/f7e x:0 0 0 2026-03-10T06:22:17.617 INFO:tasks.workunit.client.1.vm06.stdout:0/375: mknod d0/dd/d14/d18/c84 0 2026-03-10T06:22:17.617 INFO:tasks.workunit.client.1.vm06.stdout:6/425: rmdir d6/d7/d5e 0 2026-03-10T06:22:17.620 INFO:tasks.workunit.client.1.vm06.stdout:4/307: creat dd/d33/d36/f59 x:0 0 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:0/376: fsync d0/dd/d14/d1d/d5d/f5f 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:6/426: symlink d6/df/d70/l8d 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:3/357: dwrite d6/d21/d38/f3d [0,4194304] 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:4/308: creat dd/d24/d2d/f5a x:0 0 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:2/354: dwrite da/d13/d1a/f21 [0,4194304] 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:3/358: unlink d6/d8/f2d 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:0/377: write d0/dd/f48 [73722,44883] 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:0/378: stat d0/dd/d14/f31 0 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:3/359: chown d6/dc/d13/d35/f5a 1711 1 2026-03-10T06:22:17.634 INFO:tasks.workunit.client.1.vm06.stdout:2/355: fsync da/d13/d1a/f27 0 2026-03-10T06:22:17.634 
INFO:tasks.workunit.client.1.vm06.stdout:2/356: readlink da/d13/d1c/d1d/l25 0 2026-03-10T06:22:17.635 INFO:tasks.workunit.client.1.vm06.stdout:0/379: fdatasync d0/dd/d14/d18/f22 0 2026-03-10T06:22:17.639 INFO:tasks.workunit.client.1.vm06.stdout:3/360: read - d6/d21/d38/d39/f4c zero size 2026-03-10T06:22:17.640 INFO:tasks.workunit.client.1.vm06.stdout:3/361: write d6/dc/d13/f5d [1494538,77974] 0 2026-03-10T06:22:17.640 INFO:tasks.workunit.client.1.vm06.stdout:3/362: chown d6/dc/d13/d35/f5a 220151 1 2026-03-10T06:22:17.643 INFO:tasks.workunit.client.1.vm06.stdout:3/363: dread - d6/d21/d38/d39/f4c zero size 2026-03-10T06:22:17.650 INFO:tasks.workunit.client.1.vm06.stdout:3/364: rmdir d6/dc/d13/d51 39 2026-03-10T06:22:17.651 INFO:tasks.workunit.client.1.vm06.stdout:6/427: link d6/dd/d35/c71 d6/d7/c8e 0 2026-03-10T06:22:17.651 INFO:tasks.workunit.client.1.vm06.stdout:4/309: dread dd/d24/d2d/f28 [0,4194304] 0 2026-03-10T06:22:17.652 INFO:tasks.workunit.client.1.vm06.stdout:2/357: rename da/d13/d1c/d1d/d44/l6a to da/d13/d5e/l6c 0 2026-03-10T06:22:17.653 INFO:tasks.workunit.client.1.vm06.stdout:0/380: getdents d0 0 2026-03-10T06:22:17.656 INFO:tasks.workunit.client.1.vm06.stdout:4/310: truncate dd/f43 883924 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:0/381: mkdir d0/dd/d14/d18/d85 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:6/428: dwrite d6/d7/d37/d43/f59 [4194304,4194304] 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:4/311: mkdir dd/d24/d2d/d2f/d34/d40/d5b 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:4/312: chown dd/d24/d2d/d2f/d39/c51 48040106 1 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:6/429: mknod d6/d7/c8f 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:6/430: creat d6/df/d70/f90 x:0 0 0 2026-03-10T06:22:17.666 INFO:tasks.workunit.client.1.vm06.stdout:6/431: write d6/dd/d25/d4e/f8a [77477,81076] 0 2026-03-10T06:22:17.669 
INFO:tasks.workunit.client.1.vm06.stdout:3/365: read d6/f5c [200623,76259] 0 2026-03-10T06:22:17.672 INFO:tasks.workunit.client.1.vm06.stdout:4/313: creat dd/f5c x:0 0 0 2026-03-10T06:22:17.674 INFO:tasks.workunit.client.1.vm06.stdout:4/314: mkdir dd/d24/d5d 0 2026-03-10T06:22:17.674 INFO:tasks.workunit.client.1.vm06.stdout:4/315: read dd/fe [2211003,4498] 0 2026-03-10T06:22:17.687 INFO:tasks.workunit.client.1.vm06.stdout:4/316: mkdir dd/d24/d5e 0 2026-03-10T06:22:17.688 INFO:tasks.workunit.client.1.vm06.stdout:4/317: creat dd/d18/f5f x:0 0 0 2026-03-10T06:22:17.688 INFO:tasks.workunit.client.1.vm06.stdout:4/318: fdatasync dd/d33/f56 0 2026-03-10T06:22:17.688 INFO:tasks.workunit.client.1.vm06.stdout:4/319: chown dd/d24/d2d/d2f/d39/l44 220 1 2026-03-10T06:22:17.688 INFO:tasks.workunit.client.1.vm06.stdout:4/320: fsync dd/f12 0 2026-03-10T06:22:17.691 INFO:tasks.workunit.client.1.vm06.stdout:6/432: dread d6/dd/f5b [0,4194304] 0 2026-03-10T06:22:17.692 INFO:tasks.workunit.client.1.vm06.stdout:6/433: stat d6/df/c46 0 2026-03-10T06:22:17.693 INFO:tasks.workunit.client.1.vm06.stdout:6/434: symlink d6/dd/d25/d33/l91 0 2026-03-10T06:22:17.702 INFO:tasks.workunit.client.1.vm06.stdout:6/435: link d6/df/l10 d6/df/d40/l92 0 2026-03-10T06:22:17.707 INFO:tasks.workunit.client.1.vm06.stdout:6/436: rename d6/dd/f18 to d6/dd/d2b/f93 0 2026-03-10T06:22:17.708 INFO:tasks.workunit.client.1.vm06.stdout:0/382: dread d0/f46 [0,4194304] 0 2026-03-10T06:22:17.710 INFO:tasks.workunit.client.1.vm06.stdout:0/383: write d0/dd/d14/f70 [369308,18135] 0 2026-03-10T06:22:17.715 INFO:tasks.workunit.client.1.vm06.stdout:0/384: fsync d0/dd/f5b 0 2026-03-10T06:22:17.718 INFO:tasks.workunit.client.1.vm06.stdout:0/385: creat d0/d3c/d42/d5e/f86 x:0 0 0 2026-03-10T06:22:17.720 INFO:tasks.workunit.client.1.vm06.stdout:0/386: symlink d0/dd/d2d/d47/d4d/l87 0 2026-03-10T06:22:17.721 INFO:tasks.workunit.client.1.vm06.stdout:0/387: fsync d0/dd/d14/d18/f30 0 2026-03-10T06:22:17.724 
INFO:tasks.workunit.client.1.vm06.stdout:0/388: getdents d0/d3c 0 2026-03-10T06:22:17.729 INFO:tasks.workunit.client.1.vm06.stdout:3/366: sync 2026-03-10T06:22:17.729 INFO:tasks.workunit.client.1.vm06.stdout:4/321: sync 2026-03-10T06:22:17.729 INFO:tasks.workunit.client.1.vm06.stdout:0/389: rmdir d0/dd/d2d/d35 39 2026-03-10T06:22:17.730 INFO:tasks.workunit.client.1.vm06.stdout:4/322: write dd/d24/f3d [6478331,106689] 0 2026-03-10T06:22:17.733 INFO:tasks.workunit.client.1.vm06.stdout:6/437: sync 2026-03-10T06:22:17.737 INFO:tasks.workunit.client.1.vm06.stdout:3/367: mkdir d6/d8/d7f 0 2026-03-10T06:22:17.742 INFO:tasks.workunit.client.1.vm06.stdout:4/323: unlink dd/d24/d2d/d2f/d39/c51 0 2026-03-10T06:22:17.742 INFO:tasks.workunit.client.1.vm06.stdout:3/368: truncate d6/dc/f7e 765244 0 2026-03-10T06:22:17.742 INFO:tasks.workunit.client.1.vm06.stdout:6/438: mknod d6/dd/d2b/c94 0 2026-03-10T06:22:17.749 INFO:tasks.workunit.client.1.vm06.stdout:6/439: mkdir d6/d79/d95 0 2026-03-10T06:22:17.750 INFO:tasks.workunit.client.1.vm06.stdout:7/421: getdents d19 0 2026-03-10T06:22:17.756 INFO:tasks.workunit.client.1.vm06.stdout:5/300: dwrite d8/d20/d22/d39/f52 [0,4194304] 0 2026-03-10T06:22:17.757 INFO:tasks.workunit.client.1.vm06.stdout:5/301: readlink d8/l1b 0 2026-03-10T06:22:17.760 INFO:tasks.workunit.client.1.vm06.stdout:8/344: truncate d1/f4 6850353 0 2026-03-10T06:22:17.767 INFO:tasks.workunit.client.1.vm06.stdout:8/345: dread d1/df/f6b [0,4194304] 0 2026-03-10T06:22:17.773 INFO:tasks.workunit.client.1.vm06.stdout:3/369: unlink d6/dc/d13/c47 0 2026-03-10T06:22:17.773 INFO:tasks.workunit.client.1.vm06.stdout:3/370: chown d6/dc/d13/d73 1 1 2026-03-10T06:22:17.774 INFO:tasks.workunit.client.1.vm06.stdout:3/371: chown d6/dc/f1d 1062153564 1 2026-03-10T06:22:17.776 INFO:tasks.workunit.client.1.vm06.stdout:3/372: chown d6/d21/d38/f56 76534 1 2026-03-10T06:22:17.782 INFO:tasks.workunit.client.1.vm06.stdout:9/387: write d21/d32/d6e/f2f [4691456,48326] 0 2026-03-10T06:22:17.782 
INFO:tasks.workunit.client.1.vm06.stdout:9/388: chown d21/d46/d79/d7f 857 1 2026-03-10T06:22:17.782 INFO:tasks.workunit.client.1.vm06.stdout:1/394: write d9/f1f [5139582,20316] 0 2026-03-10T06:22:17.782 INFO:tasks.workunit.client.1.vm06.stdout:9/389: write d21/d27/f4b [943189,20039] 0 2026-03-10T06:22:17.782 INFO:tasks.workunit.client.1.vm06.stdout:9/390: chown fd 4 1 2026-03-10T06:22:17.784 INFO:tasks.workunit.client.1.vm06.stdout:9/391: write f9 [3878533,16931] 0 2026-03-10T06:22:17.785 INFO:tasks.workunit.client.1.vm06.stdout:6/440: creat d6/dd/f96 x:0 0 0 2026-03-10T06:22:17.786 INFO:tasks.workunit.client.1.vm06.stdout:9/392: dread - d21/d32/f73 zero size 2026-03-10T06:22:17.787 INFO:tasks.workunit.client.1.vm06.stdout:7/422: creat d19/d3b/d41/d42/d62/f86 x:0 0 0 2026-03-10T06:22:17.788 INFO:tasks.workunit.client.1.vm06.stdout:6/441: write d6/dd/d25/d4e/f5f [3439754,45456] 0 2026-03-10T06:22:17.789 INFO:tasks.workunit.client.1.vm06.stdout:9/393: read d21/d46/f4e [6707911,10438] 0 2026-03-10T06:22:17.795 INFO:tasks.workunit.client.1.vm06.stdout:0/390: dread d0/dd/f44 [0,4194304] 0 2026-03-10T06:22:17.797 INFO:tasks.workunit.client.1.vm06.stdout:8/346: mkdir d1/df/d20/d21/d5e/d79 0 2026-03-10T06:22:17.799 INFO:tasks.workunit.client.1.vm06.stdout:3/373: creat d6/dc/d13/d35/f80 x:0 0 0 2026-03-10T06:22:17.800 INFO:tasks.workunit.client.1.vm06.stdout:3/374: dread - d6/d8/f52 zero size 2026-03-10T06:22:17.801 INFO:tasks.workunit.client.1.vm06.stdout:5/302: dwrite d8/d20/d22/f31 [0,4194304] 0 2026-03-10T06:22:17.812 INFO:tasks.workunit.client.1.vm06.stdout:6/442: rename d6/d7/f44 to d6/dd/d35/f97 0 2026-03-10T06:22:17.814 INFO:tasks.workunit.client.1.vm06.stdout:0/391: dread d0/dd/d1b/f3f [0,4194304] 0 2026-03-10T06:22:17.820 INFO:tasks.workunit.client.1.vm06.stdout:9/394: dread d21/f49 [0,4194304] 0 2026-03-10T06:22:17.824 INFO:tasks.workunit.client.1.vm06.stdout:1/395: symlink d9/d1b/l67 0 2026-03-10T06:22:17.828 INFO:tasks.workunit.client.1.vm06.stdout:1/396: chown 
d9/d35/c3c 1596 1 2026-03-10T06:22:17.828 INFO:tasks.workunit.client.1.vm06.stdout:8/347: truncate d1/df/d11/f12 1884817 0 2026-03-10T06:22:17.832 INFO:tasks.workunit.client.1.vm06.stdout:3/375: mknod d6/dc/c81 0 2026-03-10T06:22:17.833 INFO:tasks.workunit.client.1.vm06.stdout:6/443: symlink d6/dd/d25/d33/d4d/l98 0 2026-03-10T06:22:17.834 INFO:tasks.workunit.client.1.vm06.stdout:1/397: write d9/f34 [2449,55039] 0 2026-03-10T06:22:17.838 INFO:tasks.workunit.client.1.vm06.stdout:4/324: truncate fc 3560980 0 2026-03-10T06:22:17.838 INFO:tasks.workunit.client.1.vm06.stdout:5/303: rename d8/db/fc to d8/d9/d1e/f66 0 2026-03-10T06:22:17.839 INFO:tasks.workunit.client.1.vm06.stdout:5/304: fdatasync d8/db/d57/f5f 0 2026-03-10T06:22:17.843 INFO:tasks.workunit.client.1.vm06.stdout:6/444: sync 2026-03-10T06:22:17.843 INFO:tasks.workunit.client.1.vm06.stdout:0/392: truncate d0/f9 3655808 0 2026-03-10T06:22:17.843 INFO:tasks.workunit.client.1.vm06.stdout:8/348: getdents d1/d3b/d5c 0 2026-03-10T06:22:17.844 INFO:tasks.workunit.client.1.vm06.stdout:1/398: write d9/df/f3d [603712,45607] 0 2026-03-10T06:22:17.844 INFO:tasks.workunit.client.1.vm06.stdout:7/423: dread d19/d3b/d41/d4c/f4e [0,4194304] 0 2026-03-10T06:22:17.847 INFO:tasks.workunit.client.1.vm06.stdout:0/393: write d0/dd/d14/d18/f30 [2024721,121374] 0 2026-03-10T06:22:17.848 INFO:tasks.workunit.client.1.vm06.stdout:7/424: stat d19/d3b/d41/d42/d62/f86 0 2026-03-10T06:22:17.848 INFO:tasks.workunit.client.1.vm06.stdout:0/394: stat d0/dd/d1c/c21 0 2026-03-10T06:22:17.849 INFO:tasks.workunit.client.1.vm06.stdout:1/399: dread - d9/d35/f53 zero size 2026-03-10T06:22:17.851 INFO:tasks.workunit.client.1.vm06.stdout:4/325: creat dd/d41/f60 x:0 0 0 2026-03-10T06:22:17.854 INFO:tasks.workunit.client.1.vm06.stdout:6/445: dwrite d6/dd/d25/d4e/f5f [0,4194304] 0 2026-03-10T06:22:17.856 INFO:tasks.workunit.client.1.vm06.stdout:9/395: dread ff [0,4194304] 0 2026-03-10T06:22:17.858 INFO:tasks.workunit.client.1.vm06.stdout:6/446: dread - 
d6/dd/d25/d2c/f85 zero size 2026-03-10T06:22:17.859 INFO:tasks.workunit.client.1.vm06.stdout:6/447: fdatasync d6/dd/f96 0 2026-03-10T06:22:17.861 INFO:tasks.workunit.client.1.vm06.stdout:3/376: dread d6/dc/f1d [0,4194304] 0 2026-03-10T06:22:17.862 INFO:tasks.workunit.client.1.vm06.stdout:6/448: sync 2026-03-10T06:22:17.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: pgmap v7: 65 pgs: 65 active+clean; 794 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 20 MiB/s rd, 100 MiB/s wr, 192 op/s 2026-03-10T06:22:17.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:22:17.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:22:17.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:22:17.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:22:17.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:22:17.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:22:17.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:17.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:17 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:17.870 INFO:tasks.workunit.client.1.vm06.stdout:5/305: rename d8/d20 to d8/db/d54/d67 0 
2026-03-10T06:22:17.870 INFO:tasks.workunit.client.1.vm06.stdout:9/396: dread fe [0,4194304] 0 2026-03-10T06:22:17.871 INFO:tasks.workunit.client.1.vm06.stdout:2/358: link da/d13/d5e/l6c da/l6d 0 2026-03-10T06:22:17.882 INFO:tasks.workunit.client.1.vm06.stdout:7/425: write d19/d3b/d41/d4c/f4e [1681934,85859] 0 2026-03-10T06:22:17.882 INFO:tasks.workunit.client.1.vm06.stdout:4/326: dread dd/f11 [0,4194304] 0 2026-03-10T06:22:17.888 INFO:tasks.workunit.client.1.vm06.stdout:0/395: rename d0/dd/d2d to d0/d3c/d42/d88 0 2026-03-10T06:22:17.888 INFO:tasks.workunit.client.1.vm06.stdout:9/397: unlink d21/d32/f7a 0 2026-03-10T06:22:17.891 INFO:tasks.workunit.client.1.vm06.stdout:7/426: mknod d19/d3b/d5b/c87 0 2026-03-10T06:22:17.893 INFO:tasks.workunit.client.1.vm06.stdout:7/427: stat d19/f30 0 2026-03-10T06:22:17.895 INFO:tasks.workunit.client.1.vm06.stdout:5/306: rmdir d8/d9/d1e 39 2026-03-10T06:22:17.896 INFO:tasks.workunit.client.1.vm06.stdout:5/307: stat d8/f49 0 2026-03-10T06:22:17.900 INFO:tasks.workunit.client.1.vm06.stdout:4/327: creat dd/d24/d2d/d2f/d39/f61 x:0 0 0 2026-03-10T06:22:17.900 INFO:tasks.workunit.client.1.vm06.stdout:5/308: readlink d8/l2b 0 2026-03-10T06:22:17.903 INFO:tasks.workunit.client.1.vm06.stdout:5/309: stat d8/db/d54/d67/c5b 0 2026-03-10T06:22:17.905 INFO:tasks.workunit.client.1.vm06.stdout:2/359: chown da/d13/d1c/d1d/f26 836398 1 2026-03-10T06:22:17.906 INFO:tasks.workunit.client.1.vm06.stdout:7/428: dwrite d19/d3b/d41/f65 [0,4194304] 0 2026-03-10T06:22:17.908 INFO:tasks.workunit.client.1.vm06.stdout:7/429: dread - d19/d3b/d41/d4c/f85 zero size 2026-03-10T06:22:17.911 INFO:tasks.workunit.client.1.vm06.stdout:0/396: dwrite d0/f46 [0,4194304] 0 2026-03-10T06:22:17.912 INFO:tasks.workunit.client.1.vm06.stdout:7/430: dread d19/f33 [0,4194304] 0 2026-03-10T06:22:17.915 INFO:tasks.workunit.client.1.vm06.stdout:4/328: creat dd/d24/d2d/d2f/d39/f62 x:0 0 0 2026-03-10T06:22:17.916 INFO:tasks.workunit.client.1.vm06.stdout:9/398: creat 
d21/d32/d4d/d51/f87 x:0 0 0 2026-03-10T06:22:17.916 INFO:tasks.workunit.client.1.vm06.stdout:2/360: chown da/d13/d1c/d1d/d44 10469 1 2026-03-10T06:22:17.916 INFO:tasks.workunit.client.1.vm06.stdout:4/329: chown dd/d24/d2d/d2f 130 1 2026-03-10T06:22:17.917 INFO:tasks.workunit.client.1.vm06.stdout:4/330: truncate dd/f29 4466138 0 2026-03-10T06:22:17.920 INFO:tasks.workunit.client.1.vm06.stdout:7/431: mknod d19/d3b/d41/d4c/c88 0 2026-03-10T06:22:17.920 INFO:tasks.workunit.client.1.vm06.stdout:5/310: mkdir d8/db/d54/d67/d46/d68 0 2026-03-10T06:22:17.922 INFO:tasks.workunit.client.1.vm06.stdout:3/377: truncate d6/dc/d13/d73/f7d 3828034 0 2026-03-10T06:22:17.922 INFO:tasks.workunit.client.1.vm06.stdout:5/311: stat d8/db/d57/f5f 0 2026-03-10T06:22:17.923 INFO:tasks.workunit.client.1.vm06.stdout:5/312: chown f7 102 1 2026-03-10T06:22:17.923 INFO:tasks.workunit.client.1.vm06.stdout:7/432: write d19/d3b/d41/f65 [465260,68250] 0 2026-03-10T06:22:17.924 INFO:tasks.workunit.client.1.vm06.stdout:6/449: getdents d6/df/d40 0 2026-03-10T06:22:17.926 INFO:tasks.workunit.client.1.vm06.stdout:3/378: write d6/d8/f22 [2471471,28399] 0 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: pgmap v7: 65 pgs: 65 active+clean; 794 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 20 MiB/s rd, 100 MiB/s wr, 192 op/s 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating 
vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:17 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:17.928 INFO:tasks.workunit.client.1.vm06.stdout:6/450: chown d6/dd/d35/l27 3 1 2026-03-10T06:22:17.930 INFO:tasks.workunit.client.1.vm06.stdout:3/379: write d6/dc/d13/d35/f3b [62122,80027] 0 2026-03-10T06:22:17.932 INFO:tasks.workunit.client.1.vm06.stdout:3/380: stat d6/dc/f3f 0 2026-03-10T06:22:17.934 INFO:tasks.workunit.client.1.vm06.stdout:3/381: write d6/dc/d13/d35/d6b/f57 [804907,74802] 0 2026-03-10T06:22:17.946 INFO:tasks.workunit.client.1.vm06.stdout:4/331: dwrite dd/d33/f53 [0,4194304] 0 2026-03-10T06:22:17.947 INFO:tasks.workunit.client.1.vm06.stdout:3/382: sync 2026-03-10T06:22:17.953 INFO:tasks.workunit.client.1.vm06.stdout:2/361: dread da/f19 [0,4194304] 0 2026-03-10T06:22:17.955 INFO:tasks.workunit.client.1.vm06.stdout:9/399: symlink d21/l88 0 2026-03-10T06:22:17.955 INFO:tasks.workunit.client.1.vm06.stdout:7/433: mknod d19/d3b/d41/d42/d62/c89 0 2026-03-10T06:22:17.956 INFO:tasks.workunit.client.1.vm06.stdout:6/451: mkdir d6/df/d40/d99 0 2026-03-10T06:22:17.956 INFO:tasks.workunit.client.1.vm06.stdout:9/400: write f20 [3046102,6043] 0 2026-03-10T06:22:17.957 INFO:tasks.workunit.client.1.vm06.stdout:9/401: fsync d21/d46/d79/d80/f86 0 2026-03-10T06:22:17.958 INFO:tasks.workunit.client.1.vm06.stdout:4/332: 
mknod dd/d24/d2d/d2f/d34/c63 0 2026-03-10T06:22:17.958 INFO:tasks.workunit.client.1.vm06.stdout:4/333: readlink dd/d18/l1c 0 2026-03-10T06:22:17.959 INFO:tasks.workunit.client.1.vm06.stdout:4/334: read - dd/d18/f5f zero size 2026-03-10T06:22:17.963 INFO:tasks.workunit.client.1.vm06.stdout:5/313: unlink d8/db/d54/d67/d22/l2a 0 2026-03-10T06:22:17.965 INFO:tasks.workunit.client.1.vm06.stdout:5/314: dwrite d8/ff [0,4194304] 0 2026-03-10T06:22:17.969 INFO:tasks.workunit.client.1.vm06.stdout:5/315: dwrite d8/db/d54/d55/f60 [0,4194304] 0 2026-03-10T06:22:17.973 INFO:tasks.workunit.client.1.vm06.stdout:7/434: rename d19/c56 to d19/d3b/d5b/c8a 0 2026-03-10T06:22:17.975 INFO:tasks.workunit.client.1.vm06.stdout:0/397: getdents d0/d3c/d42/d88/d47/d4d 0 2026-03-10T06:22:17.977 INFO:tasks.workunit.client.1.vm06.stdout:6/452: mknod d6/dd/d25/d33/d5a/d78/c9a 0 2026-03-10T06:22:17.982 INFO:tasks.workunit.client.1.vm06.stdout:3/383: link d6/d8/f22 d6/dc/d41/f82 0 2026-03-10T06:22:17.988 INFO:tasks.workunit.client.1.vm06.stdout:1/400: truncate d9/d1b/d20/f25 2068391 0 2026-03-10T06:22:17.992 INFO:tasks.workunit.client.1.vm06.stdout:2/362: dread da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/398: creat d0/dd/d1c/f89 x:0 0 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/399: write d0/d3c/d42/f54 [807310,73249] 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:9/402: symlink d21/d32/d4d/l89 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/400: chown d0/d3c/d42/d88/f80 1634657882 1 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/401: chown d0/dd/d1c 227295 1 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/402: read - d0/dd/d14/d1d/d5d/f5f zero size 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:6/453: creat d6/dd/d25/d2c/f9b x:0 0 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/403: chown d0/c2e 42 1 
2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:0/404: chown d0/d3c/d42/lc 1016 1 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:3/384: rename d6/dc/d13/d35/l71 to d6/d21/d38/d39/l83 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:6/454: dwrite d6/d7/d37/f65 [0,4194304] 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:3/385: readlink d6/dc/d13/l19 0 2026-03-10T06:22:18.007 INFO:tasks.workunit.client.1.vm06.stdout:9/403: rmdir d21 39 2026-03-10T06:22:18.013 INFO:tasks.workunit.client.1.vm06.stdout:8/349: write d1/df/d11/f1d [2469069,6575] 0 2026-03-10T06:22:18.015 INFO:tasks.workunit.client.1.vm06.stdout:4/335: link dd/c10 dd/d24/d5d/c64 0 2026-03-10T06:22:18.017 INFO:tasks.workunit.client.1.vm06.stdout:6/455: link d6/d7/d37/d43/f59 d6/dd/d25/d2c/f9c 0 2026-03-10T06:22:18.019 INFO:tasks.workunit.client.1.vm06.stdout:0/405: creat d0/d3c/d42/d88/f8a x:0 0 0 2026-03-10T06:22:18.021 INFO:tasks.workunit.client.1.vm06.stdout:8/350: creat d1/d3b/d5c/f7a x:0 0 0 2026-03-10T06:22:18.022 INFO:tasks.workunit.client.1.vm06.stdout:8/351: truncate d1/df/d11/f53 456029 0 2026-03-10T06:22:18.023 INFO:tasks.workunit.client.1.vm06.stdout:9/404: symlink d21/d27/d50/l8a 0 2026-03-10T06:22:18.024 INFO:tasks.workunit.client.1.vm06.stdout:9/405: read d21/f49 [7495106,108200] 0 2026-03-10T06:22:18.025 INFO:tasks.workunit.client.1.vm06.stdout:1/401: rename d9/df/f49 to d9/f68 0 2026-03-10T06:22:18.028 INFO:tasks.workunit.client.1.vm06.stdout:3/386: creat d6/f84 x:0 0 0 2026-03-10T06:22:18.029 INFO:tasks.workunit.client.1.vm06.stdout:9/406: dwrite f1b [0,4194304] 0 2026-03-10T06:22:18.029 INFO:tasks.workunit.client.1.vm06.stdout:8/352: symlink d1/df/d58/l7b 0 2026-03-10T06:22:18.036 INFO:tasks.workunit.client.1.vm06.stdout:2/363: rename da/d13/d51 to da/d13/d1c/d43/d6e 0 2026-03-10T06:22:18.039 INFO:tasks.workunit.client.1.vm06.stdout:4/336: fdatasync dd/d24/d2d/f3b 0 2026-03-10T06:22:18.039 
INFO:tasks.workunit.client.1.vm06.stdout:4/337: dread - dd/d41/f60 zero size 2026-03-10T06:22:18.040 INFO:tasks.workunit.client.1.vm06.stdout:4/338: fdatasync fa 0 2026-03-10T06:22:18.043 INFO:tasks.workunit.client.1.vm06.stdout:6/456: symlink d6/l9d 0 2026-03-10T06:22:18.044 INFO:tasks.workunit.client.1.vm06.stdout:5/316: write d8/db/d54/d67/d22/d39/f3d [127061,111334] 0 2026-03-10T06:22:18.047 INFO:tasks.workunit.client.1.vm06.stdout:9/407: creat d21/d32/f8b x:0 0 0 2026-03-10T06:22:18.048 INFO:tasks.workunit.client.1.vm06.stdout:9/408: chown d21/d46/f4e 9059 1 2026-03-10T06:22:18.050 INFO:tasks.workunit.client.1.vm06.stdout:5/317: dwrite d8/db/d54/d67/d22/f31 [0,4194304] 0 2026-03-10T06:22:18.054 INFO:tasks.workunit.client.1.vm06.stdout:5/318: dwrite d8/db/d54/d55/f60 [0,4194304] 0 2026-03-10T06:22:18.057 INFO:tasks.workunit.client.1.vm06.stdout:6/457: rmdir d6/d79 39 2026-03-10T06:22:18.058 INFO:tasks.workunit.client.1.vm06.stdout:2/364: sync 2026-03-10T06:22:18.059 INFO:tasks.workunit.client.1.vm06.stdout:3/387: symlink d6/dc/d13/l85 0 2026-03-10T06:22:18.060 INFO:tasks.workunit.client.1.vm06.stdout:0/406: rename d0/d3c/d42/d88/d47/c63 to d0/dd/d1b/d3d/d50/c8b 0 2026-03-10T06:22:18.067 INFO:tasks.workunit.client.1.vm06.stdout:6/458: dwrite d6/dd/d25/f3f [4194304,4194304] 0 2026-03-10T06:22:18.077 INFO:tasks.workunit.client.1.vm06.stdout:4/339: mknod dd/d24/c65 0 2026-03-10T06:22:18.084 INFO:tasks.workunit.client.1.vm06.stdout:7/435: rmdir d19/d3b 39 2026-03-10T06:22:18.099 INFO:tasks.workunit.client.1.vm06.stdout:3/388: rename d6/d21/d38/f3d to d6/dc/d13/d35/d6b/f86 0 2026-03-10T06:22:18.100 INFO:tasks.workunit.client.1.vm06.stdout:3/389: write d6/d21/f31 [4541752,102498] 0 2026-03-10T06:22:18.107 INFO:tasks.workunit.client.1.vm06.stdout:9/409: mknod d21/d32/d4d/c8c 0 2026-03-10T06:22:18.109 INFO:tasks.workunit.client.1.vm06.stdout:1/402: dwrite d9/d1b/d20/f25 [0,4194304] 0 2026-03-10T06:22:18.111 INFO:tasks.workunit.client.1.vm06.stdout:2/365: truncate 
da/d13/d1a/f21 485345 0 2026-03-10T06:22:18.112 INFO:tasks.workunit.client.1.vm06.stdout:9/410: dwrite f14 [4194304,4194304] 0 2026-03-10T06:22:18.117 INFO:tasks.workunit.client.1.vm06.stdout:0/407: mknod d0/dd/d14/d1d/d73/c8c 0 2026-03-10T06:22:18.118 INFO:tasks.workunit.client.1.vm06.stdout:0/408: readlink d0/d3c/d42/d5e/l6a 0 2026-03-10T06:22:18.123 INFO:tasks.workunit.client.1.vm06.stdout:6/459: symlink d6/dd/d25/l9e 0 2026-03-10T06:22:18.125 INFO:tasks.workunit.client.1.vm06.stdout:3/390: unlink d6/dc/d13/f1e 0 2026-03-10T06:22:18.132 INFO:tasks.workunit.client.1.vm06.stdout:4/340: mkdir dd/d24/d5e/d66 0 2026-03-10T06:22:18.135 INFO:tasks.workunit.client.1.vm06.stdout:9/411: fdatasync ff 0 2026-03-10T06:22:18.141 INFO:tasks.workunit.client.1.vm06.stdout:5/319: fsync d8/db/d54/d67/d22/f31 0 2026-03-10T06:22:18.141 INFO:tasks.workunit.client.1.vm06.stdout:9/412: chown d21/d46/l72 2870711 1 2026-03-10T06:22:18.141 INFO:tasks.workunit.client.1.vm06.stdout:2/366: dwrite da/d13/d1c/d1d/f26 [0,4194304] 0 2026-03-10T06:22:18.141 INFO:tasks.workunit.client.1.vm06.stdout:6/460: mkdir d6/df/d9f 0 2026-03-10T06:22:18.146 INFO:tasks.workunit.client.1.vm06.stdout:3/391: symlink d6/d1a/l87 0 2026-03-10T06:22:18.155 INFO:tasks.workunit.client.1.vm06.stdout:4/341: creat dd/d24/d5e/f67 x:0 0 0 2026-03-10T06:22:18.155 INFO:tasks.workunit.client.1.vm06.stdout:8/353: dwrite d1/df/d20/d21/f37 [0,4194304] 0 2026-03-10T06:22:18.156 INFO:tasks.workunit.client.1.vm06.stdout:7/436: creat d19/d3b/d41/d42/d52/f8b x:0 0 0 2026-03-10T06:22:18.156 INFO:tasks.workunit.client.1.vm06.stdout:7/437: chown d19/d3b/l5e 107268 1 2026-03-10T06:22:18.157 INFO:tasks.workunit.client.1.vm06.stdout:5/320: rename d8/db/f40 to d8/db/d54/d67/d22/d39/f69 0 2026-03-10T06:22:18.175 INFO:tasks.workunit.client.1.vm06.stdout:9/413: unlink d21/d32/d6e/l25 0 2026-03-10T06:22:18.179 INFO:tasks.workunit.client.1.vm06.stdout:0/409: mknod d0/dd/d14/c8d 0 2026-03-10T06:22:18.179 
INFO:tasks.workunit.client.1.vm06.stdout:0/410: fsync d0/dd/d14/d18/f30 0 2026-03-10T06:22:18.181 INFO:tasks.workunit.client.1.vm06.stdout:6/461: unlink d6/dd/d25/d2c/f9b 0 2026-03-10T06:22:18.189 INFO:tasks.workunit.client.1.vm06.stdout:8/354: chown d1/df/d20/l31 633179 1 2026-03-10T06:22:18.191 INFO:tasks.workunit.client.1.vm06.stdout:2/367: read da/d13/d1a/f27 [3826888,64934] 0 2026-03-10T06:22:18.192 INFO:tasks.workunit.client.1.vm06.stdout:9/414: creat d21/d46/d79/f8d x:0 0 0 2026-03-10T06:22:18.197 INFO:tasks.workunit.client.1.vm06.stdout:5/321: dread d8/d9/d1e/f66 [0,4194304] 0 2026-03-10T06:22:18.198 INFO:tasks.workunit.client.1.vm06.stdout:7/438: mknod d19/d3b/d41/d42/d62/d80/d82/c8c 0 2026-03-10T06:22:18.201 INFO:tasks.workunit.client.1.vm06.stdout:5/322: dread d8/db/d54/d55/f60 [0,4194304] 0 2026-03-10T06:22:18.202 INFO:tasks.workunit.client.1.vm06.stdout:2/368: symlink da/d13/d1a/d39/d4b/l6f 0 2026-03-10T06:22:18.206 INFO:tasks.workunit.client.1.vm06.stdout:6/462: link d6/f62 d6/dd/d25/fa0 0 2026-03-10T06:22:18.208 INFO:tasks.workunit.client.1.vm06.stdout:5/323: truncate d8/d9/d1e/f17 4132883 0 2026-03-10T06:22:18.209 INFO:tasks.workunit.client.1.vm06.stdout:6/463: creat d6/dd/d25/d33/d5a/fa1 x:0 0 0 2026-03-10T06:22:18.210 INFO:tasks.workunit.client.1.vm06.stdout:2/369: rename da/d13/d1c/d1d/d44/f45 to da/d13/d1a/d39/f70 0 2026-03-10T06:22:18.210 INFO:tasks.workunit.client.1.vm06.stdout:5/324: write d8/db/d54/d67/d22/f31 [49432,15306] 0 2026-03-10T06:22:18.211 INFO:tasks.workunit.client.1.vm06.stdout:7/439: dwrite f13 [0,4194304] 0 2026-03-10T06:22:18.213 INFO:tasks.workunit.client.1.vm06.stdout:6/464: dread d6/dd/d25/d2c/f4c [0,4194304] 0 2026-03-10T06:22:18.214 INFO:tasks.workunit.client.1.vm06.stdout:2/370: creat da/d13/d1c/d43/d6e/f71 x:0 0 0 2026-03-10T06:22:18.214 INFO:tasks.workunit.client.1.vm06.stdout:9/415: sync 2026-03-10T06:22:18.218 INFO:tasks.workunit.client.1.vm06.stdout:9/416: dwrite fe [4194304,4194304] 0 2026-03-10T06:22:18.223 
INFO:tasks.workunit.client.1.vm06.stdout:7/440: rename d19/f1a to d19/d3b/d41/d4c/f8d 0 2026-03-10T06:22:18.223 INFO:tasks.workunit.client.1.vm06.stdout:2/371: symlink da/d13/d1c/d1d/d44/d48/d56/l72 0 2026-03-10T06:22:18.223 INFO:tasks.workunit.client.1.vm06.stdout:7/441: rename d19/d3b/d41/d42/d62 to d19/d3b/d41/d42/d62/d80/d82/d8e 22 2026-03-10T06:22:18.223 INFO:tasks.workunit.client.1.vm06.stdout:7/442: readlink d19/d3b/d41/d42/l46 0 2026-03-10T06:22:18.228 INFO:tasks.workunit.client.1.vm06.stdout:9/417: mkdir d21/d27/d56/d8e 0 2026-03-10T06:22:18.229 INFO:tasks.workunit.client.1.vm06.stdout:9/418: dread - d21/d32/d4d/d51/d67/f7c zero size 2026-03-10T06:22:18.229 INFO:tasks.workunit.client.1.vm06.stdout:9/419: stat d21/d46/d79 0 2026-03-10T06:22:18.234 INFO:tasks.workunit.client.1.vm06.stdout:9/420: dwrite d21/d27/d50/d57/f58 [0,4194304] 0 2026-03-10T06:22:18.250 INFO:tasks.workunit.client.1.vm06.stdout:2/372: dread da/f28 [4194304,4194304] 0 2026-03-10T06:22:18.253 INFO:tasks.workunit.client.1.vm06.stdout:7/443: readlink d19/l4a 0 2026-03-10T06:22:18.265 INFO:tasks.workunit.client.1.vm06.stdout:1/403: write d9/d1b/f51 [8414536,119666] 0 2026-03-10T06:22:18.268 INFO:tasks.workunit.client.1.vm06.stdout:1/404: dread d9/df/f3d [0,4194304] 0 2026-03-10T06:22:18.276 INFO:tasks.workunit.client.1.vm06.stdout:9/421: symlink d21/d32/d4d/l8f 0 2026-03-10T06:22:18.279 INFO:tasks.workunit.client.1.vm06.stdout:7/444: link d19/d3b/d41/f49 d19/d3b/d41/d42/d52/d83/f8f 0 2026-03-10T06:22:18.282 INFO:tasks.workunit.client.1.vm06.stdout:9/422: creat d21/d46/d79/d80/f90 x:0 0 0 2026-03-10T06:22:18.283 INFO:tasks.workunit.client.1.vm06.stdout:4/342: write dd/d33/f37 [3387105,64488] 0 2026-03-10T06:22:18.284 INFO:tasks.workunit.client.1.vm06.stdout:4/343: truncate dd/d24/d5e/f67 386065 0 2026-03-10T06:22:18.289 INFO:tasks.workunit.client.1.vm06.stdout:8/355: truncate d1/d7/fd 644363 0 2026-03-10T06:22:18.291 INFO:tasks.workunit.client.1.vm06.stdout:0/411: dwrite d0/dd/f5b [0,4194304] 
0 2026-03-10T06:22:18.293 INFO:tasks.workunit.client.1.vm06.stdout:3/392: dwrite d6/d21/d38/f6c [0,4194304] 0 2026-03-10T06:22:18.294 INFO:tasks.workunit.client.1.vm06.stdout:1/405: mknod d9/c69 0 2026-03-10T06:22:18.294 INFO:tasks.workunit.client.1.vm06.stdout:1/406: chown d9/f2f 26301888 1 2026-03-10T06:22:18.307 INFO:tasks.workunit.client.1.vm06.stdout:4/344: fdatasync dd/d18/f1d 0 2026-03-10T06:22:18.307 INFO:tasks.workunit.client.1.vm06.stdout:4/345: fsync dd/d24/d5e/f67 0 2026-03-10T06:22:18.309 INFO:tasks.workunit.client.1.vm06.stdout:8/356: creat d1/d2c/d5b/f7c x:0 0 0 2026-03-10T06:22:18.311 INFO:tasks.workunit.client.1.vm06.stdout:6/465: write d6/dd/d25/d2c/f32 [118006,121168] 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:0/412: dread - d0/d3c/d42/d88/d35/f7f zero size 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:0/413: write d0/dd/d14/d1d/d5d/f5f [127752,85083] 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:5/325: truncate d8/db/d54/d55/f60 119344 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:5/326: read d8/db/d54/d55/f61 [149809,90497] 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:0/414: dwrite d0/dd/f32 [0,4194304] 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:9/423: creat d21/d46/d79/d7f/f91 x:0 0 0 2026-03-10T06:22:18.322 INFO:tasks.workunit.client.1.vm06.stdout:2/373: truncate da/d13/d1c/d1d/f26 534507 0 2026-03-10T06:22:18.327 INFO:tasks.workunit.client.1.vm06.stdout:3/393: rmdir d6 39 2026-03-10T06:22:18.336 INFO:tasks.workunit.client.1.vm06.stdout:5/327: creat d8/db/d54/d67/d22/d39/f6a x:0 0 0 2026-03-10T06:22:18.336 INFO:tasks.workunit.client.1.vm06.stdout:5/328: fdatasync d8/d9/d1e/f37 0 2026-03-10T06:22:18.336 INFO:tasks.workunit.client.1.vm06.stdout:5/329: dwrite d8/db/d54/d67/d22/d39/f63 [0,4194304] 0 2026-03-10T06:22:18.336 INFO:tasks.workunit.client.1.vm06.stdout:5/330: write d8/db/d54/f5c [987883,126368] 0 
2026-03-10T06:22:18.337 INFO:tasks.workunit.client.1.vm06.stdout:2/374: dread da/d13/d1a/d39/f70 [0,4194304] 0 2026-03-10T06:22:18.339 INFO:tasks.workunit.client.1.vm06.stdout:7/445: truncate d19/f24 1416802 0 2026-03-10T06:22:18.340 INFO:tasks.workunit.client.1.vm06.stdout:1/407: rename d9/d1b/d20/c27 to d9/d1b/c6a 0 2026-03-10T06:22:18.344 INFO:tasks.workunit.client.1.vm06.stdout:9/424: symlink d21/d32/d4d/l92 0 2026-03-10T06:22:18.345 INFO:tasks.workunit.client.1.vm06.stdout:9/425: stat d21/d32/d6e/c3c 0 2026-03-10T06:22:18.349 INFO:tasks.workunit.client.1.vm06.stdout:1/408: dread d9/d1b/f31 [0,4194304] 0 2026-03-10T06:22:18.353 INFO:tasks.workunit.client.1.vm06.stdout:1/409: dwrite d9/d35/f53 [0,4194304] 0 2026-03-10T06:22:18.356 INFO:tasks.workunit.client.1.vm06.stdout:6/466: creat d6/fa2 x:0 0 0 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:2/375: symlink da/d13/d1c/d1d/d44/d46/l73 0 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:8/357: rename d1/d2c/c4e to d1/df/d20/d21/d5e/d79/c7d 0 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:5/331: chown d8/c59 11206066 1 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:7/446: fdatasync d19/f24 0 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:7/447: dread - d19/d3b/d41/d42/f7d zero size 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:2/376: read da/d13/f1f [989301,45774] 0 2026-03-10T06:22:18.367 INFO:tasks.workunit.client.1.vm06.stdout:2/377: dwrite da/d13/d1c/f41 [4194304,4194304] 0 2026-03-10T06:22:18.370 INFO:tasks.workunit.client.1.vm06.stdout:3/394: sync 2026-03-10T06:22:18.370 INFO:tasks.workunit.client.1.vm06.stdout:4/346: sync 2026-03-10T06:22:18.376 INFO:tasks.workunit.client.1.vm06.stdout:8/358: mkdir d1/df/d20/d21/d7e 0 2026-03-10T06:22:18.381 INFO:tasks.workunit.client.1.vm06.stdout:9/426: mkdir d21/d27/d56/d8e/d93 0 2026-03-10T06:22:18.381 INFO:tasks.workunit.client.1.vm06.stdout:5/332: symlink 
d8/d9/l6b 0 2026-03-10T06:22:18.381 INFO:tasks.workunit.client.1.vm06.stdout:7/448: creat d19/d3b/d41/d42/d62/d80/d82/f90 x:0 0 0 2026-03-10T06:22:18.381 INFO:tasks.workunit.client.1.vm06.stdout:2/378: creat da/d13/d1a/d39/d35/f74 x:0 0 0 2026-03-10T06:22:18.382 INFO:tasks.workunit.client.1.vm06.stdout:8/359: dread d1/df/d11/f4a [0,4194304] 0 2026-03-10T06:22:18.388 INFO:tasks.workunit.client.1.vm06.stdout:7/449: dread f15 [0,4194304] 0 2026-03-10T06:22:18.399 INFO:tasks.workunit.client.1.vm06.stdout:4/347: mknod dd/c68 0 2026-03-10T06:22:18.399 INFO:tasks.workunit.client.1.vm06.stdout:4/348: dread - dd/d18/f5f zero size 2026-03-10T06:22:18.400 INFO:tasks.workunit.client.1.vm06.stdout:7/450: creat d19/d3b/d41/d42/f91 x:0 0 0 2026-03-10T06:22:18.413 INFO:tasks.workunit.client.1.vm06.stdout:0/415: truncate d0/dd/f4c 3569598 0 2026-03-10T06:22:18.414 INFO:tasks.workunit.client.1.vm06.stdout:0/416: read d0/f61 [2811695,76342] 0 2026-03-10T06:22:18.419 INFO:tasks.workunit.client.1.vm06.stdout:6/467: read d6/dd/d25/d2c/f32 [1407441,115239] 0 2026-03-10T06:22:18.419 INFO:tasks.workunit.client.1.vm06.stdout:6/468: chown d6/dd/d25/f69 1007 1 2026-03-10T06:22:18.422 INFO:tasks.workunit.client.1.vm06.stdout:2/379: creat da/f75 x:0 0 0 2026-03-10T06:22:18.428 INFO:tasks.workunit.client.1.vm06.stdout:9/427: dwrite d21/d32/f3f [4194304,4194304] 0 2026-03-10T06:22:18.449 INFO:tasks.workunit.client.1.vm06.stdout:4/349: creat dd/d24/f69 x:0 0 0 2026-03-10T06:22:18.451 INFO:tasks.workunit.client.1.vm06.stdout:8/360: truncate d1/df/f6b 1144222 0 2026-03-10T06:22:18.452 INFO:tasks.workunit.client.1.vm06.stdout:6/469: unlink d6/df/l3e 0 2026-03-10T06:22:18.454 INFO:tasks.workunit.client.1.vm06.stdout:7/451: rename d19/d3b/d5b/c87 to d19/c92 0 2026-03-10T06:22:18.460 INFO:tasks.workunit.client.1.vm06.stdout:4/350: creat dd/d24/d5e/f6a x:0 0 0 2026-03-10T06:22:18.460 INFO:tasks.workunit.client.1.vm06.stdout:8/361: rmdir d1/df/d58 39 2026-03-10T06:22:18.463 
INFO:tasks.workunit.client.1.vm06.stdout:3/395: rename d6/dc/d13/d73 to d6/d21/d38/d88 0 2026-03-10T06:22:18.464 INFO:tasks.workunit.client.1.vm06.stdout:7/452: readlink d19/d3b/d5b/l5d 0 2026-03-10T06:22:18.466 INFO:tasks.workunit.client.1.vm06.stdout:2/380: creat da/d13/d1c/f76 x:0 0 0 2026-03-10T06:22:18.466 INFO:tasks.workunit.client.1.vm06.stdout:0/417: dwrite d0/d3c/d42/d88/f80 [0,4194304] 0 2026-03-10T06:22:18.474 INFO:tasks.workunit.client.1.vm06.stdout:1/410: rename d9/d1b/d20/f60 to d9/d35/d46/d38/d63/f6b 0 2026-03-10T06:22:18.484 INFO:tasks.workunit.client.1.vm06.stdout:3/396: creat d6/d21/d38/d39/f89 x:0 0 0 2026-03-10T06:22:18.484 INFO:tasks.workunit.client.1.vm06.stdout:3/397: readlink d6/d1a/d5b/l68 0 2026-03-10T06:22:18.487 INFO:tasks.workunit.client.1.vm06.stdout:6/470: write d6/f62 [1574738,19349] 0 2026-03-10T06:22:18.488 INFO:tasks.workunit.client.1.vm06.stdout:2/381: rmdir da/d13/d1c/d1d/d44/d53 39 2026-03-10T06:22:18.489 INFO:tasks.workunit.client.1.vm06.stdout:0/418: rmdir d0/d3c/d42/d88/d35 39 2026-03-10T06:22:18.490 INFO:tasks.workunit.client.1.vm06.stdout:0/419: write d0/d3c/d42/d88/f80 [2241291,25644] 0 2026-03-10T06:22:18.490 INFO:tasks.workunit.client.1.vm06.stdout:0/420: stat d0/dd/f67 0 2026-03-10T06:22:18.491 INFO:tasks.workunit.client.1.vm06.stdout:9/428: getdents d21/d32 0 2026-03-10T06:22:18.491 INFO:tasks.workunit.client.1.vm06.stdout:0/421: write d0/dd/d1b/f2f [932955,36140] 0 2026-03-10T06:22:18.492 INFO:tasks.workunit.client.1.vm06.stdout:9/429: write d21/d32/d4d/d51/d67/f6a [994450,68893] 0 2026-03-10T06:22:18.493 INFO:tasks.workunit.client.1.vm06.stdout:8/362: rename d1/d7/f24 to d1/df/d20/d21/d5e/d79/f7f 0 2026-03-10T06:22:18.493 INFO:tasks.workunit.client.1.vm06.stdout:8/363: chown d1/d3b/f49 6144882 1 2026-03-10T06:22:18.497 INFO:tasks.workunit.client.1.vm06.stdout:4/351: write dd/f11 [1146788,4268] 0 2026-03-10T06:22:18.498 INFO:tasks.workunit.client.1.vm06.stdout:2/382: creat da/d13/d1c/d43/d6e/f77 x:0 0 0 
2026-03-10T06:22:18.510 INFO:tasks.workunit.client.1.vm06.stdout:7/453: dwrite fa [0,4194304] 0 2026-03-10T06:22:18.511 INFO:tasks.workunit.client.1.vm06.stdout:5/333: write d8/db/d54/d55/f60 [495883,14884] 0 2026-03-10T06:22:18.512 INFO:tasks.workunit.client.1.vm06.stdout:7/454: read - d19/d3b/d41/d42/d62/d80/d82/f90 zero size 2026-03-10T06:22:18.514 INFO:tasks.workunit.client.1.vm06.stdout:0/422: rename d0/d3c/d42/d88/f80 to d0/f8e 0 2026-03-10T06:22:18.516 INFO:tasks.workunit.client.1.vm06.stdout:5/334: dread d8/db/d54/d67/d22/d39/f41 [0,4194304] 0 2026-03-10T06:22:18.522 INFO:tasks.workunit.client.1.vm06.stdout:3/398: write d6/d8/f45 [1639565,36804] 0 2026-03-10T06:22:18.523 INFO:tasks.workunit.client.1.vm06.stdout:6/471: mknod d6/dd/d25/d33/ca3 0 2026-03-10T06:22:18.529 INFO:tasks.workunit.client.1.vm06.stdout:4/352: rename dd/d24/d2d/d2f/d34/d40/f4e to dd/d24/d2d/d2f/d34/d40/f6b 0 2026-03-10T06:22:18.530 INFO:tasks.workunit.client.1.vm06.stdout:3/399: dread d6/dc/d13/d35/f3b [0,4194304] 0 2026-03-10T06:22:18.535 INFO:tasks.workunit.client.1.vm06.stdout:4/353: dwrite dd/d18/f55 [0,4194304] 0 2026-03-10T06:22:18.537 INFO:tasks.workunit.client.1.vm06.stdout:1/411: write d9/d1b/d20/f25 [4089264,12172] 0 2026-03-10T06:22:18.547 INFO:tasks.workunit.client.1.vm06.stdout:7/455: rename d19/d3b/d41/c70 to d19/d3b/d41/d42/d52/c93 0 2026-03-10T06:22:18.567 INFO:tasks.workunit.client.1.vm06.stdout:5/335: mkdir d8/db/d54/d67/d22/d39/d6c 0 2026-03-10T06:22:18.567 INFO:tasks.workunit.client.1.vm06.stdout:5/336: write d8/db/d54/f5c [623175,122407] 0 2026-03-10T06:22:18.573 INFO:tasks.workunit.client.1.vm06.stdout:9/430: fsync d21/d32/d4d/d51/d67/f6a 0 2026-03-10T06:22:18.574 INFO:tasks.workunit.client.1.vm06.stdout:1/412: dread d9/df/f4f [0,4194304] 0 2026-03-10T06:22:18.585 INFO:tasks.workunit.client.1.vm06.stdout:2/383: creat da/d13/d1c/d1d/d44/d53/f78 x:0 0 0 2026-03-10T06:22:18.586 INFO:tasks.workunit.client.1.vm06.stdout:2/384: chown da/d13/d1a/d39/d35/c4e 7648 1 
2026-03-10T06:22:18.587 INFO:tasks.workunit.client.1.vm06.stdout:3/400: symlink d6/dc/d13/d35/l8a 0 2026-03-10T06:22:18.587 INFO:tasks.workunit.client.1.vm06.stdout:2/385: readlink da/d13/d1a/d39/d35/l37 0 2026-03-10T06:22:18.589 INFO:tasks.workunit.client.1.vm06.stdout:4/354: chown dd/l1b 8 1 2026-03-10T06:22:18.590 INFO:tasks.workunit.client.1.vm06.stdout:3/401: dwrite d6/dc/d41/d6d/f70 [0,4194304] 0 2026-03-10T06:22:18.595 INFO:tasks.workunit.client.1.vm06.stdout:6/472: dread d6/d7/f1a [0,4194304] 0 2026-03-10T06:22:18.596 INFO:tasks.workunit.client.1.vm06.stdout:6/473: write d6/d7/d37/f3d [3358975,129625] 0 2026-03-10T06:22:18.602 INFO:tasks.workunit.client.1.vm06.stdout:0/423: mkdir d0/dd/d14/d8f 0 2026-03-10T06:22:18.607 INFO:tasks.workunit.client.1.vm06.stdout:8/364: link d1/df/d20/d21/l2d d1/d2c/l80 0 2026-03-10T06:22:18.607 INFO:tasks.workunit.client.1.vm06.stdout:8/365: dread - d1/df/f77 zero size 2026-03-10T06:22:18.607 INFO:tasks.workunit.client.1.vm06.stdout:1/413: creat d9/d35/d46/f6c x:0 0 0 2026-03-10T06:22:18.616 INFO:tasks.workunit.client.1.vm06.stdout:2/386: creat da/d13/d1c/d43/f79 x:0 0 0 2026-03-10T06:22:18.636 INFO:tasks.workunit.client.1.vm06.stdout:7/456: creat d19/d3b/d41/d42/d52/d83/f94 x:0 0 0 2026-03-10T06:22:18.648 INFO:tasks.workunit.client.1.vm06.stdout:9/431: link d21/d46/d79/d80/f90 d21/d27/d56/d8e/d93/f94 0 2026-03-10T06:22:18.648 INFO:tasks.workunit.client.1.vm06.stdout:8/366: creat d1/df/d11/f81 x:0 0 0 2026-03-10T06:22:18.649 INFO:tasks.workunit.client.1.vm06.stdout:2/387: creat da/d13/d1c/d43/f7a x:0 0 0 2026-03-10T06:22:18.651 INFO:tasks.workunit.client.1.vm06.stdout:9/432: dread d21/f49 [0,4194304] 0 2026-03-10T06:22:18.658 INFO:tasks.workunit.client.1.vm06.stdout:7/457: creat d19/d3b/d41/d4c/f95 x:0 0 0 2026-03-10T06:22:18.658 INFO:tasks.workunit.client.1.vm06.stdout:7/458: readlink d19/d3b/l5e 0 2026-03-10T06:22:18.660 INFO:tasks.workunit.client.1.vm06.stdout:5/337: rename d8/d9/c33 to d8/db/d54/d67/d22/c6d 0 
2026-03-10T06:22:18.662 INFO:tasks.workunit.client.1.vm06.stdout:7/459: dwrite d19/d3b/d41/d42/d52/f8b [0,4194304] 0
2026-03-10T06:22:18.676 INFO:tasks.workunit.client.1.vm06.stdout:1/414: unlink d9/d1b/d20/d44/l52 0
2026-03-10T06:22:18.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:18 vm04.local ceph-mon[51058]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-10T06:22:18.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:18 vm04.local ceph-mon[51058]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:22:18.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:18 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:18.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:18 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:18.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:18 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:18.681 INFO:tasks.workunit.client.1.vm06.stdout:1/415: dwrite d9/d35/f56 [0,4194304] 0
2026-03-10T06:22:18.689 INFO:tasks.workunit.client.1.vm06.stdout:4/355: write dd/f12 [214917,87420] 0
2026-03-10T06:22:18.690 INFO:tasks.workunit.client.1.vm06.stdout:6/474: dwrite d6/d7/f16 [0,4194304] 0
2026-03-10T06:22:18.704 INFO:tasks.workunit.client.1.vm06.stdout:5/338: mkdir d8/db/d54/d67/d46/d6e 0
2026-03-10T06:22:18.704 INFO:tasks.workunit.client.1.vm06.stdout:5/339: fdatasync d8/db/d57/f5f 0
2026-03-10T06:22:18.711 INFO:tasks.workunit.client.1.vm06.stdout:1/416: unlink d9/df/c37 0
2026-03-10T06:22:18.711 INFO:tasks.workunit.client.1.vm06.stdout:1/417: stat d9/f58 0
2026-03-10T06:22:18.715 INFO:tasks.workunit.client.1.vm06.stdout:3/402: rename d6/dc/d13/d35/f80 to d6/dc/d13/f8b 0
2026-03-10T06:22:18.716 INFO:tasks.workunit.client.1.vm06.stdout:5/340: mknod d8/c6f 0
2026-03-10T06:22:18.719 INFO:tasks.workunit.client.1.vm06.stdout:1/418: mkdir d9/d35/d46/d38/d63/d6d 0
2026-03-10T06:22:18.719 INFO:tasks.workunit.client.1.vm06.stdout:3/403: dwrite d6/d21/d38/d39/f89 [0,4194304] 0
2026-03-10T06:22:18.723 INFO:tasks.workunit.client.1.vm06.stdout:1/419: write d9/d35/d46/f50 [803196,50900] 0
2026-03-10T06:22:18.723 INFO:tasks.workunit.client.1.vm06.stdout:0/424: rename d0/f8e to d0/dd/d14/d18/f90 0
2026-03-10T06:22:18.723 INFO:tasks.workunit.client.1.vm06.stdout:5/341: fdatasync d8/f3f 0
2026-03-10T06:22:18.724 INFO:tasks.workunit.client.1.vm06.stdout:4/356: getdents dd/d24/d5d 0
2026-03-10T06:22:18.732 INFO:tasks.workunit.client.1.vm06.stdout:0/425: chown d0/dd/d14/d18/f22 74711 1
2026-03-10T06:22:18.732 INFO:tasks.workunit.client.1.vm06.stdout:5/342: write d8/d9/d1e/f36 [517207,81133] 0
2026-03-10T06:22:18.733 INFO:tasks.workunit.client.1.vm06.stdout:4/357: mknod dd/d24/d5d/c6c 0
2026-03-10T06:22:18.733 INFO:tasks.workunit.client.1.vm06.stdout:1/420: mknod d9/d1b/c6e 0
2026-03-10T06:22:18.733 INFO:tasks.workunit.client.1.vm06.stdout:0/426: write d0/dd/f5b [2453091,64661] 0
2026-03-10T06:22:18.734 INFO:tasks.workunit.client.1.vm06.stdout:0/427: stat d0/dd/d14/d18/f90 0
2026-03-10T06:22:18.735 INFO:tasks.workunit.client.1.vm06.stdout:7/460: rename d19/c92 to d19/d3b/d41/d42/d52/d83/c96 0
2026-03-10T06:22:18.736 INFO:tasks.workunit.client.1.vm06.stdout:0/428: write d0/dd/d1b/d3d/f82 [664998,93006] 0
2026-03-10T06:22:18.737 INFO:tasks.workunit.client.1.vm06.stdout:4/358: mknod dd/d24/d2d/d2f/c6d 0
2026-03-10T06:22:18.738 INFO:tasks.workunit.client.1.vm06.stdout:4/359: dread dd/f43 [0,4194304] 0
2026-03-10T06:22:18.739 INFO:tasks.workunit.client.1.vm06.stdout:4/360: dread - dd/d24/d2d/f5a zero size
2026-03-10T06:22:18.740 INFO:tasks.workunit.client.1.vm06.stdout:1/421: creat d9/d1b/d20/d44/f6f x:0 0 0
2026-03-10T06:22:18.741 INFO:tasks.workunit.client.1.vm06.stdout:6/475: rename d6/dd/d25/d33/ca3 to d6/d79/ca4 0
2026-03-10T06:22:18.745 INFO:tasks.workunit.client.1.vm06.stdout:0/429: mkdir d0/dd/d1b/d3d/d50/d91 0
2026-03-10T06:22:18.745 INFO:tasks.workunit.client.1.vm06.stdout:4/361: creat dd/d24/d5e/f6e x:0 0 0
2026-03-10T06:22:18.747 INFO:tasks.workunit.client.1.vm06.stdout:3/404: dread d6/d21/f31 [0,4194304] 0
2026-03-10T06:22:18.753 INFO:tasks.workunit.client.1.vm06.stdout:7/461: mkdir d19/d3b/d41/d72/d97 0
2026-03-10T06:22:18.753 INFO:tasks.workunit.client.1.vm06.stdout:6/476: rename d6/d7/d37/f65 to d6/dd/d25/d33/d4d/fa5 0
2026-03-10T06:22:18.754 INFO:tasks.workunit.client.1.vm06.stdout:6/477: stat d6/df/l51 0
2026-03-10T06:22:18.755 INFO:tasks.workunit.client.1.vm06.stdout:4/362: symlink dd/d18/l6f 0
2026-03-10T06:22:18.755 INFO:tasks.workunit.client.1.vm06.stdout:8/367: dwrite d1/df/d11/f74 [0,4194304] 0
2026-03-10T06:22:18.757 INFO:tasks.workunit.client.1.vm06.stdout:0/430: truncate d0/dd/d1b/f3f 1152870 0
2026-03-10T06:22:18.759 INFO:tasks.workunit.client.1.vm06.stdout:1/422: symlink d9/l70 0
2026-03-10T06:22:18.760 INFO:tasks.workunit.client.1.vm06.stdout:0/431: stat d0/dd/d14/d18/f30 0
2026-03-10T06:22:18.760 INFO:tasks.workunit.client.1.vm06.stdout:4/363: chown dd/f5c 533 1
2026-03-10T06:22:18.761 INFO:tasks.workunit.client.1.vm06.stdout:2/388: dwrite da/d13/f1f [0,4194304] 0
2026-03-10T06:22:18.765 INFO:tasks.workunit.client.1.vm06.stdout:0/432: write d0/dd/d1b/f2f [5279103,118754] 0
2026-03-10T06:22:18.767 INFO:tasks.workunit.client.1.vm06.stdout:7/462: unlink d19/d3b/d41/d42/d52/f8b 0
2026-03-10T06:22:18.773 INFO:tasks.workunit.client.1.vm06.stdout:2/389: write da/d13/d1a/d39/d35/f4a [2013437,51945] 0
2026-03-10T06:22:18.781 INFO:tasks.workunit.client.1.vm06.stdout:0/433: mkdir d0/dd/d14/d18/d92 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:7/463: fdatasync d19/d3b/f47 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:0/434: dwrite d0/dd/f48 [0,4194304] 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:0/435: chown d0/dd/d1c/c21 64909 1
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:6/478: link d6/fc d6/df/d70/fa6 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:1/423: rmdir d9/d35/d46/d38/d63/d6d 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:0/436: symlink d0/dd/d1b/l93 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:4/364: link dd/d24/d2d/f28 dd/d33/f70 0
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:4/365: chown dd/f29 3 1
2026-03-10T06:22:18.797 INFO:tasks.workunit.client.1.vm06.stdout:7/464: dread d19/f33 [0,4194304] 0
2026-03-10T06:22:18.807 INFO:tasks.workunit.client.1.vm06.stdout:1/424: link d9/d1b/d20/f25 d9/d35/d46/f71 0
2026-03-10T06:22:18.809 INFO:tasks.workunit.client.1.vm06.stdout:7/465: link d19/d3b/l5e d19/d3b/d41/d72/d97/l98 0
2026-03-10T06:22:18.811 INFO:tasks.workunit.client.1.vm06.stdout:4/366: dread dd/d24/f3d [0,4194304] 0
2026-03-10T06:22:18.811 INFO:tasks.workunit.client.1.vm06.stdout:1/425: rename d9/l70 to d9/d35/d46/d38/l72 0
2026-03-10T06:22:18.812 INFO:tasks.workunit.client.1.vm06.stdout:1/426: chown d9/d1b/d20/d44 129503674 1
2026-03-10T06:22:18.812 INFO:tasks.workunit.client.1.vm06.stdout:9/433: truncate d21/f2a 2391522 0
2026-03-10T06:22:18.814 INFO:tasks.workunit.client.1.vm06.stdout:9/434: write d21/d27/d56/f74 [435290,74784] 0
2026-03-10T06:22:18.814 INFO:tasks.workunit.client.1.vm06.stdout:4/367: mkdir dd/d24/d2d/d2f/d39/d71 0
2026-03-10T06:22:18.814 INFO:tasks.workunit.client.1.vm06.stdout:9/435: chown f11 4 1
2026-03-10T06:22:18.816 INFO:tasks.workunit.client.1.vm06.stdout:1/427: symlink d9/d35/l73 0
2026-03-10T06:22:18.816 INFO:tasks.workunit.client.1.vm06.stdout:1/428: dread - d9/d35/f5c zero size
2026-03-10T06:22:18.817 INFO:tasks.workunit.client.1.vm06.stdout:7/466: creat d19/f99 x:0 0 0
2026-03-10T06:22:18.818 INFO:tasks.workunit.client.1.vm06.stdout:9/436: mkdir d21/d46/d79/d80/d95 0
2026-03-10T06:22:18.818 INFO:tasks.workunit.client.1.vm06.stdout:4/368: mkdir dd/d72 0
2026-03-10T06:22:18.819 INFO:tasks.workunit.client.1.vm06.stdout:1/429: creat d9/d35/d46/f74 x:0 0 0
2026-03-10T06:22:18.819 INFO:tasks.workunit.client.1.vm06.stdout:1/430: chown d9/c69 198820588 1
2026-03-10T06:22:18.822 INFO:tasks.workunit.client.1.vm06.stdout:9/437: fsync d21/d46/d79/d80/f90 0
2026-03-10T06:22:18.822 INFO:tasks.workunit.client.1.vm06.stdout:4/369: symlink dd/d33/l73 0
2026-03-10T06:22:18.823 INFO:tasks.workunit.client.1.vm06.stdout:9/438: read d21/d32/f52 [13352,6956] 0
2026-03-10T06:22:18.823 INFO:tasks.workunit.client.1.vm06.stdout:7/467: unlink d19/d3b/d41/d42/c73 0
2026-03-10T06:22:18.824 INFO:tasks.workunit.client.1.vm06.stdout:9/439: fdatasync d21/d32/d4d/d51/d67/f7c 0
2026-03-10T06:22:18.829 INFO:tasks.workunit.client.1.vm06.stdout:0/437: sync
2026-03-10T06:22:18.829 INFO:tasks.workunit.client.1.vm06.stdout:3/405: sync
2026-03-10T06:22:18.837 INFO:tasks.workunit.client.1.vm06.stdout:7/468: dwrite d19/f20 [0,4194304] 0
2026-03-10T06:22:18.838 INFO:tasks.workunit.client.1.vm06.stdout:5/343: write d8/db/d54/d67/d46/f64 [2920870,26904] 0
2026-03-10T06:22:18.842 INFO:tasks.workunit.client.1.vm06.stdout:5/344: fsync d8/d9/d1e/f17 0
2026-03-10T06:22:18.848 INFO:tasks.workunit.client.1.vm06.stdout:8/368: dwrite d1/df/d20/d35/f42 [0,4194304] 0
2026-03-10T06:22:18.851 INFO:tasks.workunit.client.1.vm06.stdout:2/390: write da/d13/d1a/d39/f2f [46666,93540] 0
2026-03-10T06:22:18.851 INFO:tasks.workunit.client.1.vm06.stdout:8/369: readlink d1/l8 0
2026-03-10T06:22:18.863 INFO:tasks.workunit.client.1.vm06.stdout:6/479: dwrite d6/dd/d25/d2c/f4c [0,4194304] 0
2026-03-10T06:22:18.864 INFO:tasks.workunit.client.1.vm06.stdout:6/480: dread - d6/dd/f96 zero size
2026-03-10T06:22:18.866 INFO:tasks.workunit.client.1.vm06.stdout:6/481: write d6/dd/d25/d33/f5d [1915930,130176] 0
2026-03-10T06:22:18.876 INFO:tasks.workunit.client.1.vm06.stdout:4/370: dread dd/d41/f52 [0,4194304] 0
2026-03-10T06:22:18.878 INFO:tasks.workunit.client.1.vm06.stdout:3/406: dread d6/d8/f48 [0,4194304] 0
2026-03-10T06:22:18.878 INFO:tasks.workunit.client.1.vm06.stdout:4/371: truncate dd/f5c 54210 0
2026-03-10T06:22:18.879 INFO:tasks.workunit.client.1.vm06.stdout:6/482: sync
2026-03-10T06:22:18.880 INFO:tasks.workunit.client.1.vm06.stdout:1/431: creat d9/d1b/f75 x:0 0 0
2026-03-10T06:22:18.884 INFO:tasks.workunit.client.1.vm06.stdout:1/432: dwrite d9/f34 [0,4194304] 0
2026-03-10T06:22:18.887 INFO:tasks.workunit.client.1.vm06.stdout:7/469: mkdir d19/d3b/d5b/d9a 0
2026-03-10T06:22:18.897 INFO:tasks.workunit.client.1.vm06.stdout:1/433: dwrite d9/f34 [0,4194304] 0
2026-03-10T06:22:18.897 INFO:tasks.workunit.client.1.vm06.stdout:5/345: write d8/db/d54/d67/d22/d39/f69 [2243611,73798] 0
2026-03-10T06:22:18.897 INFO:tasks.workunit.client.1.vm06.stdout:5/346: dread d8/db/d54/d67/d22/d39/f41 [0,4194304] 0
2026-03-10T06:22:18.897 INFO:tasks.workunit.client.1.vm06.stdout:8/370: creat d1/df/d20/d21/f82 x:0 0 0
2026-03-10T06:22:18.897 INFO:tasks.workunit.client.1.vm06.stdout:2/391: unlink da/d13/l14 0
2026-03-10T06:22:18.899 INFO:tasks.workunit.client.1.vm06.stdout:2/392: chown da/d13/d1c/f41 43 1
2026-03-10T06:22:18.901 INFO:tasks.workunit.client.1.vm06.stdout:9/440: link f9 d21/d27/d56/d8e/f96 0
2026-03-10T06:22:18.912 INFO:tasks.workunit.client.1.vm06.stdout:4/372: truncate dd/f14 329112 0
2026-03-10T06:22:18.913 INFO:tasks.workunit.client.1.vm06.stdout:6/483: creat d6/df/d40/fa7 x:0 0 0
2026-03-10T06:22:18.913 INFO:tasks.workunit.client.1.vm06.stdout:7/470: truncate d19/f33 4763428 0
2026-03-10T06:22:18.913 INFO:tasks.workunit.client.1.vm06.stdout:1/434: rename d9/d35/d46/f50 to d9/d62/f76 0
2026-03-10T06:22:18.914 INFO:tasks.workunit.client.1.vm06.stdout:6/484: write d6/dd/d25/d2c/f4c [871608,72881] 0
2026-03-10T06:22:18.918 INFO:tasks.workunit.client.1.vm06.stdout:9/441: creat d21/d27/d50/d57/f97 x:0 0 0
2026-03-10T06:22:18.921 INFO:tasks.workunit.client.1.vm06.stdout:3/407: creat d6/d8/d7f/f8c x:0 0 0
2026-03-10T06:22:18.921 INFO:tasks.workunit.client.1.vm06.stdout:4/373: rmdir dd/d24 39
2026-03-10T06:22:18.921 INFO:tasks.workunit.client.1.vm06.stdout:3/408: write d6/dc/d13/f6e [159240,35244] 0
2026-03-10T06:22:18.921 INFO:tasks.workunit.client.1.vm06.stdout:6/485: dread d6/dd/d25/d2c/f4c [0,4194304] 0
2026-03-10T06:22:18.921 INFO:tasks.workunit.client.1.vm06.stdout:6/486: chown d6/dd/d25/l9e 11200 1
2026-03-10T06:22:18.922 INFO:tasks.workunit.client.1.vm06.stdout:6/487: dread - d6/dd/d25/d2c/f85 zero size
2026-03-10T06:22:18.926 INFO:tasks.workunit.client.1.vm06.stdout:2/393: dread da/d13/f5b [0,4194304] 0
2026-03-10T06:22:18.928 INFO:tasks.workunit.client.1.vm06.stdout:8/371: rename d1/d2c/d5b/c6c to d1/df/d20/d21/d7e/c83 0
2026-03-10T06:22:18.928 INFO:tasks.workunit.client.1.vm06.stdout:8/372: chown d1/d2c/d5b/c76 6668764 1
2026-03-10T06:22:18.929 INFO:tasks.workunit.client.1.vm06.stdout:8/373: chown d1/d2c/l33 42310992 1
2026-03-10T06:22:18.932 INFO:tasks.workunit.client.1.vm06.stdout:5/347: symlink d8/db/d54/d67/d46/d6e/l70 0
2026-03-10T06:22:18.933 INFO:tasks.workunit.client.1.vm06.stdout:7/471: dwrite d19/d3b/f7b [4194304,4194304] 0
2026-03-10T06:22:18.938 INFO:tasks.workunit.client.1.vm06.stdout:4/374: creat dd/d33/d36/f74 x:0 0 0
2026-03-10T06:22:18.938 INFO:tasks.workunit.client.1.vm06.stdout:6/488: sync
2026-03-10T06:22:18.939 INFO:tasks.workunit.client.1.vm06.stdout:4/375: readlink dd/d18/l54 0
2026-03-10T06:22:18.940 INFO:tasks.workunit.client.1.vm06.stdout:6/489: write d6/dd/d25/d33/d5a/d78/f89 [606318,10241] 0
2026-03-10T06:22:18.944 INFO:tasks.workunit.client.1.vm06.stdout:8/374: chown d1/df/d11/c22 97976 1
2026-03-10T06:22:18.945 INFO:tasks.workunit.client.1.vm06.stdout:1/435: mknod d9/d1b/c77 0
2026-03-10T06:22:18.948 INFO:tasks.workunit.client.1.vm06.stdout:8/375: dwrite d1/df/f71 [0,4194304] 0
2026-03-10T06:22:18.952 INFO:tasks.workunit.client.1.vm06.stdout:7/472: rename d19/d3b/d41/d42/f7d to d19/d3b/d41/d42/d52/d83/f9b 0
2026-03-10T06:22:18.958 INFO:tasks.workunit.client.1.vm06.stdout:4/376: mkdir dd/d18/d75 0
2026-03-10T06:22:18.959 INFO:tasks.workunit.client.1.vm06.stdout:1/436: dread d9/d62/f76 [0,4194304] 0
2026-03-10T06:22:18.960 INFO:tasks.workunit.client.1.vm06.stdout:1/437: write d9/d1b/d20/d44/f54 [533016,100957] 0
2026-03-10T06:22:18.963 INFO:tasks.workunit.client.1.vm06.stdout:1/438: dwrite d9/d1b/f51 [4194304,4194304] 0
2026-03-10T06:22:18.964 INFO:tasks.workunit.client.1.vm06.stdout:1/439: dread - d9/d35/f5c zero size
2026-03-10T06:22:18.965 INFO:tasks.workunit.client.1.vm06.stdout:8/376: chown d1/df/d11/f12 111044 1
2026-03-10T06:22:18.966 INFO:tasks.workunit.client.1.vm06.stdout:8/377: stat d1/df/d20/f43 0
2026-03-10T06:22:18.971 INFO:tasks.workunit.client.1.vm06.stdout:7/473: mknod d19/d3b/d41/d42/d62/d80/c9c 0
2026-03-10T06:22:18.973 INFO:tasks.workunit.client.1.vm06.stdout:7/474: chown d19/d3b/d41/d42/d52/d83/f8f 1966796 1
2026-03-10T06:22:18.978 INFO:tasks.workunit.client.1.vm06.stdout:9/442: link d21/d27/d3a/l4c d21/d27/l98 0
2026-03-10T06:22:18.979 INFO:tasks.workunit.client.1.vm06.stdout:3/409: creat d6/dc/d13/f8d x:0 0 0
2026-03-10T06:22:18.980 INFO:tasks.workunit.client.1.vm06.stdout:4/377: dread dd/f43 [0,4194304] 0
2026-03-10T06:22:18.982 INFO:tasks.workunit.client.1.vm06.stdout:5/348: creat d8/d9/d1e/f71 x:0 0 0
2026-03-10T06:22:18.984 INFO:tasks.workunit.client.1.vm06.stdout:8/378: mkdir d1/df/d11/d84 0
2026-03-10T06:22:18.984 INFO:tasks.workunit.client.1.vm06.stdout:7/475: rmdir d19/d3b/d5b 39
2026-03-10T06:22:18.985 INFO:tasks.workunit.client.1.vm06.stdout:7/476: fdatasync d19/d3b/d41/f77 0
2026-03-10T06:22:18.987 INFO:tasks.workunit.client.1.vm06.stdout:6/490: link d6/dd/d25/fa0 d6/fa8 0
2026-03-10T06:22:18.991 INFO:tasks.workunit.client.1.vm06.stdout:2/394: link da/d13/d1c/l23 da/d13/d1c/d1d/d44/d53/l7b 0
2026-03-10T06:22:18.991 INFO:tasks.workunit.client.1.vm06.stdout:5/349: chown d8/db/d54/d67/c27 0 1
2026-03-10T06:22:18.991 INFO:tasks.workunit.client.1.vm06.stdout:8/379: rename d1/d2c/d5b/c76 to d1/d2c/d5b/c85 0
2026-03-10T06:22:18.993 INFO:tasks.workunit.client.1.vm06.stdout:3/410: creat d6/dc/d13/d51/f8e x:0 0 0
2026-03-10T06:22:18.993 INFO:tasks.workunit.client.1.vm06.stdout:4/378: creat dd/d18/d75/f76 x:0 0 0
2026-03-10T06:22:18.994 INFO:tasks.workunit.client.1.vm06.stdout:6/491: mknod d6/df/d40/ca9 0
2026-03-10T06:22:19.000 INFO:tasks.workunit.client.1.vm06.stdout:3/411: read d6/d21/f30 [1504417,95629] 0
2026-03-10T06:22:19.005 INFO:tasks.workunit.client.1.vm06.stdout:8/380: creat d1/df/d58/f86 x:0 0 0
2026-03-10T06:22:19.006 INFO:tasks.workunit.client.1.vm06.stdout:8/381: dread - d1/f75 zero size
2026-03-10T06:22:19.008 INFO:tasks.workunit.client.1.vm06.stdout:6/492: mkdir d6/df/d70/daa 0
2026-03-10T06:22:19.009 INFO:tasks.workunit.client.1.vm06.stdout:6/493: dread - d6/df/d40/fa7 zero size
2026-03-10T06:22:19.010 INFO:tasks.workunit.client.1.vm06.stdout:5/350: dwrite d8/db/f1f [8388608,4194304] 0
2026-03-10T06:22:19.018 INFO:tasks.workunit.client.1.vm06.stdout:0/438: write d0/d3c/d42/d88/d35/f51 [3982052,88731] 0
2026-03-10T06:22:19.022 INFO:tasks.workunit.client.1.vm06.stdout:2/395: dwrite da/d13/d1a/d39/f70 [0,4194304] 0
2026-03-10T06:22:19.030 INFO:tasks.workunit.client.1.vm06.stdout:1/440: fsync d9/d1b/d20/d44/f54 0
2026-03-10T06:22:19.032 INFO:tasks.workunit.client.1.vm06.stdout:6/494: symlink d6/dd/d25/d33/d5a/d78/lab 0
2026-03-10T06:22:19.034 INFO:tasks.workunit.client.1.vm06.stdout:5/351: mkdir d8/db/d54/d67/d22/d39/d72 0
2026-03-10T06:22:19.051 INFO:tasks.workunit.client.1.vm06.stdout:5/352: write d8/d9/f14 [3326003,118154] 0
2026-03-10T06:22:19.051 INFO:tasks.workunit.client.1.vm06.stdout:9/443: truncate d21/d32/d4d/f6b 180621 0
2026-03-10T06:22:19.051 INFO:tasks.workunit.client.1.vm06.stdout:7/477: write d19/d3b/f47 [4783684,95191] 0
2026-03-10T06:22:19.051 INFO:tasks.workunit.client.1.vm06.stdout:4/379: dwrite dd/d41/f52 [0,4194304] 0
2026-03-10T06:22:19.051 INFO:tasks.workunit.client.1.vm06.stdout:8/382: symlink d1/df/l87 0
2026-03-10T06:22:19.053 INFO:tasks.workunit.client.1.vm06.stdout:1/441: mknod d9/d35/d46/d38/d63/c78 0
2026-03-10T06:22:19.054 INFO:tasks.workunit.client.1.vm06.stdout:1/442: write d9/d35/d46/f74 [126643,108828] 0
2026-03-10T06:22:19.055 INFO:tasks.workunit.client.1.vm06.stdout:6/495: fsync d6/dd/d25/d33/d4d/fa5 0
2026-03-10T06:22:19.057 INFO:tasks.workunit.client.1.vm06.stdout:0/439: dwrite d0/dd/d14/f65 [0,4194304] 0
2026-03-10T06:22:19.058 INFO:tasks.workunit.client.1.vm06.stdout:1/443: dread d9/d35/f56 [0,4194304] 0
2026-03-10T06:22:19.060 INFO:tasks.workunit.client.1.vm06.stdout:1/444: readlink d9/d1b/d20/d44/l4c 0
2026-03-10T06:22:19.062 INFO:tasks.workunit.client.1.vm06.stdout:5/353: rename d8/db/f18 to d8/db/d54/d67/d46/d68/f73 0
2026-03-10T06:22:19.066 INFO:tasks.workunit.client.1.vm06.stdout:0/440: chown d0/dd/d1c/c20 18 1
2026-03-10T06:22:19.066 INFO:tasks.workunit.client.1.vm06.stdout:0/441: chown d0/dd/d14/d18/d85 25520 1
2026-03-10T06:22:19.066 INFO:tasks.workunit.client.1.vm06.stdout:0/442: chown d0/f5 4 1
2026-03-10T06:22:19.069 INFO:tasks.workunit.client.1.vm06.stdout:7/478: mkdir d19/d3b/d41/d42/d52/d83/d9d 0
2026-03-10T06:22:19.070 INFO:tasks.workunit.client.1.vm06.stdout:9/444: dwrite d21/d27/f39 [0,4194304] 0
2026-03-10T06:22:19.079 INFO:tasks.workunit.client.1.vm06.stdout:1/445: rmdir d9/d35/d46/d38/d63 39
2026-03-10T06:22:19.082 INFO:tasks.workunit.client.1.vm06.stdout:1/446: chown d9/d1b/c6e 26 1
2026-03-10T06:22:19.085 INFO:tasks.workunit.client.1.vm06.stdout:4/380: mknod dd/d24/d2d/d2f/d34/c77 0
2026-03-10T06:22:19.087 INFO:tasks.workunit.client.1.vm06.stdout:0/443: truncate d0/f9 2690780 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:0/444: chown d0/dd/d1b/d3d/l6e 328 1
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:0/445: write d0/dd/f48 [845281,60607] 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:2/396: creat da/d13/d1c/d1d/d44/d48/f7c x:0 0 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:5/354: dread d8/db/d54/d67/d22/f53 [0,4194304] 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:2/397: chown da/d13/d1a/d39/f2f 0 1
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:2/398: fsync da/d13/d1c/d1d/d44/d53/f78 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:7/479: stat d19/d3b/d5b/f7f 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:0/446: rename d0/dd/d14/d18/c38 to d0/dd/d14/d18/c94 0
2026-03-10T06:22:19.103 INFO:tasks.workunit.client.1.vm06.stdout:0/447: write d0/d3c/d42/d5e/f86 [632397,30209] 0
2026-03-10T06:22:19.108 INFO:tasks.workunit.client.1.vm06.stdout:9/445: rename d21/d27/l98 to d21/d27/d56/d8e/d93/l99 0
2026-03-10T06:22:19.108 INFO:tasks.workunit.client.1.vm06.stdout:8/383: link d1/df/d11/f45 d1/df/d20/f88 0
2026-03-10T06:22:19.111 INFO:tasks.workunit.client.1.vm06.stdout:0/448: fdatasync d0/dd/d14/d18/f90 0
2026-03-10T06:22:19.117 INFO:tasks.workunit.client.1.vm06.stdout:9/446: creat d21/d27/f9a x:0 0 0
2026-03-10T06:22:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:18 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-10T06:22:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:18 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:22:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:18 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:18 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:18 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:19.127 INFO:tasks.workunit.client.1.vm06.stdout:1/447: getdents d9/d62 0
2026-03-10T06:22:19.128 INFO:tasks.workunit.client.1.vm06.stdout:5/355: rename d8/d9/d1e to d8/db/d54/d67/d22/d74 0
2026-03-10T06:22:19.130 INFO:tasks.workunit.client.1.vm06.stdout:7/480: rename d19/l4a to d19/d3b/d41/d42/d52/d83/d9d/l9e 0
2026-03-10T06:22:19.132 INFO:tasks.workunit.client.1.vm06.stdout:1/448: symlink d9/d35/l79 0
2026-03-10T06:22:19.132 INFO:tasks.workunit.client.1.vm06.stdout:0/449: dwrite d0/d3c/d42/d88/d35/f3a [0,4194304] 0
2026-03-10T06:22:19.138 INFO:tasks.workunit.client.1.vm06.stdout:9/447: dread f1b [4194304,4194304] 0
2026-03-10T06:22:19.140 INFO:tasks.workunit.client.1.vm06.stdout:5/356: sync
2026-03-10T06:22:19.142 INFO:tasks.workunit.client.1.vm06.stdout:1/449: creat d9/d35/d46/f7a x:0 0 0
2026-03-10T06:22:19.145 INFO:tasks.workunit.client.1.vm06.stdout:5/357: dread d8/db/d54/f5c [0,4194304] 0
2026-03-10T06:22:19.149 INFO:tasks.workunit.client.1.vm06.stdout:3/412: write d6/dc/d13/d35/d6b/f78 [1353744,8243] 0
2026-03-10T06:22:19.155 INFO:tasks.workunit.client.1.vm06.stdout:5/358: creat d8/db/d57/f75 x:0 0 0
2026-03-10T06:22:19.155 INFO:tasks.workunit.client.1.vm06.stdout:5/359: dread - d8/db/d54/d67/d22/d39/f6a zero size
2026-03-10T06:22:19.158 INFO:tasks.workunit.client.1.vm06.stdout:6/496: dwrite d6/dd/d25/fa0 [0,4194304] 0
2026-03-10T06:22:19.159 INFO:tasks.workunit.client.1.vm06.stdout:9/448: mkdir d21/d46/d79/d80/d95/d9b 0
2026-03-10T06:22:19.162 INFO:tasks.workunit.client.1.vm06.stdout:9/449: truncate d21/d32/d4d/d51/d67/f81 540771 0
2026-03-10T06:22:19.178 INFO:tasks.workunit.client.1.vm06.stdout:4/381: dwrite dd/f43 [0,4194304] 0
2026-03-10T06:22:19.181 INFO:tasks.workunit.client.1.vm06.stdout:2/399: dwrite da/d13/d1c/d1d/f55 [0,4194304] 0
2026-03-10T06:22:19.189 INFO:tasks.workunit.client.1.vm06.stdout:4/382: dwrite dd/d33/f58 [0,4194304] 0
2026-03-10T06:22:19.194 INFO:tasks.workunit.client.1.vm06.stdout:8/384: dwrite d1/df/d20/f63 [0,4194304] 0
2026-03-10T06:22:19.198 INFO:tasks.workunit.client.1.vm06.stdout:4/383: dwrite f8 [8388608,4194304] 0
2026-03-10T06:22:19.202 INFO:tasks.workunit.client.1.vm06.stdout:0/450: creat d0/dd/f95 x:0 0 0
2026-03-10T06:22:19.202 INFO:tasks.workunit.client.1.vm06.stdout:6/497: rename d6/df/c6e to d6/dd/d2b/cac 0
2026-03-10T06:22:19.203 INFO:tasks.workunit.client.1.vm06.stdout:4/384: truncate dd/d24/d2d/d2f/d39/f61 370244 0
2026-03-10T06:22:19.205 INFO:tasks.workunit.client.1.vm06.stdout:5/360: dread d8/db/d57/f5f [0,4194304] 0
2026-03-10T06:22:19.207 INFO:tasks.workunit.client.1.vm06.stdout:1/450: dwrite d9/df/f3d [0,4194304] 0
2026-03-10T06:22:19.218 INFO:tasks.workunit.client.1.vm06.stdout:8/385: sync
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:8/386: dwrite d1/d7/f4f [0,4194304] 0
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:8/387: readlink d1/d7/lc 0
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:9/450: rename d21/d32/c77 to d21/d32/d4d/d51/c9c 0
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:2/400: mkdir da/d13/d1c/d7d 0
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:6/498: unlink d6/dd/d2b/c39 0
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:2/401: read - da/d13/d1c/d1d/d44/d53/f78 zero size
2026-03-10T06:22:19.230 INFO:tasks.workunit.client.1.vm06.stdout:2/402: chown da/d13/d1c/d1d 24 1
2026-03-10T06:22:19.236 INFO:tasks.workunit.client.1.vm06.stdout:4/385: creat dd/d72/f78 x:0 0 0
2026-03-10T06:22:19.240 INFO:tasks.workunit.client.1.vm06.stdout:6/499: mknod d6/dd/d25/d2c/cad 0
2026-03-10T06:22:19.244 INFO:tasks.workunit.client.1.vm06.stdout:8/388: fsync d1/f4 0
2026-03-10T06:22:19.245 INFO:tasks.workunit.client.1.vm06.stdout:8/389: chown d1/df/d11/f81 1174938222 1
2026-03-10T06:22:19.246 INFO:tasks.workunit.client.1.vm06.stdout:2/403: dwrite da/d13/f5b [0,4194304] 0
2026-03-10T06:22:19.249 INFO:tasks.workunit.client.1.vm06.stdout:1/451: rename d9/f68 to d9/d1b/f7b 0
2026-03-10T06:22:19.250 INFO:tasks.workunit.client.1.vm06.stdout:1/452: write d9/d35/d46/f74 [857551,48230] 0
2026-03-10T06:22:19.258 INFO:tasks.workunit.client.1.vm06.stdout:2/404: dwrite da/f75 [0,4194304] 0
2026-03-10T06:22:19.262 INFO:tasks.workunit.client.1.vm06.stdout:9/451: creat d21/d32/d4d/f9d x:0 0 0
2026-03-10T06:22:19.262 INFO:tasks.workunit.client.1.vm06.stdout:4/386: rmdir dd/d24/d2d/d2f/d34/d40/d5b 0
2026-03-10T06:22:19.267 INFO:tasks.workunit.client.1.vm06.stdout:4/387: unlink dd/f29 0
2026-03-10T06:22:19.273 INFO:tasks.workunit.client.1.vm06.stdout:4/388: fsync dd/fe 0
2026-03-10T06:22:19.274 INFO:tasks.workunit.client.1.vm06.stdout:4/389: chown dd/d18/l6f 3815 1
2026-03-10T06:22:19.281 INFO:tasks.workunit.client.1.vm06.stdout:1/453: dread d9/d35/f57 [0,4194304] 0
2026-03-10T06:22:19.282 INFO:tasks.workunit.client.1.vm06.stdout:0/451: dread d0/f61 [0,4194304] 0
2026-03-10T06:22:19.285 INFO:tasks.workunit.client.1.vm06.stdout:9/452: dread d21/d27/d56/d8e/f96 [0,4194304] 0
2026-03-10T06:22:19.285 INFO:tasks.workunit.client.1.vm06.stdout:6/500: dread d6/dd/d35/f2d [0,4194304] 0
2026-03-10T06:22:19.286 INFO:tasks.workunit.client.1.vm06.stdout:2/405: link da/d13/d1a/f21 da/d13/d1c/f7e 0
2026-03-10T06:22:19.288 INFO:tasks.workunit.client.1.vm06.stdout:8/390: getdents d1/df/d20/d21 0
2026-03-10T06:22:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/406: dread da/f75 [0,4194304] 0
2026-03-10T06:22:19.289 INFO:tasks.workunit.client.1.vm06.stdout:4/390: mknod dd/d24/d5e/c79 0
2026-03-10T06:22:19.290 INFO:tasks.workunit.client.1.vm06.stdout:4/391: write dd/ff [4985793,73147] 0
2026-03-10T06:22:19.293 INFO:tasks.workunit.client.1.vm06.stdout:0/452: rmdir d0 39
2026-03-10T06:22:19.293 INFO:tasks.workunit.client.1.vm06.stdout:1/454: chown d9/d35/d46/d38/l72 8 1
2026-03-10T06:22:19.294 INFO:tasks.workunit.client.1.vm06.stdout:1/455: fdatasync d9/d1b/d20/d44/f54 0
2026-03-10T06:22:19.296 INFO:tasks.workunit.client.1.vm06.stdout:6/501: fsync d6/dd/d35/f97 0
2026-03-10T06:22:19.297 INFO:tasks.workunit.client.1.vm06.stdout:9/453: symlink d21/d27/d50/l9e 0
2026-03-10T06:22:19.307 INFO:tasks.workunit.client.1.vm06.stdout:4/392: symlink dd/d24/d5e/l7a 0
2026-03-10T06:22:19.309 INFO:tasks.workunit.client.1.vm06.stdout:7/481: truncate d19/d3b/f3c 1747225 0
2026-03-10T06:22:19.309 INFO:tasks.workunit.client.1.vm06.stdout:7/482: fsync d19/d3b/d41/f54 0
2026-03-10T06:22:19.309 INFO:tasks.workunit.client.1.vm06.stdout:6/502: mkdir d6/dd/d25/d33/d5a/dae 0
2026-03-10T06:22:19.310 INFO:tasks.workunit.client.1.vm06.stdout:7/483: write d19/d3b/d41/d42/d52/f64 [1579215,86074] 0
2026-03-10T06:22:19.312 INFO:tasks.workunit.client.1.vm06.stdout:9/454: mknod d21/d27/d56/d8e/c9f 0
2026-03-10T06:22:19.314 INFO:tasks.workunit.client.1.vm06.stdout:4/393: symlink dd/d33/l7b 0
2026-03-10T06:22:19.315 INFO:tasks.workunit.client.1.vm06.stdout:7/484: mkdir d19/d3b/d41/d42/d52/d9f 0
2026-03-10T06:22:19.317 INFO:tasks.workunit.client.1.vm06.stdout:9/455: symlink d21/d27/d56/d8e/d93/la0 0
2026-03-10T06:22:19.320 INFO:tasks.workunit.client.1.vm06.stdout:9/456: readlink l18 0
2026-03-10T06:22:19.320 INFO:tasks.workunit.client.1.vm06.stdout:8/391: creat d1/f89 x:0 0 0
2026-03-10T06:22:19.321 INFO:tasks.workunit.client.1.vm06.stdout:4/394: stat c4 0
2026-03-10T06:22:19.321 INFO:tasks.workunit.client.1.vm06.stdout:0/453: symlink d0/dd/l96 0
2026-03-10T06:22:19.321 INFO:tasks.workunit.client.1.vm06.stdout:6/503: rename d6/df/d40/l92 to d6/d79/laf 0
2026-03-10T06:22:19.323 INFO:tasks.workunit.client.1.vm06.stdout:8/392: chown d1/df/d20/d35/l3f 4 1
2026-03-10T06:22:19.323 INFO:tasks.workunit.client.1.vm06.stdout:4/395: read dd/d18/f1f [1527674,101420] 0
2026-03-10T06:22:19.323 INFO:tasks.workunit.client.1.vm06.stdout:6/504: mknod d6/df/cb0 0
2026-03-10T06:22:19.324 INFO:tasks.workunit.client.1.vm06.stdout:6/505: write f3 [1528501,57517] 0
2026-03-10T06:22:19.324 INFO:tasks.workunit.client.1.vm06.stdout:9/457: symlink d21/d32/la1 0
2026-03-10T06:22:19.325 INFO:tasks.workunit.client.1.vm06.stdout:8/393: creat d1/d2c/f8a x:0 0 0
2026-03-10T06:22:19.326 INFO:tasks.workunit.client.1.vm06.stdout:8/394: dread - d1/df/d20/d21/f82 zero size
2026-03-10T06:22:19.326 INFO:tasks.workunit.client.1.vm06.stdout:4/396: mkdir dd/d24/d2d/d7c 0
2026-03-10T06:22:19.330 INFO:tasks.workunit.client.1.vm06.stdout:7/485: dwrite d19/d3b/d41/d4c/f55 [0,4194304] 0
2026-03-10T06:22:19.330 INFO:tasks.workunit.client.1.vm06.stdout:8/395: mknod d1/df/d11/d84/c8b 0
2026-03-10T06:22:19.330 INFO:tasks.workunit.client.1.vm06.stdout:6/506: creat d6/fb1 x:0 0 0
2026-03-10T06:22:19.349 INFO:tasks.workunit.client.1.vm06.stdout:6/507: dread d6/dd/d25/d4e/f60 [0,4194304] 0
2026-03-10T06:22:19.350 INFO:tasks.workunit.client.1.vm06.stdout:3/413: dwrite d6/dc/d13/d35/f3b [0,4194304] 0
2026-03-10T06:22:19.357 INFO:tasks.workunit.client.1.vm06.stdout:7/486: dwrite d19/d3b/d41/d42/d62/f86 [0,4194304] 0
2026-03-10T06:22:19.361 INFO:tasks.workunit.client.1.vm06.stdout:7/487: write d19/d3b/f53 [898203,7723] 0
2026-03-10T06:22:19.367 INFO:tasks.workunit.client.1.vm06.stdout:6/508: fsync d6/dd/f5b 0
2026-03-10T06:22:19.374 INFO:tasks.workunit.client.1.vm06.stdout:3/414: symlink d6/dc/d13/d35/d6b/l8f 0
2026-03-10T06:22:19.377 INFO:tasks.workunit.client.1.vm06.stdout:3/415: chown d6/d21/d38/d88/f7d 179617 1
2026-03-10T06:22:19.380 INFO:tasks.workunit.client.1.vm06.stdout:7/488: rename d19/f1d to d19/d3b/d41/d42/d52/fa0 0
2026-03-10T06:22:19.381 INFO:tasks.workunit.client.1.vm06.stdout:3/416: mkdir d6/d21/d38/d39/d90 0
2026-03-10T06:22:19.381 INFO:tasks.workunit.client.1.vm06.stdout:3/417: dread - d6/f84 zero size
2026-03-10T06:22:19.382 INFO:tasks.workunit.client.1.vm06.stdout:3/418: write d6/f84 [214233,32223] 0
2026-03-10T06:22:19.382 INFO:tasks.workunit.client.1.vm06.stdout:7/489: mkdir d19/d3b/d41/d42/d62/d80/da1 0
2026-03-10T06:22:19.384 INFO:tasks.workunit.client.1.vm06.stdout:7/490: read d19/d3b/d41/d42/d62/f86 [2322324,35444] 0
2026-03-10T06:22:19.386 INFO:tasks.workunit.client.1.vm06.stdout:3/419: rename d6/dc/d13/f6e to d6/f91 0
2026-03-10T06:22:19.387 INFO:tasks.workunit.client.1.vm06.stdout:7/491: mknod d19/d3b/d41/d42/d52/d83/ca2 0
2026-03-10T06:22:19.389 INFO:tasks.workunit.client.1.vm06.stdout:3/420: dwrite d6/dc/d41/d6d/f70 [0,4194304] 0
2026-03-10T06:22:19.395 INFO:tasks.workunit.client.1.vm06.stdout:3/421: rmdir d6/dc/d13/d51 39
2026-03-10T06:22:19.401 INFO:tasks.workunit.client.1.vm06.stdout:3/422: getdents d6/dc/d72 0
2026-03-10T06:22:19.406 INFO:tasks.workunit.client.1.vm06.stdout:3/423: dwrite d6/dc/f69 [0,4194304] 0
2026-03-10T06:22:19.407 INFO:tasks.workunit.client.1.vm06.stdout:3/424: rmdir d6/d1a 39
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/425: symlink d6/dc/d41/d6d/l92 0
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/426: dread - d6/dc/d13/f5e zero size
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/427: write d6/d21/f55 [729168,4529] 0
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/428: mknod d6/dc/d13/d35/d6b/c93 0
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/429: readlink d6/dc/d13/l32 0
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:4/397: fsync dd/d72/f78 0
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:4/398: rename dd/d33/d36 to dd/d33/d36/d7d 22
2026-03-10T06:22:19.421 INFO:tasks.workunit.client.1.vm06.stdout:3/430: creat d6/dc/f94 x:0 0 0
2026-03-10T06:22:19.424 INFO:tasks.workunit.client.1.vm06.stdout:5/361: dwrite d8/db/d54/d67/d46/d68/f73 [4194304,4194304] 0
2026-03-10T06:22:19.436 INFO:tasks.workunit.client.1.vm06.stdout:4/399: rename dd/d24/d2d/d2f/d34/c4b to dd/d18/c7e 0
2026-03-10T06:22:19.436 INFO:tasks.workunit.client.1.vm06.stdout:2/407: rmdir da/d13/d1c 39
2026-03-10T06:22:19.442 INFO:tasks.workunit.client.1.vm06.stdout:1/456: truncate d9/f2f 3287614 0
2026-03-10T06:22:19.444 INFO:tasks.workunit.client.1.vm06.stdout:5/362: dread d8/db/d54/f5c [0,4194304] 0
2026-03-10T06:22:19.444 INFO:tasks.workunit.client.1.vm06.stdout:4/400: symlink dd/d24/d2d/l7f 0
2026-03-10T06:22:19.445 INFO:tasks.workunit.client.1.vm06.stdout:4/401: fdatasync dd/f5c 0
2026-03-10T06:22:19.446 INFO:tasks.workunit.client.1.vm06.stdout:2/408: symlink da/d13/d1a/d39/d35/l7f 0
2026-03-10T06:22:19.448 INFO:tasks.workunit.client.1.vm06.stdout:1/457: fdatasync d9/f1a 0
2026-03-10T06:22:19.449 INFO:tasks.workunit.client.1.vm06.stdout:5/363: creat d8/db/d54/d67/d46/f76 x:0 0 0
2026-03-10T06:22:19.451 INFO:tasks.workunit.client.1.vm06.stdout:2/409: write da/d13/d1c/d43/f7a [173538,21879] 0
2026-03-10T06:22:19.452 INFO:tasks.workunit.client.1.vm06.stdout:4/402: fsync dd/f14 0
2026-03-10T06:22:19.453 INFO:tasks.workunit.client.1.vm06.stdout:5/364: dwrite d8/ff [0,4194304] 0
2026-03-10T06:22:19.458 INFO:tasks.workunit.client.1.vm06.stdout:1/458: creat d9/d1b/f7c x:0 0 0
2026-03-10T06:22:19.459 INFO:tasks.workunit.client.1.vm06.stdout:2/410: symlink da/d13/d1a/d39/d35/l80 0
2026-03-10T06:22:19.460 INFO:tasks.workunit.client.1.vm06.stdout:2/411: truncate da/d13/d1c/d43/f7a 312256 0
2026-03-10T06:22:19.461 INFO:tasks.workunit.client.1.vm06.stdout:2/412: read da/d13/d1c/d1d/d44/d53/f67 [148352,37486] 0
2026-03-10T06:22:19.484 INFO:tasks.workunit.client.1.vm06.stdout:1/459: symlink d9/d1b/d20/l7d 0
2026-03-10T06:22:19.486 INFO:tasks.workunit.client.1.vm06.stdout:4/403: creat dd/f80 x:0 0 0
2026-03-10T06:22:19.487 INFO:tasks.workunit.client.1.vm06.stdout:4/404: write dd/d33/f70 [1836153,54303] 0
2026-03-10T06:22:19.487 INFO:tasks.workunit.client.1.vm06.stdout:5/365: getdents d8/db/d54/d67/d22/d39 0
2026-03-10T06:22:19.488 INFO:tasks.workunit.client.1.vm06.stdout:5/366: write d8/db/f1f [6233258,69780] 0
2026-03-10T06:22:19.490 INFO:tasks.workunit.client.1.vm06.stdout:1/460: write d9/d1b/d20/f42 [3409125,117960] 0
2026-03-10T06:22:19.490 INFO:tasks.workunit.client.1.vm06.stdout:1/461: chown d9/df/c4a 16 1
2026-03-10T06:22:19.491 INFO:tasks.workunit.client.1.vm06.stdout:2/413: dread da/d13/d1a/d39/d35/f4a [0,4194304] 0
2026-03-10T06:22:19.492 INFO:tasks.workunit.client.1.vm06.stdout:2/414: stat da/d13/d1c/d1d/d44/d53/c60 0
2026-03-10T06:22:19.495 INFO:tasks.workunit.client.1.vm06.stdout:4/405: creat dd/f81 x:0 0 0
2026-03-10T06:22:19.495 INFO:tasks.workunit.client.1.vm06.stdout:2/415: read da/d13/d1c/d1d/f2a [2728350,65191] 0
2026-03-10T06:22:19.495 INFO:tasks.workunit.client.1.vm06.stdout:2/416: fdatasync f8 0
2026-03-10T06:22:19.496 INFO:tasks.workunit.client.1.vm06.stdout:1/462: creat d9/d35/f7e x:0 0 0
2026-03-10T06:22:19.496 INFO:tasks.workunit.client.1.vm06.stdout:4/406: write dd/d33/d36/f59 [773325,109837] 0
2026-03-10T06:22:19.499 INFO:tasks.workunit.client.1.vm06.stdout:4/407: dread dd/d24/d2d/d2f/d39/f61 [0,4194304] 0
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:1/463: dread d9/df/f3d [0,4194304] 0
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:4/408: chown dd/d24/d5d 45982014 1
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:1/464: rename d9/d35/d46/f6c to d9/d35/f7f 0
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:1/465: write d9/df/f4d [951947,103283] 0
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:4/409: dread dd/d24/d2d/f28 [0,4194304] 0
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:4/410: dread - dd/d24/f69 zero size
2026-03-10T06:22:19.507 INFO:tasks.workunit.client.1.vm06.stdout:1/466: chown d9/d35/d46/d38/d63 776 1
2026-03-10T06:22:19.510 INFO:tasks.workunit.client.1.vm06.stdout:4/411: rename dd/d33/d36/f59 to dd/d24/d2d/d2f/f82 0
2026-03-10T06:22:19.512 INFO:tasks.workunit.client.1.vm06.stdout:1/467: dread d9/d1b/f31 [0,4194304] 0
2026-03-10T06:22:19.521 INFO:tasks.workunit.client.1.vm06.stdout:1/468: rename d9/f17 to d9/d35/d46/d38/d63/f80 0
2026-03-10T06:22:19.521 INFO:tasks.workunit.client.1.vm06.stdout:1/469: fsync d9/f1f 0
2026-03-10T06:22:19.555 INFO:tasks.workunit.client.1.vm06.stdout:8/396: rmdir d1/df/d11 39
2026-03-10T06:22:19.558 INFO:tasks.workunit.client.1.vm06.stdout:9/458: dwrite d21/d32/d4d/f6b [0,4194304] 0
2026-03-10T06:22:19.559 INFO:tasks.workunit.client.1.vm06.stdout:9/459: fsync d21/f3e 0
2026-03-10T06:22:19.563 INFO:tasks.workunit.client.1.vm06.stdout:8/397: link d1/d7/c2e d1/df/d11/c8c 0
2026-03-10T06:22:19.564 INFO:tasks.workunit.client.1.vm06.stdout:9/460: mkdir d21/da2 0
2026-03-10T06:22:19.565 INFO:tasks.workunit.client.1.vm06.stdout:9/461: stat d21/d32/d6e/c3c 0
2026-03-10T06:22:19.578 INFO:tasks.workunit.client.1.vm06.stdout:9/462: dread fd [0,4194304] 0
2026-03-10T06:22:19.580 INFO:tasks.workunit.client.1.vm06.stdout:9/463: getdents d21/da2 0
2026-03-10T06:22:19.581 INFO:tasks.workunit.client.1.vm06.stdout:9/464: creat d21/d27/d56/d8e/d93/fa3 x:0 0 0
2026-03-10T06:22:19.583 INFO:tasks.workunit.client.1.vm06.stdout:9/465: link d21/d32/d4d/l66 d21/d32/la4 0
2026-03-10T06:22:19.584 INFO:tasks.workunit.client.1.vm06.stdout:9/466: write d21/d27/d56/d8e/d93/fa3 [201657,66909] 0
2026-03-10T06:22:19.589 INFO:tasks.workunit.client.1.vm06.stdout:9/467: dread d21/d32/d6e/f2e [0,4194304] 0
2026-03-10T06:22:19.590 INFO:tasks.workunit.client.1.vm06.stdout:9/468: stat d21/d27/f65 0
2026-03-10T06:22:19.602 INFO:tasks.workunit.client.1.vm06.stdout:9/469: dread d21/d27/d50/d57/f58 [0,4194304] 0
2026-03-10T06:22:19.602 INFO:tasks.workunit.client.1.vm06.stdout:9/470: fsync d21/d46/d79/d7f/f91 0
2026-03-10T06:22:19.611 INFO:tasks.workunit.client.1.vm06.stdout:9/471: getdents d21/d27 0
2026-03-10T06:22:19.613 INFO:tasks.workunit.client.1.vm06.stdout:6/509: dwrite d6/d7/d37/d43/f59 [4194304,4194304] 0
2026-03-10T06:22:19.614 INFO:tasks.workunit.client.1.vm06.stdout:7/492: truncate d19/f35 1646244 0
2026-03-10T06:22:19.616 INFO:tasks.workunit.client.1.vm06.stdout:7/493: stat d19/d3b/d41/d42/d52/d83 0
2026-03-10T06:22:19.617
INFO:tasks.workunit.client.1.vm06.stdout:9/472: creat d21/d46/d79/d80/d95/d9b/fa5 x:0 0 0 2026-03-10T06:22:19.619 INFO:tasks.workunit.client.1.vm06.stdout:6/510: dwrite d6/dd/d25/d33/d4d/f8c [0,4194304] 0 2026-03-10T06:22:19.621 INFO:tasks.workunit.client.1.vm06.stdout:3/431: write d6/d1a/f1f [8649727,89667] 0 2026-03-10T06:22:19.625 INFO:tasks.workunit.client.1.vm06.stdout:7/494: read f13 [4281008,97457] 0 2026-03-10T06:22:19.645 INFO:tasks.workunit.client.1.vm06.stdout:0/454: write d0/f9 [3642482,118722] 0 2026-03-10T06:22:19.646 INFO:tasks.workunit.client.1.vm06.stdout:0/455: readlink d0/d3c/d42/d88/d35/d74/l7a 0 2026-03-10T06:22:19.646 INFO:tasks.workunit.client.1.vm06.stdout:0/456: fdatasync d0/dd/f32 0 2026-03-10T06:22:19.648 INFO:tasks.workunit.client.1.vm06.stdout:0/457: write d0/ff [5611990,110795] 0 2026-03-10T06:22:19.649 INFO:tasks.workunit.client.1.vm06.stdout:9/473: readlink d21/d32/l7b 0 2026-03-10T06:22:19.651 INFO:tasks.workunit.client.1.vm06.stdout:6/511: mkdir d6/dd/d2b/db2 0 2026-03-10T06:22:19.660 INFO:tasks.workunit.client.1.vm06.stdout:7/495: readlink d19/d3b/l5e 0 2026-03-10T06:22:19.660 INFO:tasks.workunit.client.1.vm06.stdout:5/367: dwrite d8/db/d54/f5c [0,4194304] 0 2026-03-10T06:22:19.660 INFO:tasks.workunit.client.1.vm06.stdout:6/512: write d6/fa2 [632378,43678] 0 2026-03-10T06:22:19.660 INFO:tasks.workunit.client.1.vm06.stdout:2/417: dwrite da/f19 [4194304,4194304] 0 2026-03-10T06:22:19.660 INFO:tasks.workunit.client.1.vm06.stdout:1/470: truncate d9/d35/d46/f74 676050 0 2026-03-10T06:22:19.666 INFO:tasks.workunit.client.1.vm06.stdout:8/398: truncate d1/f26 387750 0 2026-03-10T06:22:19.667 INFO:tasks.workunit.client.1.vm06.stdout:0/458: symlink d0/d3c/d42/d88/d35/d74/l97 0 2026-03-10T06:22:19.671 INFO:tasks.workunit.client.1.vm06.stdout:3/432: link d6/dc/d13/d35/d6b/f86 d6/dc/d13/d35/f95 0 2026-03-10T06:22:19.672 INFO:tasks.workunit.client.1.vm06.stdout:4/412: dwrite f0 [0,4194304] 0 2026-03-10T06:22:19.675 
INFO:tasks.workunit.client.1.vm06.stdout:3/433: truncate d6/d21/d38/d39/f4c 319224 0 2026-03-10T06:22:19.684 INFO:tasks.workunit.client.1.vm06.stdout:0/459: dwrite d0/dd/f5b [4194304,4194304] 0 2026-03-10T06:22:19.685 INFO:tasks.workunit.client.1.vm06.stdout:2/418: dwrite da/f75 [0,4194304] 0 2026-03-10T06:22:19.696 INFO:tasks.workunit.client.1.vm06.stdout:5/368: creat d8/db/d54/d67/d46/f77 x:0 0 0 2026-03-10T06:22:19.696 INFO:tasks.workunit.client.1.vm06.stdout:1/471: readlink d9/l3f 0 2026-03-10T06:22:19.701 INFO:tasks.workunit.client.1.vm06.stdout:8/399: mkdir d1/df/d20/d21/d7e/d8d 0 2026-03-10T06:22:19.704 INFO:tasks.workunit.client.1.vm06.stdout:7/496: creat d19/d3b/d41/d72/d97/fa3 x:0 0 0 2026-03-10T06:22:19.704 INFO:tasks.workunit.client.1.vm06.stdout:5/369: dread d8/db/d54/d67/d22/f31 [0,4194304] 0 2026-03-10T06:22:19.704 INFO:tasks.workunit.client.1.vm06.stdout:7/497: chown d19/d3b/d5b/l5d 17740 1 2026-03-10T06:22:19.706 INFO:tasks.workunit.client.1.vm06.stdout:9/474: fsync d21/d32/d4d/f64 0 2026-03-10T06:22:19.718 INFO:tasks.workunit.client.1.vm06.stdout:0/460: unlink d0/d3c/d42/d88/d35/f3a 0 2026-03-10T06:22:19.718 INFO:tasks.workunit.client.1.vm06.stdout:3/434: dread d6/d8/f22 [0,4194304] 0 2026-03-10T06:22:19.719 INFO:tasks.workunit.client.1.vm06.stdout:3/435: dread - d6/d8/f49 zero size 2026-03-10T06:22:19.723 INFO:tasks.workunit.client.1.vm06.stdout:8/400: rmdir d1/df/d20 39 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:8/401: dwrite d1/df/f6d [0,4194304] 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:9/475: mknod d21/d46/ca6 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:6/513: link d6/c76 d6/df/d70/daa/cb3 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:2/419: creat da/d13/d1c/d7d/f81 x:0 0 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:2/420: dwrite da/f75 [0,4194304] 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:9/476: truncate 
f9 2306604 0 2026-03-10T06:22:19.748 INFO:tasks.workunit.client.1.vm06.stdout:9/477: write fb [313965,15033] 0 2026-03-10T06:22:19.749 INFO:tasks.workunit.client.1.vm06.stdout:3/436: link d6/d21/f55 d6/d8/f96 0 2026-03-10T06:22:19.750 INFO:tasks.workunit.client.1.vm06.stdout:2/421: symlink da/d13/d1a/d39/d4b/l82 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:3/437: write d6/f53 [1164230,106901] 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:2/422: write da/d13/d1c/f76 [1029589,127165] 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:2/423: chown da/d13/d1c/f41 2044434 1 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:2/424: dwrite da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:1/472: creat d9/d1b/f81 x:0 0 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:9/478: rename d21/d27/d56/d8e to d21/da2/da7 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:9/479: fdatasync d21/d32/f71 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:1/473: chown d9/d1b/d20/d44 3 1 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:1/474: dwrite d9/d1b/d20/f42 [0,4194304] 0 2026-03-10T06:22:19.769 INFO:tasks.workunit.client.1.vm06.stdout:9/480: rename d21/d46/l4f to d21/d46/d79/d80/d95/la8 0 2026-03-10T06:22:19.770 INFO:tasks.workunit.client.1.vm06.stdout:1/475: chown d9/df/l28 4652562 1 2026-03-10T06:22:19.775 INFO:tasks.workunit.client.1.vm06.stdout:7/498: sync 2026-03-10T06:22:19.777 INFO:tasks.workunit.client.1.vm06.stdout:9/481: creat d21/d27/d50/d57/fa9 x:0 0 0 2026-03-10T06:22:19.779 INFO:tasks.workunit.client.1.vm06.stdout:9/482: creat d21/d46/d79/faa x:0 0 0 2026-03-10T06:22:19.792 INFO:tasks.workunit.client.1.vm06.stdout:9/483: rename l1e to d21/d27/d56/lab 0 2026-03-10T06:22:19.793 INFO:tasks.workunit.client.1.vm06.stdout:9/484: rmdir d21/d32/d4d 39 2026-03-10T06:22:19.793 
INFO:tasks.workunit.client.1.vm06.stdout:9/485: mknod d21/d27/d50/cac 0 2026-03-10T06:22:19.793 INFO:tasks.workunit.client.1.vm06.stdout:9/486: write d21/d32/d4d/d51/d67/f81 [824091,48976] 0 2026-03-10T06:22:19.796 INFO:tasks.workunit.client.1.vm06.stdout:6/514: sync 2026-03-10T06:22:19.796 INFO:tasks.workunit.client.1.vm06.stdout:3/438: sync 2026-03-10T06:22:19.797 INFO:tasks.workunit.client.1.vm06.stdout:9/487: write d21/d27/d50/d57/f58 [214220,100874] 0 2026-03-10T06:22:19.797 INFO:tasks.workunit.client.1.vm06.stdout:0/461: sync 2026-03-10T06:22:19.798 INFO:tasks.workunit.client.1.vm06.stdout:6/515: chown d6/d7/c23 0 1 2026-03-10T06:22:19.802 INFO:tasks.workunit.client.1.vm06.stdout:0/462: write d0/d3c/d42/d88/f8a [910500,72545] 0 2026-03-10T06:22:19.802 INFO:tasks.workunit.client.1.vm06.stdout:6/516: stat d6/df/d40 0 2026-03-10T06:22:19.804 INFO:tasks.workunit.client.1.vm06.stdout:6/517: stat d6/fa8 0 2026-03-10T06:22:19.806 INFO:tasks.workunit.client.1.vm06.stdout:3/439: dwrite d6/d1a/f4a [0,4194304] 0 2026-03-10T06:22:19.806 INFO:tasks.workunit.client.1.vm06.stdout:9/488: unlink d21/d32/d4d/d51/c68 0 2026-03-10T06:22:19.806 INFO:tasks.workunit.client.1.vm06.stdout:3/440: dread - d6/dc/f94 zero size 2026-03-10T06:22:19.807 INFO:tasks.workunit.client.1.vm06.stdout:6/518: read - d6/df/f6f zero size 2026-03-10T06:22:19.818 INFO:tasks.workunit.client.1.vm06.stdout:6/519: mkdir d6/d79/d95/db4 0 2026-03-10T06:22:19.821 INFO:tasks.workunit.client.1.vm06.stdout:6/520: dwrite f3 [0,4194304] 0 2026-03-10T06:22:19.825 INFO:tasks.workunit.client.1.vm06.stdout:6/521: creat d6/df/d40/d99/fb5 x:0 0 0 2026-03-10T06:22:19.841 INFO:tasks.workunit.client.1.vm06.stdout:6/522: sync 2026-03-10T06:22:19.841 INFO:tasks.workunit.client.1.vm06.stdout:6/523: chown d6/dd/d35 25940 1 2026-03-10T06:22:19.843 INFO:tasks.workunit.client.1.vm06.stdout:6/524: truncate d6/dd/d25/f5c 376291 0 2026-03-10T06:22:19.871 INFO:tasks.workunit.client.1.vm06.stdout:6/525: sync 2026-03-10T06:22:19.883 
INFO:tasks.workunit.client.1.vm06.stdout:0/463: dread d0/f9 [0,4194304] 0 2026-03-10T06:22:19.907 INFO:tasks.workunit.client.1.vm06.stdout:9/489: write d21/f2a [1612798,75726] 0 2026-03-10T06:22:19.909 INFO:tasks.workunit.client.1.vm06.stdout:9/490: truncate d21/d46/d79/d80/d95/d9b/fa5 60767 0 2026-03-10T06:22:19.912 INFO:tasks.workunit.client.1.vm06.stdout:4/413: truncate dd/d33/f58 2804369 0 2026-03-10T06:22:19.912 INFO:tasks.workunit.client.1.vm06.stdout:4/414: fsync dd/d24/d5e/f6e 0 2026-03-10T06:22:19.914 INFO:tasks.workunit.client.1.vm06.stdout:5/370: dwrite d8/db/d54/d67/d22/d74/f42 [0,4194304] 0 2026-03-10T06:22:19.920 INFO:tasks.workunit.client.1.vm06.stdout:9/491: mknod d21/d27/d3a/cad 0 2026-03-10T06:22:19.921 INFO:tasks.workunit.client.1.vm06.stdout:1/476: rmdir d9 39 2026-03-10T06:22:19.921 INFO:tasks.workunit.client.1.vm06.stdout:4/415: write dd/d24/f3d [4383861,56142] 0 2026-03-10T06:22:19.922 INFO:tasks.workunit.client.1.vm06.stdout:4/416: chown dd/d24/d2d/f28 37 1 2026-03-10T06:22:19.925 INFO:tasks.workunit.client.1.vm06.stdout:4/417: dwrite dd/f12 [0,4194304] 0 2026-03-10T06:22:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:19 vm04.local ceph-mon[51058]: Reconfiguring prometheus.vm04 (dependencies changed)... 
2026-03-10T06:22:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:19 vm04.local ceph-mon[51058]: pgmap v8: 65 pgs: 65 active+clean; 794 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 16 MiB/s rd, 78 MiB/s wr, 148 op/s 2026-03-10T06:22:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:19 vm04.local ceph-mon[51058]: Reconfiguring daemon prometheus.vm04 on vm04 2026-03-10T06:22:19.927 INFO:tasks.workunit.client.1.vm06.stdout:4/418: fsync dd/d24/d2d/f3b 0 2026-03-10T06:22:19.930 INFO:tasks.workunit.client.1.vm06.stdout:8/402: truncate d1/df/d20/d21/d5e/d79/f7f 604431 0 2026-03-10T06:22:19.932 INFO:tasks.workunit.client.1.vm06.stdout:8/403: write d1/df/d11/f53 [763840,44706] 0 2026-03-10T06:22:19.933 INFO:tasks.workunit.client.1.vm06.stdout:2/425: dwrite da/d13/d1c/d1d/f2a [0,4194304] 0 2026-03-10T06:22:19.934 INFO:tasks.workunit.client.1.vm06.stdout:9/492: unlink d21/d32/d4d/c8c 0 2026-03-10T06:22:19.938 INFO:tasks.workunit.client.1.vm06.stdout:7/499: write d19/d3b/f68 [2929569,84218] 0 2026-03-10T06:22:19.938 INFO:tasks.workunit.client.1.vm06.stdout:4/419: rmdir dd/d24/d2d/d2f 39 2026-03-10T06:22:19.938 INFO:tasks.workunit.client.1.vm06.stdout:4/420: fdatasync dd/d41/f60 0 2026-03-10T06:22:19.946 INFO:tasks.workunit.client.1.vm06.stdout:2/426: fsync da/d13/d1c/f7e 0 2026-03-10T06:22:19.947 INFO:tasks.workunit.client.1.vm06.stdout:9/493: creat d21/d27/d50/d57/fae x:0 0 0 2026-03-10T06:22:19.953 INFO:tasks.workunit.client.1.vm06.stdout:5/371: link d8/db/d54/d67/d46/f77 d8/db/d54/d67/d22/d74/f78 0 2026-03-10T06:22:19.954 INFO:tasks.workunit.client.1.vm06.stdout:4/421: dwrite dd/d18/f1f [4194304,4194304] 0 2026-03-10T06:22:19.955 INFO:tasks.workunit.client.1.vm06.stdout:2/427: read da/d13/d1a/d39/f70 [1220287,62890] 0 2026-03-10T06:22:19.956 INFO:tasks.workunit.client.1.vm06.stdout:9/494: rename d21/d32/f71 to d21/d27/faf 0 2026-03-10T06:22:19.956 INFO:tasks.workunit.client.1.vm06.stdout:4/422: chown dd/d18/l1c 1938102067 1 
2026-03-10T06:22:19.956 INFO:tasks.workunit.client.1.vm06.stdout:3/441: truncate d6/d1a/f1f 5218049 0 2026-03-10T06:22:19.956 INFO:tasks.workunit.client.1.vm06.stdout:6/526: truncate d6/dd/d25/d2c/f9c 3543098 0 2026-03-10T06:22:19.958 INFO:tasks.workunit.client.1.vm06.stdout:5/372: mknod d8/db/c79 0 2026-03-10T06:22:19.965 INFO:tasks.workunit.client.1.vm06.stdout:9/495: mkdir d21/d32/d4d/d51/db0 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:3/442: creat d6/d8/f97 x:0 0 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:4/423: mkdir dd/d24/d2d/d2f/d34/d83 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:9/496: mknod d21/da2/da7/cb1 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:2/428: fdatasync da/d13/d1c/d1d/d44/d53/d61/d68/f6b 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:5/373: rename d8/l2b to d8/l7a 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:9/497: write fb [5019023,37276] 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:2/429: symlink da/d13/d5e/l83 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:3/443: symlink d6/dc/d13/l98 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:2/430: fsync da/d13/d1a/f3a 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:3/444: chown d6/d21/f58 270566435 1 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:5/374: rename d8/d9/c21 to d8/db/d54/d55/c7b 0 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:4/424: rmdir dd/d24/d2d/d2f 39 2026-03-10T06:22:19.966 INFO:tasks.workunit.client.1.vm06.stdout:4/425: readlink dd/d18/l54 0 2026-03-10T06:22:19.967 INFO:tasks.workunit.client.1.vm06.stdout:9/498: getdents d21/d46/d79 0 2026-03-10T06:22:19.968 INFO:tasks.workunit.client.1.vm06.stdout:4/426: write dd/d18/f1f [769431,29916] 0 2026-03-10T06:22:19.968 INFO:tasks.workunit.client.1.vm06.stdout:3/445: creat d6/d21/f99 x:0 0 
0 2026-03-10T06:22:19.969 INFO:tasks.workunit.client.1.vm06.stdout:9/499: write d21/d32/d4d/d51/d67/f81 [1127076,49046] 0 2026-03-10T06:22:19.969 INFO:tasks.workunit.client.1.vm06.stdout:4/427: chown dd/d18/f5f 33283522 1 2026-03-10T06:22:19.969 INFO:tasks.workunit.client.1.vm06.stdout:5/375: symlink d8/db/d54/d67/d46/d6e/l7c 0 2026-03-10T06:22:19.970 INFO:tasks.workunit.client.1.vm06.stdout:5/376: chown d8/db 0 1 2026-03-10T06:22:19.971 INFO:tasks.workunit.client.1.vm06.stdout:9/500: write d21/d27/d3a/f83 [943980,37225] 0 2026-03-10T06:22:19.978 INFO:tasks.workunit.client.1.vm06.stdout:8/404: sync 2026-03-10T06:22:19.978 INFO:tasks.workunit.client.1.vm06.stdout:7/500: sync 2026-03-10T06:22:19.978 INFO:tasks.workunit.client.1.vm06.stdout:4/428: dread - dd/d24/d2d/d2f/d39/f4a zero size 2026-03-10T06:22:19.984 INFO:tasks.workunit.client.1.vm06.stdout:5/377: readlink d8/db/d54/d67/d46/d6e/l7c 0 2026-03-10T06:22:19.984 INFO:tasks.workunit.client.1.vm06.stdout:2/431: dwrite da/d13/f1f [0,4194304] 0 2026-03-10T06:22:19.987 INFO:tasks.workunit.client.1.vm06.stdout:2/432: write da/d13/d1a/f21 [2694765,8464] 0 2026-03-10T06:22:19.987 INFO:tasks.workunit.client.1.vm06.stdout:7/501: rename d19/d3b/d41/d4c/f8d to d19/d3b/d5b/fa4 0 2026-03-10T06:22:19.991 INFO:tasks.workunit.client.1.vm06.stdout:9/501: unlink d21/c54 0 2026-03-10T06:22:19.998 INFO:tasks.workunit.client.1.vm06.stdout:3/446: dwrite d6/d21/f7b [0,4194304] 0 2026-03-10T06:22:20.004 INFO:tasks.workunit.client.1.vm06.stdout:0/464: write d0/f61 [2905295,117694] 0 2026-03-10T06:22:20.004 INFO:tasks.workunit.client.1.vm06.stdout:0/465: write d0/dd/f24 [3741880,25537] 0 2026-03-10T06:22:20.004 INFO:tasks.workunit.client.1.vm06.stdout:8/405: symlink d1/df/d20/d21/d7e/d8d/l8e 0 2026-03-10T06:22:20.005 INFO:tasks.workunit.client.1.vm06.stdout:9/502: write d21/d32/d4d/d51/f87 [998242,57559] 0 2026-03-10T06:22:20.007 INFO:tasks.workunit.client.1.vm06.stdout:7/502: mkdir d19/d3b/d5b/d9a/da5 0 2026-03-10T06:22:20.009 
INFO:tasks.workunit.client.1.vm06.stdout:0/466: mkdir d0/d3c/d42/d88/d98 0 2026-03-10T06:22:20.009 INFO:tasks.workunit.client.1.vm06.stdout:2/433: creat da/f84 x:0 0 0 2026-03-10T06:22:20.012 INFO:tasks.workunit.client.1.vm06.stdout:9/503: rename d21/d46/d79 to d21/d27/d50/d57/db2 0 2026-03-10T06:22:20.017 INFO:tasks.workunit.client.1.vm06.stdout:2/434: write da/d13/d1a/d39/d35/f74 [839279,122283] 0 2026-03-10T06:22:20.017 INFO:tasks.workunit.client.1.vm06.stdout:3/447: getdents d6/d21/d38 0 2026-03-10T06:22:20.017 INFO:tasks.workunit.client.1.vm06.stdout:7/503: link d19/d3b/f7b d19/d3b/d5b/d9a/da5/fa6 0 2026-03-10T06:22:20.017 INFO:tasks.workunit.client.1.vm06.stdout:9/504: creat d21/d27/fb3 x:0 0 0 2026-03-10T06:22:20.017 INFO:tasks.workunit.client.1.vm06.stdout:9/505: stat d21/d27/d50/d57/c75 0 2026-03-10T06:22:20.019 INFO:tasks.workunit.client.1.vm06.stdout:7/504: getdents d19/d3b/d41/d42/d52/d83 0 2026-03-10T06:22:20.019 INFO:tasks.workunit.client.1.vm06.stdout:3/448: dwrite d6/d21/f58 [0,4194304] 0 2026-03-10T06:22:20.021 INFO:tasks.workunit.client.1.vm06.stdout:7/505: chown d19/d3b/c7e 4714418 1 2026-03-10T06:22:20.021 INFO:tasks.workunit.client.1.vm06.stdout:7/506: readlink l17 0 2026-03-10T06:22:20.021 INFO:tasks.workunit.client.1.vm06.stdout:9/506: creat d21/d32/d4d/fb4 x:0 0 0 2026-03-10T06:22:20.022 INFO:tasks.workunit.client.1.vm06.stdout:7/507: symlink d19/d3b/d41/d4c/la7 0 2026-03-10T06:22:20.022 INFO:tasks.workunit.client.1.vm06.stdout:0/467: dwrite d0/dd/f32 [4194304,4194304] 0 2026-03-10T06:22:20.024 INFO:tasks.workunit.client.1.vm06.stdout:7/508: truncate d19/d3b/d41/d42/d62/d80/d82/f90 826236 0 2026-03-10T06:22:20.031 INFO:tasks.workunit.client.1.vm06.stdout:0/468: unlink d0/d3c/d42/l25 0 2026-03-10T06:22:20.031 INFO:tasks.workunit.client.1.vm06.stdout:7/509: truncate d19/f99 382706 0 2026-03-10T06:22:20.031 INFO:tasks.workunit.client.1.vm06.stdout:7/510: dread - d19/d3b/f6b zero size 2026-03-10T06:22:20.033 
INFO:tasks.workunit.client.1.vm06.stdout:9/507: write d21/d27/d50/d57/db2/d80/d95/d9b/fa5 [709903,3308] 0 2026-03-10T06:22:20.034 INFO:tasks.workunit.client.1.vm06.stdout:7/511: chown d19/l34 5214 1 2026-03-10T06:22:20.036 INFO:tasks.workunit.client.1.vm06.stdout:7/512: write d19/d3b/d41/d4c/f4e [1016256,127794] 0 2026-03-10T06:22:20.038 INFO:tasks.workunit.client.1.vm06.stdout:1/477: dwrite d9/f1a [0,4194304] 0 2026-03-10T06:22:20.048 INFO:tasks.workunit.client.1.vm06.stdout:7/513: mkdir d19/d3b/d41/d42/d52/d83/d9d/da8 0 2026-03-10T06:22:20.054 INFO:tasks.workunit.client.1.vm06.stdout:1/478: creat d9/d35/d46/d38/f82 x:0 0 0 2026-03-10T06:22:20.055 INFO:tasks.workunit.client.1.vm06.stdout:1/479: write d9/d1b/f7c [525240,75995] 0 2026-03-10T06:22:20.059 INFO:tasks.workunit.client.1.vm06.stdout:3/449: dread d6/d1a/f1f [0,4194304] 0 2026-03-10T06:22:20.059 INFO:tasks.workunit.client.1.vm06.stdout:1/480: unlink d9/d1b/d20/f55 0 2026-03-10T06:22:20.059 INFO:tasks.workunit.client.1.vm06.stdout:1/481: chown d9/df/f14 607984533 1 2026-03-10T06:22:20.060 INFO:tasks.workunit.client.1.vm06.stdout:1/482: stat d9/d1b/f7b 0 2026-03-10T06:22:20.061 INFO:tasks.workunit.client.1.vm06.stdout:3/450: write d6/d21/d38/f50 [3376430,54318] 0 2026-03-10T06:22:20.062 INFO:tasks.workunit.client.1.vm06.stdout:3/451: write d6/dc/d13/f5e [80286,62213] 0 2026-03-10T06:22:20.062 INFO:tasks.workunit.client.1.vm06.stdout:3/452: chown d6/dc/d13/d35/f5a 225990231 1 2026-03-10T06:22:20.066 INFO:tasks.workunit.client.1.vm06.stdout:1/483: unlink d9/d1b/d20/l3b 0 2026-03-10T06:22:20.066 INFO:tasks.workunit.client.1.vm06.stdout:3/453: link d6/dc/d41/d6d/l92 d6/dc/d13/l9a 0 2026-03-10T06:22:20.072 INFO:tasks.workunit.client.1.vm06.stdout:3/454: fdatasync d6/d21/d38/d88/f7d 0 2026-03-10T06:22:20.075 INFO:tasks.workunit.client.1.vm06.stdout:3/455: rmdir d6/d21/d38/d39 39 2026-03-10T06:22:20.082 INFO:tasks.workunit.client.1.vm06.stdout:3/456: stat d6/dc/d41/d6d/l92 0 2026-03-10T06:22:20.083 
INFO:tasks.workunit.client.1.vm06.stdout:0/469: dread d0/d3c/d42/f41 [0,4194304] 0 2026-03-10T06:22:20.083 INFO:tasks.workunit.client.1.vm06.stdout:9/508: dread d21/d32/d4d/d51/d67/f6a [0,4194304] 0 2026-03-10T06:22:20.084 INFO:tasks.workunit.client.1.vm06.stdout:3/457: write d6/f63 [381721,44703] 0 2026-03-10T06:22:20.084 INFO:tasks.workunit.client.1.vm06.stdout:3/458: chown d6/d1a/d5b 5832910 1 2026-03-10T06:22:20.086 INFO:tasks.workunit.client.1.vm06.stdout:3/459: unlink d6/d1a/d5b/l68 0 2026-03-10T06:22:20.095 INFO:tasks.workunit.client.1.vm06.stdout:3/460: symlink d6/d21/l9b 0 2026-03-10T06:22:20.096 INFO:tasks.workunit.client.1.vm06.stdout:6/527: truncate d6/df/f1e 2985568 0 2026-03-10T06:22:20.098 INFO:tasks.workunit.client.1.vm06.stdout:6/528: readlink d6/dd/d25/d2c/l63 0 2026-03-10T06:22:20.103 INFO:tasks.workunit.client.1.vm06.stdout:6/529: write d6/dd/d25/fa0 [4496238,80252] 0 2026-03-10T06:22:20.106 INFO:tasks.workunit.client.1.vm06.stdout:6/530: creat d6/dd/d25/d2c/fb6 x:0 0 0 2026-03-10T06:22:20.108 INFO:tasks.workunit.client.1.vm06.stdout:4/429: dwrite dd/d24/d2d/d2f/f42 [0,4194304] 0 2026-03-10T06:22:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:19 vm06.local ceph-mon[58974]: Reconfiguring prometheus.vm04 (dependencies changed)... 
2026-03-10T06:22:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:19 vm06.local ceph-mon[58974]: pgmap v8: 65 pgs: 65 active+clean; 794 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 16 MiB/s rd, 78 MiB/s wr, 148 op/s 2026-03-10T06:22:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:19 vm06.local ceph-mon[58974]: Reconfiguring daemon prometheus.vm04 on vm04 2026-03-10T06:22:20.141 INFO:tasks.workunit.client.1.vm06.stdout:4/430: sync 2026-03-10T06:22:20.142 INFO:tasks.workunit.client.1.vm06.stdout:4/431: readlink dd/d24/d2d/d2f/d34/d40/l46 0 2026-03-10T06:22:20.143 INFO:tasks.workunit.client.1.vm06.stdout:4/432: chown dd/d18/l1a 173234 1 2026-03-10T06:22:20.145 INFO:tasks.workunit.client.1.vm06.stdout:4/433: creat dd/d33/f84 x:0 0 0 2026-03-10T06:22:20.158 INFO:tasks.workunit.client.1.vm06.stdout:4/434: unlink dd/d18/c2e 0 2026-03-10T06:22:20.166 INFO:tasks.workunit.client.1.vm06.stdout:2/435: truncate da/d13/d1c/f76 1133854 0 2026-03-10T06:22:20.166 INFO:tasks.workunit.client.1.vm06.stdout:4/435: chown dd/d18/l4f 13920713 1 2026-03-10T06:22:20.178 INFO:tasks.workunit.client.1.vm06.stdout:0/470: truncate d0/d3c/d42/d88/d35/f51 3166780 0 2026-03-10T06:22:20.180 INFO:tasks.workunit.client.1.vm06.stdout:8/406: dwrite d1/df/d11/f45 [0,4194304] 0 2026-03-10T06:22:20.180 INFO:tasks.workunit.client.1.vm06.stdout:1/484: write d9/d35/f56 [2492631,116783] 0 2026-03-10T06:22:20.182 INFO:tasks.workunit.client.1.vm06.stdout:5/378: dwrite d8/db/d54/d67/d22/d74/f2f [0,4194304] 0 2026-03-10T06:22:20.182 INFO:tasks.workunit.client.1.vm06.stdout:4/436: unlink dd/d18/c22 0 2026-03-10T06:22:20.183 INFO:tasks.workunit.client.1.vm06.stdout:5/379: chown d8/db/c58 285 1 2026-03-10T06:22:20.183 INFO:tasks.workunit.client.1.vm06.stdout:7/514: dwrite d19/f35 [0,4194304] 0 2026-03-10T06:22:20.185 INFO:tasks.workunit.client.1.vm06.stdout:7/515: fsync d19/d3b/d41/f54 0 2026-03-10T06:22:20.185 INFO:tasks.workunit.client.1.vm06.stdout:5/380: readlink d8/l1b 0 
2026-03-10T06:22:20.187 INFO:tasks.workunit.client.1.vm06.stdout:8/407: chown d1/df/d58/f86 2701 1 2026-03-10T06:22:20.192 INFO:tasks.workunit.client.1.vm06.stdout:4/437: readlink dd/d18/l6f 0 2026-03-10T06:22:20.197 INFO:tasks.workunit.client.1.vm06.stdout:8/408: write d1/df/d20/f19 [5346081,75385] 0 2026-03-10T06:22:20.197 INFO:tasks.workunit.client.1.vm06.stdout:3/461: dwrite d6/dc/d41/f82 [0,4194304] 0 2026-03-10T06:22:20.197 INFO:tasks.workunit.client.1.vm06.stdout:4/438: write dd/f14 [2339463,118513] 0 2026-03-10T06:22:20.209 INFO:tasks.workunit.client.1.vm06.stdout:5/381: dwrite d8/db/d54/d67/d22/d39/f41 [0,4194304] 0 2026-03-10T06:22:20.224 INFO:tasks.workunit.client.1.vm06.stdout:4/439: symlink dd/l85 0 2026-03-10T06:22:20.224 INFO:tasks.workunit.client.1.vm06.stdout:8/409: read d1/f13 [2289195,47700] 0 2026-03-10T06:22:20.225 INFO:tasks.workunit.client.1.vm06.stdout:4/440: dread - dd/f81 zero size 2026-03-10T06:22:20.226 INFO:tasks.workunit.client.1.vm06.stdout:5/382: getdents d8/d9 0 2026-03-10T06:22:20.227 INFO:tasks.workunit.client.1.vm06.stdout:5/383: write d8/db/d54/d67/d22/d74/f29 [194244,113605] 0 2026-03-10T06:22:20.232 INFO:tasks.workunit.client.1.vm06.stdout:2/436: dread da/d13/d1c/f7e [0,4194304] 0 2026-03-10T06:22:20.233 INFO:tasks.workunit.client.1.vm06.stdout:9/509: dwrite f9 [0,4194304] 0 2026-03-10T06:22:20.235 INFO:tasks.workunit.client.1.vm06.stdout:4/441: symlink dd/d24/d2d/d2f/d39/l86 0 2026-03-10T06:22:20.235 INFO:tasks.workunit.client.1.vm06.stdout:2/437: write da/d13/d1c/f41 [3944344,25345] 0 2026-03-10T06:22:20.235 INFO:tasks.workunit.client.1.vm06.stdout:8/410: symlink d1/df/d20/d21/d7e/l8f 0 2026-03-10T06:22:20.236 INFO:tasks.workunit.client.1.vm06.stdout:8/411: readlink d1/df/l87 0 2026-03-10T06:22:20.247 INFO:tasks.workunit.client.1.vm06.stdout:2/438: creat da/d13/d1c/d1d/d44/d48/d56/f85 x:0 0 0 2026-03-10T06:22:20.248 INFO:tasks.workunit.client.1.vm06.stdout:5/384: getdents d8 0 2026-03-10T06:22:20.248 
INFO:tasks.workunit.client.1.vm06.stdout:4/442: creat dd/d24/d2d/d2f/d34/d83/f87 x:0 0 0
2026-03-10T06:22:20.250 INFO:tasks.workunit.client.1.vm06.stdout:4/443: write f2 [12618154,165] 0
2026-03-10T06:22:20.253 INFO:tasks.workunit.client.1.vm06.stdout:6/531: dread d6/d7/d37/f3d [0,4194304] 0
2026-03-10T06:22:20.255 INFO:tasks.workunit.client.1.vm06.stdout:6/532: fsync d6/d7/f36 0
2026-03-10T06:22:20.255 INFO:tasks.workunit.client.1.vm06.stdout:4/444: link dd/d24/d2d/f28 dd/d33/d47/f88 0
2026-03-10T06:22:20.259 INFO:tasks.workunit.client.1.vm06.stdout:4/445: dread - dd/d33/f84 zero size
2026-03-10T06:22:20.260 INFO:tasks.workunit.client.1.vm06.stdout:6/533: dwrite d6/dd/d25/d2c/f32 [0,4194304] 0
2026-03-10T06:22:20.267 INFO:tasks.workunit.client.1.vm06.stdout:4/446: creat dd/d24/d2d/d2f/d34/d40/f89 x:0 0 0
2026-03-10T06:22:20.271 INFO:tasks.workunit.client.1.vm06.stdout:4/447: creat dd/d24/d2d/d2f/d34/d40/f8a x:0 0 0
2026-03-10T06:22:20.274 INFO:tasks.workunit.client.1.vm06.stdout:6/534: link d6/d7/d37/c53 d6/dd/d35/cb7 0
2026-03-10T06:22:20.274 INFO:tasks.workunit.client.1.vm06.stdout:6/535: readlink d6/d7/l34 0
2026-03-10T06:22:20.279 INFO:tasks.workunit.client.1.vm06.stdout:6/536: read d6/dd/d25/f3f [7292174,24482] 0
2026-03-10T06:22:20.280 INFO:tasks.workunit.client.1.vm06.stdout:4/448: dwrite dd/f12 [0,4194304] 0
2026-03-10T06:22:20.282 INFO:tasks.workunit.client.1.vm06.stdout:6/537: stat d6/dd/d25/c8b 0
2026-03-10T06:22:20.289 INFO:tasks.workunit.client.1.vm06.stdout:4/449: creat dd/d33/f8b x:0 0 0
2026-03-10T06:22:20.309 INFO:tasks.workunit.client.1.vm06.stdout:4/450: getdents dd/d24/d2d/d2f/d34 0
2026-03-10T06:22:20.309 INFO:tasks.workunit.client.1.vm06.stdout:4/451: chown dd/d33/f70 428569 1
2026-03-10T06:22:20.345 INFO:tasks.workunit.client.1.vm06.stdout:3/462: link d6/dc/d41/d6d/l77 d6/d21/d38/l9c 0
2026-03-10T06:22:20.350 INFO:tasks.workunit.client.1.vm06.stdout:9/510: mknod d21/d32/d4d/cb5 0
2026-03-10T06:22:20.351 INFO:tasks.workunit.client.1.vm06.stdout:8/412: mkdir d1/d2c/d90 0
2026-03-10T06:22:20.356 INFO:tasks.workunit.client.1.vm06.stdout:8/413: mknod d1/df/d20/d21/d7e/c91 0
2026-03-10T06:22:20.357 INFO:tasks.workunit.client.1.vm06.stdout:8/414: read - d1/f75 zero size
2026-03-10T06:22:20.361 INFO:tasks.workunit.client.1.vm06.stdout:1/485: write d9/d1b/d20/f43 [1787636,34414] 0
2026-03-10T06:22:20.361 INFO:tasks.workunit.client.1.vm06.stdout:1/486: stat d9/d1b/d20/d44/l4c 0
2026-03-10T06:22:20.361 INFO:tasks.workunit.client.1.vm06.stdout:1/487: chown d9/d35/d46 74819 1
2026-03-10T06:22:20.364 INFO:tasks.workunit.client.1.vm06.stdout:1/488: mkdir d9/d35/d46/d38/d63/d83 0
2026-03-10T06:22:20.366 INFO:tasks.workunit.client.1.vm06.stdout:1/489: fsync d9/d35/d46/f71 0
2026-03-10T06:22:20.367 INFO:tasks.workunit.client.1.vm06.stdout:1/490: write d9/d1b/d20/d44/f54 [1573576,98406] 0
2026-03-10T06:22:20.367 INFO:tasks.workunit.client.1.vm06.stdout:2/439: rmdir da 39
2026-03-10T06:22:20.369 INFO:tasks.workunit.client.1.vm06.stdout:2/440: write da/d13/f5b [4577081,119868] 0
2026-03-10T06:22:20.381 INFO:tasks.workunit.client.1.vm06.stdout:5/385: dwrite d8/db/d54/d67/d22/d74/f3b [0,4194304] 0
2026-03-10T06:22:20.381 INFO:tasks.workunit.client.1.vm06.stdout:1/491: dwrite d9/d62/f76 [0,4194304] 0
2026-03-10T06:22:20.383 INFO:tasks.workunit.client.1.vm06.stdout:5/386: dread - d8/db/d54/d67/d46/f77 zero size
2026-03-10T06:22:20.392 INFO:tasks.workunit.client.1.vm06.stdout:2/441: dwrite da/d13/d1c/d1d/d44/d48/d56/f58 [0,4194304] 0
2026-03-10T06:22:20.402 INFO:tasks.workunit.client.1.vm06.stdout:2/442: mkdir da/d13/d1a/d39/d4b/d86 0
2026-03-10T06:22:20.407 INFO:tasks.workunit.client.1.vm06.stdout:2/443: readlink da/d13/d5e/l6c 0
2026-03-10T06:22:20.408 INFO:tasks.workunit.client.1.vm06.stdout:5/387: dwrite d8/db/d54/d67/d22/d74/f62 [0,4194304] 0
2026-03-10T06:22:20.413 INFO:tasks.workunit.client.1.vm06.stdout:5/388: write d8/db/d54/f5c [935984,55495] 0
2026-03-10T06:22:20.423 INFO:tasks.workunit.client.1.vm06.stdout:2/444: dread da/d13/d1c/d1d/f26 [0,4194304] 0
2026-03-10T06:22:20.425 INFO:tasks.workunit.client.1.vm06.stdout:1/492: sync
2026-03-10T06:22:20.429 INFO:tasks.workunit.client.1.vm06.stdout:1/493: read d9/d1b/f7b [1208,22433] 0
2026-03-10T06:22:20.431 INFO:tasks.workunit.client.1.vm06.stdout:2/445: dwrite da/f84 [0,4194304] 0
2026-03-10T06:22:20.431 INFO:tasks.workunit.client.1.vm06.stdout:2/446: stat da/d13/d1a/d39/d35/l80 0
2026-03-10T06:22:20.432 INFO:tasks.workunit.client.1.vm06.stdout:2/447: fsync da/d13/d5e/f64 0
2026-03-10T06:22:20.434 INFO:tasks.workunit.client.1.vm06.stdout:1/494: chown d9/d35/d46 20160148 1
2026-03-10T06:22:20.441 INFO:tasks.workunit.client.1.vm06.stdout:1/495: getdents d9/d35/d46 0
2026-03-10T06:22:20.443 INFO:tasks.workunit.client.1.vm06.stdout:1/496: chown d9/d35/f7f 8967 1
2026-03-10T06:22:20.490 INFO:tasks.workunit.client.1.vm06.stdout:4/452: creat dd/d24/f8c x:0 0 0
2026-03-10T06:22:20.491 INFO:tasks.workunit.client.1.vm06.stdout:4/453: readlink dd/l1b 0
2026-03-10T06:22:20.491 INFO:tasks.workunit.client.1.vm06.stdout:4/454: chown dd 72 1
2026-03-10T06:22:20.513 INFO:tasks.workunit.client.1.vm06.stdout:8/415: dread d1/fa [4194304,4194304] 0
2026-03-10T06:22:20.515 INFO:tasks.workunit.client.1.vm06.stdout:4/455: dwrite dd/d33/f56 [0,4194304] 0
2026-03-10T06:22:20.525 INFO:tasks.workunit.client.1.vm06.stdout:4/456: dread - dd/d24/d2d/d2f/d34/d40/f89 zero size
2026-03-10T06:22:20.528 INFO:tasks.workunit.client.1.vm06.stdout:4/457: chown dd/l85 7538141 1
2026-03-10T06:22:20.536 INFO:tasks.workunit.client.1.vm06.stdout:8/416: dwrite d1/f1b [0,4194304] 0
2026-03-10T06:22:20.541 INFO:tasks.workunit.client.1.vm06.stdout:8/417: creat d1/d7/f92 x:0 0 0
2026-03-10T06:22:20.541 INFO:tasks.workunit.client.1.vm06.stdout:8/418: chown d1/d7 104463541 1
2026-03-10T06:22:20.543 INFO:tasks.workunit.client.1.vm06.stdout:4/458: dwrite dd/d33/f37 [0,4194304] 0
2026-03-10T06:22:20.543 INFO:tasks.workunit.client.1.vm06.stdout:8/419: rmdir d1/df/d20/d21/d5e/d79 39
2026-03-10T06:22:20.556 INFO:tasks.workunit.client.1.vm06.stdout:8/420: sync
2026-03-10T06:22:20.574 INFO:tasks.workunit.client.1.vm06.stdout:8/421: dread d1/d7/f4f [0,4194304] 0
2026-03-10T06:22:20.575 INFO:tasks.workunit.client.1.vm06.stdout:8/422: symlink d1/df/d20/d21/d7e/d8d/l93 0
2026-03-10T06:22:20.579 INFO:tasks.workunit.client.1.vm06.stdout:8/423: dwrite d1/df/d20/d35/f42 [0,4194304] 0
2026-03-10T06:22:20.588 INFO:tasks.workunit.client.1.vm06.stdout:8/424: rmdir d1/df/d20 39
2026-03-10T06:22:20.588 INFO:tasks.workunit.client.1.vm06.stdout:8/425: mknod d1/df/d20/d21/d7e/d8d/c94 0
2026-03-10T06:22:20.588 INFO:tasks.workunit.client.1.vm06.stdout:8/426: link d1/df/d11/f45 d1/df/d20/d21/d7e/d8d/f95 0
2026-03-10T06:22:20.596 INFO:tasks.workunit.client.1.vm06.stdout:8/427: readlink d1/df/d20/d21/l2d 0
2026-03-10T06:22:20.596 INFO:tasks.workunit.client.1.vm06.stdout:8/428: chown d1/df/d11/c22 54191954 1
2026-03-10T06:22:20.599 INFO:tasks.workunit.client.1.vm06.stdout:8/429: dwrite d1/df/d11/f1d [0,4194304] 0
2026-03-10T06:22:20.632 INFO:tasks.workunit.client.1.vm06.stdout:0/471: rename d0/dd/d14/d18/d92 to d0/d3c/d42/d99 0
2026-03-10T06:22:20.635 INFO:tasks.workunit.client.1.vm06.stdout:0/472: readlink d0/dd/d14/d18/d66/l79 0
2026-03-10T06:22:20.637 INFO:tasks.workunit.client.1.vm06.stdout:7/516: rename d19/d3b/d5b/d9a to d19/d3b/d41/da9 0
2026-03-10T06:22:20.640 INFO:tasks.workunit.client.1.vm06.stdout:0/473: creat d0/d3c/d42/d88/d98/f9a x:0 0 0
2026-03-10T06:22:20.641 INFO:tasks.workunit.client.1.vm06.stdout:7/517: mkdir d19/d3b/d41/da9/daa 0
2026-03-10T06:22:20.646 INFO:tasks.workunit.client.1.vm06.stdout:0/474: symlink d0/l9b 0
2026-03-10T06:22:20.647 INFO:tasks.workunit.client.1.vm06.stdout:7/518: stat d19/d3b/d41/d42/d62/c6a 0
2026-03-10T06:22:20.647 INFO:tasks.workunit.client.1.vm06.stdout:3/463: rename d6/dc/d13/d35/d6b to d6/dc/d13/d9d 0
2026-03-10T06:22:20.651 INFO:tasks.workunit.client.1.vm06.stdout:3/464: chown d6/dc/d13/d35/f4e 236588520 1
2026-03-10T06:22:20.652 INFO:tasks.workunit.client.1.vm06.stdout:7/519: truncate d19/d3b/f7b 1611733 0
2026-03-10T06:22:20.654 INFO:tasks.workunit.client.1.vm06.stdout:7/520: fsync d19/d3b/d41/d42/d52/fa0 0
2026-03-10T06:22:20.659 INFO:tasks.workunit.client.1.vm06.stdout:5/389: rename d8/l10 to d8/db/d54/d67/d22/d74/l7d 0
2026-03-10T06:22:20.659 INFO:tasks.workunit.client.1.vm06.stdout:3/465: creat d6/d21/d38/d39/d90/f9e x:0 0 0
2026-03-10T06:22:20.661 INFO:tasks.workunit.client.1.vm06.stdout:3/466: dread - d6/dc/d13/d51/f8e zero size
2026-03-10T06:22:20.663 INFO:tasks.workunit.client.1.vm06.stdout:7/521: dwrite d19/d3b/d5b/fa4 [0,4194304] 0
2026-03-10T06:22:20.671 INFO:tasks.workunit.client.1.vm06.stdout:7/522: mknod d19/d3b/d5b/cab 0
2026-03-10T06:22:20.676 INFO:tasks.workunit.client.1.vm06.stdout:2/448: rename da/d13/d1c/d1d/c22 to da/c87 0
2026-03-10T06:22:20.681 INFO:tasks.workunit.client.1.vm06.stdout:3/467: rename d6/dc/d13/f5e to d6/dc/d13/f9f 0
2026-03-10T06:22:20.681 INFO:tasks.workunit.client.1.vm06.stdout:3/468: rmdir d6/dc/d41 39
2026-03-10T06:22:20.681 INFO:tasks.workunit.client.1.vm06.stdout:3/469: chown d6/dc/d41/d6d 2578 1
2026-03-10T06:22:20.681 INFO:tasks.workunit.client.1.vm06.stdout:3/470: dread d6/d21/d38/d39/f4c [0,4194304] 0
2026-03-10T06:22:20.681 INFO:tasks.workunit.client.1.vm06.stdout:3/471: rmdir d6/dc/d13/d51 39
2026-03-10T06:22:20.682 INFO:tasks.workunit.client.1.vm06.stdout:2/449: dwrite da/d13/d1c/d43/d6e/f77 [0,4194304] 0
2026-03-10T06:22:20.686 INFO:tasks.workunit.client.1.vm06.stdout:7/523: sync
2026-03-10T06:22:20.690 INFO:tasks.workunit.client.1.vm06.stdout:3/472: link d6/dc/d13/d9d/l8f d6/dc/d13/d35/la0 0
2026-03-10T06:22:20.691 INFO:tasks.workunit.client.1.vm06.stdout:3/473: write d6/d21/f7b [4775037,47346] 0
2026-03-10T06:22:20.694 INFO:tasks.workunit.client.1.vm06.stdout:7/524: rename d19/d3b/f53 to d19/d3b/d41/d42/d62/d80/da1/fac 0
2026-03-10T06:22:20.694 INFO:tasks.workunit.client.1.vm06.stdout:7/525: write f4 [2869912,63115] 0
2026-03-10T06:22:20.696 INFO:tasks.workunit.client.1.vm06.stdout:7/526: sync
2026-03-10T06:22:20.701 INFO:tasks.workunit.client.1.vm06.stdout:2/450: mknod da/d13/d1c/d1d/d44/c88 0
2026-03-10T06:22:20.701 INFO:tasks.workunit.client.1.vm06.stdout:3/474: mkdir d6/d8/d7f/da1 0
2026-03-10T06:22:20.702 INFO:tasks.workunit.client.1.vm06.stdout:3/475: truncate d6/d21/f99 994090 0
2026-03-10T06:22:20.703 INFO:tasks.workunit.client.1.vm06.stdout:7/527: chown d19/c3d 0 1
2026-03-10T06:22:20.704 INFO:tasks.workunit.client.1.vm06.stdout:2/451: write da/d13/d1c/d1d/f55 [1899168,36991] 0
2026-03-10T06:22:20.705 INFO:tasks.workunit.client.1.vm06.stdout:2/452: truncate da/d13/d1c/d1d/d44/d53/f78 823238 0
2026-03-10T06:22:20.711 INFO:tasks.workunit.client.1.vm06.stdout:7/528: creat d19/d3b/d41/d42/d52/d9f/fad x:0 0 0
2026-03-10T06:22:20.711 INFO:tasks.workunit.client.1.vm06.stdout:2/453: creat da/d13/d1c/d1d/d44/d53/d61/f89 x:0 0 0
2026-03-10T06:22:20.715 INFO:tasks.workunit.client.1.vm06.stdout:7/529: unlink d19/d3b/d41/d42/c44 0
2026-03-10T06:22:20.716 INFO:tasks.workunit.client.1.vm06.stdout:2/454: chown da/d13/d1c/d1d/d44/d53/d61/d68/c69 8033 1
2026-03-10T06:22:20.717 INFO:tasks.workunit.client.1.vm06.stdout:7/530: write d19/d3b/d41/d4c/f6e [831343,17977] 0
2026-03-10T06:22:20.718 INFO:tasks.workunit.client.1.vm06.stdout:7/531: dread - d19/d3b/d41/d72/d97/fa3 zero size
2026-03-10T06:22:20.723 INFO:tasks.workunit.client.1.vm06.stdout:3/476: rename d6/dc/d13/d35/la0 to d6/d21/la2 0
2026-03-10T06:22:20.726 INFO:tasks.workunit.client.1.vm06.stdout:2/455: dwrite da/d13/d1c/d1d/d44/d48/d56/f85 [0,4194304] 0
2026-03-10T06:22:20.731 INFO:tasks.workunit.client.1.vm06.stdout:7/532: creat d19/d3b/d41/d42/d62/d80/d82/fae x:0 0 0
2026-03-10T06:22:20.736 INFO:tasks.workunit.client.1.vm06.stdout:3/477: creat d6/d21/d38/d88/fa3 x:0 0 0
2026-03-10T06:22:20.738 INFO:tasks.workunit.client.1.vm06.stdout:3/478: fdatasync d6/f84 0
2026-03-10T06:22:20.740 INFO:tasks.workunit.client.1.vm06.stdout:3/479: dwrite d6/dc/d41/d6d/f70 [0,4194304] 0
2026-03-10T06:22:20.744 INFO:tasks.workunit.client.1.vm06.stdout:9/511: truncate d21/d32/d4d/d51/f87 350132 0
2026-03-10T06:22:20.748 INFO:tasks.workunit.client.1.vm06.stdout:6/538: mknod d6/d7/d37/cb8 0
2026-03-10T06:22:20.750 INFO:tasks.workunit.client.1.vm06.stdout:7/533: symlink d19/d3b/d41/d42/d52/laf 0
2026-03-10T06:22:20.753 INFO:tasks.workunit.client.1.vm06.stdout:2/456: symlink da/d13/d1c/d1d/l8a 0
2026-03-10T06:22:20.754 INFO:tasks.workunit.client.1.vm06.stdout:3/480: chown d6/dc/d41/d6d/l92 528676151 1
2026-03-10T06:22:20.757 INFO:tasks.workunit.client.1.vm06.stdout:9/512: chown d21/d27/c48 5 1
2026-03-10T06:22:20.757 INFO:tasks.workunit.client.1.vm06.stdout:9/513: chown d21/d27/c5e 197315972 1
2026-03-10T06:22:20.757 INFO:tasks.workunit.client.1.vm06.stdout:6/539: mkdir d6/dd/d25/d33/d5a/d78/db9 0
2026-03-10T06:22:20.761 INFO:tasks.workunit.client.1.vm06.stdout:1/497: dread d9/d1b/d20/f25 [0,4194304] 0
2026-03-10T06:22:20.762 INFO:tasks.workunit.client.1.vm06.stdout:9/514: creat d21/d32/d4d/d51/d67/fb6 x:0 0 0
2026-03-10T06:22:20.763 INFO:tasks.workunit.client.1.vm06.stdout:6/540: mknod d6/dd/cba 0
2026-03-10T06:22:20.763 INFO:tasks.workunit.client.1.vm06.stdout:1/498: creat d9/d35/d46/d38/d63/f84 x:0 0 0
2026-03-10T06:22:20.763 INFO:tasks.workunit.client.1.vm06.stdout:7/534: truncate d19/d3b/f3c 579663 0
2026-03-10T06:22:20.766 INFO:tasks.workunit.client.1.vm06.stdout:1/499: creat d9/d1b/d20/d44/f85 x:0 0 0
2026-03-10T06:22:20.766 INFO:tasks.workunit.client.1.vm06.stdout:7/535: read - d19/d3b/d41/d42/f91 zero size
2026-03-10T06:22:20.766 INFO:tasks.workunit.client.1.vm06.stdout:1/500: stat d9/d35/d46/d38/d63/c78 0
2026-03-10T06:22:20.768 INFO:tasks.workunit.client.1.vm06.stdout:6/541: unlink d6/dd/d25/d33/d4d/fa5 0
2026-03-10T06:22:20.768 INFO:tasks.workunit.client.1.vm06.stdout:3/481: creat d6/d21/fa4 x:0 0 0
2026-03-10T06:22:20.768 INFO:tasks.workunit.client.1.vm06.stdout:9/515: sync
2026-03-10T06:22:20.768 INFO:tasks.workunit.client.1.vm06.stdout:3/482: readlink d6/d21/l9b 0
2026-03-10T06:22:20.771 INFO:tasks.workunit.client.1.vm06.stdout:7/536: mkdir d19/db0 0
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:20.774 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:20 vm06.local ceph-mon[58974]: pgmap v9: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 37 MiB/s rd, 127 MiB/s wr, 261 op/s
2026-03-10T06:22:20.775 INFO:tasks.workunit.client.1.vm06.stdout:6/542: dwrite d6/d7/f87 [0,4194304] 0
2026-03-10T06:22:20.778 INFO:tasks.workunit.client.1.vm06.stdout:1/501: rename d9/d35/f56 to d9/f86 0
2026-03-10T06:22:20.786 INFO:tasks.workunit.client.1.vm06.stdout:7/537: dwrite d19/d3b/f47 [4194304,4194304] 0
2026-03-10T06:22:20.786 INFO:tasks.workunit.client.1.vm06.stdout:7/538: write f15 [7066127,112221] 0
2026-03-10T06:22:20.792 INFO:tasks.workunit.client.1.vm06.stdout:6/543: rename d6/df/d40/ca9 to d6/d79/d95/cbb 0
2026-03-10T06:22:20.799 INFO:tasks.workunit.client.1.vm06.stdout:7/539: rename d19/d3b/d41/d72/d97/l98 to d19/d3b/d41/da9/daa/lb1 0
2026-03-10T06:22:20.799 INFO:tasks.workunit.client.1.vm06.stdout:1/502: link d9/df/f14 d9/d62/f87 0
2026-03-10T06:22:20.800 INFO:tasks.workunit.client.1.vm06.stdout:1/503: stat d9/f2f 0
2026-03-10T06:22:20.800 INFO:tasks.workunit.client.1.vm06.stdout:1/504: stat d9/df/c13 0
2026-03-10T06:22:20.801 INFO:tasks.workunit.client.1.vm06.stdout:1/505: readlink d9/d1b/l67 0
2026-03-10T06:22:20.802 INFO:tasks.workunit.client.1.vm06.stdout:1/506: readlink d9/d1b/d20/l7d 0
2026-03-10T06:22:20.803 INFO:tasks.workunit.client.1.vm06.stdout:7/540: creat d19/d3b/d41/fb2 x:0 0 0
2026-03-10T06:22:20.804 INFO:tasks.workunit.client.1.vm06.stdout:7/541: write d19/d3b/d41/d72/d97/fa3 [679955,50380] 0
2026-03-10T06:22:20.809 INFO:tasks.workunit.client.1.vm06.stdout:1/507: unlink d9/d1b/d20/d44/l4b 0
2026-03-10T06:22:20.809 INFO:tasks.workunit.client.1.vm06.stdout:7/542: dread f15 [0,4194304] 0
2026-03-10T06:22:20.810 INFO:tasks.workunit.client.1.vm06.stdout:9/516: dread d21/d32/d4d/f6b [0,4194304] 0
2026-03-10T06:22:20.811 INFO:tasks.workunit.client.1.vm06.stdout:4/459: write dd/d41/f52 [258697,111659] 0
2026-03-10T06:22:20.813 INFO:tasks.workunit.client.1.vm06.stdout:4/460: fsync dd/d24/d2d/f3b 0
2026-03-10T06:22:20.813 INFO:tasks.workunit.client.1.vm06.stdout:9/517: chown d21/d27/d3a/l85 635 1
2026-03-10T06:22:20.813 INFO:tasks.workunit.client.1.vm06.stdout:7/543: creat d19/d3b/d41/d72/d97/fb3 x:0 0 0
2026-03-10T06:22:20.815 INFO:tasks.workunit.client.1.vm06.stdout:4/461: rename dd/d24/d5e/f6e to dd/d33/d36/f8d 0
2026-03-10T06:22:20.816 INFO:tasks.workunit.client.1.vm06.stdout:4/462: mkdir dd/d18/d8e 0
2026-03-10T06:22:20.818 INFO:tasks.workunit.client.1.vm06.stdout:4/463: write dd/d33/f53 [2116380,26819] 0
2026-03-10T06:22:20.818 INFO:tasks.workunit.client.1.vm06.stdout:4/464: write dd/d41/f52 [2861492,66520] 0
2026-03-10T06:22:20.818 INFO:tasks.workunit.client.1.vm06.stdout:7/544: dwrite d19/d3b/d41/f54 [0,4194304] 0
2026-03-10T06:22:20.827 INFO:tasks.workunit.client.1.vm06.stdout:7/545: write d19/d3b/d5b/f81 [4629332,89097] 0
2026-03-10T06:22:20.827 INFO:tasks.workunit.client.1.vm06.stdout:4/465: creat dd/d24/f8f x:0 0 0
2026-03-10T06:22:20.832 INFO:tasks.workunit.client.1.vm06.stdout:7/546: mknod d19/d3b/d41/d42/d52/d83/cb4 0
2026-03-10T06:22:20.832 INFO:tasks.workunit.client.1.vm06.stdout:7/547: read f15 [2728144,59866] 0
2026-03-10T06:22:20.835 INFO:tasks.workunit.client.1.vm06.stdout:7/548: write d19/d3b/d41/d42/d52/d83/f9b [957352,96178] 0
2026-03-10T06:22:20.835 INFO:tasks.workunit.client.1.vm06.stdout:7/549: chown d19/d3b/f68 60722 1
2026-03-10T06:22:20.836 INFO:tasks.workunit.client.1.vm06.stdout:7/550: fdatasync f13 0
2026-03-10T06:22:20.837 INFO:tasks.workunit.client.1.vm06.stdout:7/551: truncate f10 3264246 0
2026-03-10T06:22:20.841 INFO:tasks.workunit.client.1.vm06.stdout:7/552: dwrite d19/d3b/d41/d42/d52/fa0 [0,4194304] 0
2026-03-10T06:22:20.848 INFO:tasks.workunit.client.1.vm06.stdout:7/553: creat d19/d3b/d41/d42/d62/fb5 x:0 0 0
2026-03-10T06:22:20.859 INFO:tasks.workunit.client.1.vm06.stdout:9/518: dread f14 [4194304,4194304] 0
2026-03-10T06:22:20.860 INFO:tasks.workunit.client.1.vm06.stdout:9/519: write f1b [6742229,113677] 0
2026-03-10T06:22:20.862 INFO:tasks.workunit.client.1.vm06.stdout:9/520: unlink d21/c5b 0
2026-03-10T06:22:20.863 INFO:tasks.workunit.client.1.vm06.stdout:9/521: chown d21/d27/d50/cac 97422 1
2026-03-10T06:22:20.865 INFO:tasks.workunit.client.1.vm06.stdout:0/475: dread d0/dd/d1b/f4e [0,4194304] 0
2026-03-10T06:22:20.865 INFO:tasks.workunit.client.1.vm06.stdout:9/522: creat d21/d27/d50/d57/fb7 x:0 0 0
2026-03-10T06:22:20.866 INFO:tasks.workunit.client.1.vm06.stdout:0/476: fsync d0/f5 0
2026-03-10T06:22:20.871 INFO:tasks.workunit.client.1.vm06.stdout:9/523: truncate d21/f3e 6800720 0
2026-03-10T06:22:20.872 INFO:tasks.workunit.client.1.vm06.stdout:0/477: creat d0/d3c/d42/d99/f9c x:0 0 0
2026-03-10T06:22:20.873 INFO:tasks.workunit.client.1.vm06.stdout:0/478: write d0/d3c/d42/f60 [1866792,113312] 0
2026-03-10T06:22:20.915 INFO:tasks.workunit.client.1.vm06.stdout:7/554: fsync f15 0
2026-03-10T06:22:20.921 INFO:tasks.workunit.client.1.vm06.stdout:7/555: dwrite d19/d3b/d5b/f69 [0,4194304] 0
2026-03-10T06:22:20.927 INFO:tasks.workunit.client.1.vm06.stdout:8/430: dwrite d1/f4 [0,4194304] 0
2026-03-10T06:22:20.933 INFO:tasks.workunit.client.1.vm06.stdout:8/431: symlink d1/d2c/l96 0
2026-03-10T06:22:20.934 INFO:tasks.workunit.client.1.vm06.stdout:8/432: chown d1/df/d20/d21/c52 261972 1
2026-03-10T06:22:20.935 INFO:tasks.workunit.client.1.vm06.stdout:8/433: write d1/df/d20/d21/f37 [4817659,86876] 0
2026-03-10T06:22:20.938 INFO:tasks.workunit.client.1.vm06.stdout:5/390: mkdir d8/db/d7e 0
2026-03-10T06:22:20.938 INFO:tasks.workunit.client.1.vm06.stdout:8/434: write d1/df/f71 [4201225,53725] 0
2026-03-10T06:22:20.939 INFO:tasks.workunit.client.1.vm06.stdout:5/391: dread - d8/db/d54/d67/d46/f77 zero size
2026-03-10T06:22:20.941 INFO:tasks.workunit.client.1.vm06.stdout:5/392: write d8/db/d54/d67/d22/d74/f5a [575440,65820] 0
2026-03-10T06:22:20.944 INFO:tasks.workunit.client.1.vm06.stdout:5/393: chown d8/d9/c2c 790811685 1
2026-03-10T06:22:20.950 INFO:tasks.workunit.client.1.vm06.stdout:5/394: rename d8/db/d54/d67/d22/d39/f63 to d8/db/d54/d67/d22/d39/d72/f7f 0
2026-03-10T06:22:20.951 INFO:tasks.workunit.client.1.vm06.stdout:5/395: truncate d8/db/d57/f75 142371 0
2026-03-10T06:22:20.952 INFO:tasks.workunit.client.1.vm06.stdout:5/396: rmdir d8/db/d54/d67 39
2026-03-10T06:22:20.955 INFO:tasks.workunit.client.1.vm06.stdout:5/397: unlink d8/d9/c3c 0
2026-03-10T06:22:20.962 INFO:tasks.workunit.client.1.vm06.stdout:5/398: dwrite d8/db/d54/d67/d22/d39/f41 [0,4194304] 0
2026-03-10T06:22:20.968 INFO:tasks.workunit.client.1.vm06.stdout:5/399: mkdir d8/db/d54/d55/d80 0
2026-03-10T06:22:20.972 INFO:tasks.workunit.client.1.vm06.stdout:5/400: unlink d8/db/d54/d67/d46/f64 0
2026-03-10T06:22:20.973 INFO:tasks.workunit.client.1.vm06.stdout:5/401: creat d8/db/d54/d67/d46/d68/f81 x:0 0 0
2026-03-10T06:22:20.974 INFO:tasks.workunit.client.1.vm06.stdout:5/402: readlink d8/db/d54/d67/d22/l56 0
2026-03-10T06:22:20.976 INFO:tasks.workunit.client.1.vm06.stdout:5/403: write d8/db/d54/d67/d22/d39/f51 [436528,85672] 0
2026-03-10T06:22:21.002 INFO:tasks.workunit.client.1.vm06.stdout:2/457: truncate da/d13/f1f 3459696 0
2026-03-10T06:22:21.003 INFO:tasks.workunit.client.1.vm06.stdout:2/458: mknod da/d13/d5e/c8b 0
2026-03-10T06:22:21.005 INFO:tasks.workunit.client.1.vm06.stdout:2/459: read - da/d13/d1a/d39/f3c zero size
2026-03-10T06:22:21.007 INFO:tasks.workunit.client.1.vm06.stdout:2/460: mknod da/d13/d5e/c8c 0
2026-03-10T06:22:21.009 INFO:tasks.workunit.client.1.vm06.stdout:3/483: truncate d6/f53 5264 0
2026-03-10T06:22:21.010 INFO:tasks.workunit.client.1.vm06.stdout:2/461: mkdir da/d13/d8d 0
2026-03-10T06:22:21.010 INFO:tasks.workunit.client.1.vm06.stdout:2/462: stat da 0
2026-03-10T06:22:21.016 INFO:tasks.workunit.client.1.vm06.stdout:3/484: getdents d6/d1a/d5b 0
2026-03-10T06:22:21.018 INFO:tasks.workunit.client.1.vm06.stdout:2/463: dwrite da/f11 [0,4194304] 0
2026-03-10T06:22:21.022 INFO:tasks.workunit.client.1.vm06.stdout:2/464: write da/d13/d1c/d1d/d44/d48/d56/f58 [4622130,26384] 0
2026-03-10T06:22:21.022 INFO:tasks.workunit.client.1.vm06.stdout:3/485: dread - d6/d8/f62 zero size
2026-03-10T06:22:21.023 INFO:tasks.workunit.client.1.vm06.stdout:2/465: fsync da/d13/d1c/d43/f7a 0
2026-03-10T06:22:21.023 INFO:tasks.workunit.client.1.vm06.stdout:3/486: write d6/dc/d41/f82 [5222006,47365] 0
2026-03-10T06:22:21.024 INFO:tasks.workunit.client.1.vm06.stdout:3/487: write d6/d8/f52 [236360,116639] 0
2026-03-10T06:22:21.024 INFO:tasks.workunit.client.1.vm06.stdout:3/488: dread - d6/dc/d13/f8d zero size
2026-03-10T06:22:21.025 INFO:tasks.workunit.client.1.vm06.stdout:2/466: chown da/d13/d1a/f3a 231419798 1
2026-03-10T06:22:21.027 INFO:tasks.workunit.client.1.vm06.stdout:3/489: creat d6/d21/d38/d39/d90/fa5 x:0 0 0
2026-03-10T06:22:21.028 INFO:tasks.workunit.client.1.vm06.stdout:2/467: mknod da/d13/d1c/d43/c8e 0
2026-03-10T06:22:21.030 INFO:tasks.workunit.client.1.vm06.stdout:1/508: getdents d9/d1b/d20/d44 0
2026-03-10T06:22:21.039 INFO:tasks.workunit.client.1.vm06.stdout:6/544: dwrite d6/dd/d25/d2c/f9c [0,4194304] 0
2026-03-10T06:22:21.039 INFO:tasks.workunit.client.1.vm06.stdout:6/545: write d6/f62 [157605,84576] 0
2026-03-10T06:22:21.039 INFO:tasks.workunit.client.1.vm06.stdout:2/468: creat da/d13/d5e/f8f x:0 0 0
2026-03-10T06:22:21.039 INFO:tasks.workunit.client.1.vm06.stdout:4/466: truncate dd/f12 440716 0
2026-03-10T06:22:21.039 INFO:tasks.workunit.client.1.vm06.stdout:2/469: stat da/d13/d5e/c8b 0
2026-03-10T06:22:21.051 INFO:tasks.workunit.client.1.vm06.stdout:2/470: chown da/d13/d1c/f76 1 1
2026-03-10T06:22:21.051 INFO:tasks.workunit.client.1.vm06.stdout:2/471: dread - da/d13/d1a/d39/f3c zero size
2026-03-10T06:22:21.053 INFO:tasks.workunit.client.1.vm06.stdout:6/546: rmdir d6/dd/d2b/db2 0
2026-03-10T06:22:21.054 INFO:tasks.workunit.client.1.vm06.stdout:3/490: dread d6/d21/f30 [0,4194304] 0
2026-03-10T06:22:21.055 INFO:tasks.workunit.client.1.vm06.stdout:2/472: chown da/d13/d1c/d1d/d44/d53/l7b 84679695 1
2026-03-10T06:22:21.055 INFO:tasks.workunit.client.1.vm06.stdout:6/547: rmdir d6/d7 39
2026-03-10T06:22:21.059 INFO:tasks.workunit.client.1.vm06.stdout:4/467: getdents dd/d24/d5d 0
2026-03-10T06:22:21.061 INFO:tasks.workunit.client.1.vm06.stdout:6/548: chown d6/dd/d25/d2c/c41 591840388 1
2026-03-10T06:22:21.062 INFO:tasks.workunit.client.1.vm06.stdout:1/509: dread d9/df/f4d [0,4194304] 0
2026-03-10T06:22:21.065 INFO:tasks.workunit.client.1.vm06.stdout:6/549: mknod d6/dd/d2b/cbc 0
2026-03-10T06:22:21.076 INFO:tasks.workunit.client.1.vm06.stdout:0/479: dwrite d0/d3c/d42/d88/d35/f51 [0,4194304] 0
2026-03-10T06:22:21.076 INFO:tasks.workunit.client.1.vm06.stdout:9/524: dwrite fd [0,4194304] 0
2026-03-10T06:22:21.077 INFO:tasks.workunit.client.1.vm06.stdout:9/525: chown d21/d27/d50/d57/db2/d80 7687 1
2026-03-10T06:22:21.084 INFO:tasks.workunit.client.1.vm06.stdout:0/480: unlink d0/dd/d14/f65 0
2026-03-10T06:22:21.085 INFO:tasks.workunit.client.1.vm06.stdout:0/481: chown d0/d3c/d42/d88/d35/d74/l77 1 1
2026-03-10T06:22:21.085 INFO:tasks.workunit.client.1.vm06.stdout:9/526: mknod d21/d32/d4d/d51/cb8 0
2026-03-10T06:22:21.088 INFO:tasks.workunit.client.1.vm06.stdout:4/468: sync
2026-03-10T06:22:21.089 INFO:tasks.workunit.client.1.vm06.stdout:9/527: dwrite d21/d27/d50/d57/fb7 [0,4194304] 0
2026-03-10T06:22:21.090 INFO:tasks.workunit.client.1.vm06.stdout:9/528: fsync d21/f2a 0
2026-03-10T06:22:21.090 INFO:tasks.workunit.client.1.vm06.stdout:4/469: read - dd/d24/d2d/f5a zero size
2026-03-10T06:22:21.096 INFO:tasks.workunit.client.1.vm06.stdout:3/491: getdents d6/d21/d38 0
2026-03-10T06:22:21.102 INFO:tasks.workunit.client.1.vm06.stdout:9/529: rename d21/d27/f39 to d21/d46/fb9 0
2026-03-10T06:22:21.103 INFO:tasks.workunit.client.1.vm06.stdout:9/530: fsync d21/d27/d50/d57/fa9 0
2026-03-10T06:22:21.103 INFO:tasks.workunit.client.1.vm06.stdout:9/531: readlink d21/d27/l30 0
2026-03-10T06:22:21.104 INFO:tasks.workunit.client.1.vm06.stdout:9/532: dread - d21/d27/d50/d57/db2/d80/f86 zero size
2026-03-10T06:22:21.104 INFO:tasks.workunit.client.1.vm06.stdout:9/533: write d21/da2/da7/f96 [259563,6093] 0
2026-03-10T06:22:21.107 INFO:tasks.workunit.client.1.vm06.stdout:9/534: dwrite d21/d32/d4d/f9d [0,4194304] 0
2026-03-10T06:22:21.115 INFO:tasks.workunit.client.1.vm06.stdout:4/470: creat dd/d24/d2d/d2f/d39/d71/f90 x:0 0 0
2026-03-10T06:22:21.115 INFO:tasks.workunit.client.1.vm06.stdout:0/482: truncate d0/dd/d14/d1d/f53 470366 0
2026-03-10T06:22:21.115 INFO:tasks.workunit.client.1.vm06.stdout:9/535: write d21/d32/d4d/f64 [1962029,16181] 0
2026-03-10T06:22:21.115 INFO:tasks.workunit.client.1.vm06.stdout:3/492: rename d6/d8/c46 to d6/dc/d13/d9d/d54/ca6 0
2026-03-10T06:22:21.115 INFO:tasks.workunit.client.1.vm06.stdout:4/471: creat dd/d18/d75/f91 x:0 0 0
2026-03-10T06:22:21.116 INFO:tasks.workunit.client.1.vm06.stdout:9/536: dwrite d21/f33 [0,4194304] 0
2026-03-10T06:22:21.123 INFO:tasks.workunit.client.1.vm06.stdout:3/493: mknod d6/d4f/ca7 0
2026-03-10T06:22:21.123 INFO:tasks.workunit.client.1.vm06.stdout:4/472: symlink dd/d24/d5d/l92 0
2026-03-10T06:22:21.124 INFO:tasks.workunit.client.1.vm06.stdout:4/473: dread - dd/d24/d2d/f5a zero size
2026-03-10T06:22:21.124 INFO:tasks.workunit.client.1.vm06.stdout:3/494: chown d6/d21/f31 4 1
2026-03-10T06:22:21.125 INFO:tasks.workunit.client.1.vm06.stdout:0/483: mknod d0/dd/d1b/d3d/d50/c9d 0
2026-03-10T06:22:21.126 INFO:tasks.workunit.client.1.vm06.stdout:3/495: rmdir d6/dc/d72 39
2026-03-10T06:22:21.128 INFO:tasks.workunit.client.1.vm06.stdout:9/537: dwrite d21/d32/d6e/f2e [0,4194304] 0
2026-03-10T06:22:21.135 INFO:tasks.workunit.client.1.vm06.stdout:4/474: creat dd/d33/d47/f93 x:0 0 0
2026-03-10T06:22:21.136 INFO:tasks.workunit.client.1.vm06.stdout:0/484: mkdir d0/d3c/d42/d88/d9e 0
2026-03-10T06:22:21.137 INFO:tasks.workunit.client.1.vm06.stdout:4/475: creat dd/d24/d5d/f94 x:0 0 0
2026-03-10T06:22:21.140 INFO:tasks.workunit.client.1.vm06.stdout:0/485: unlink d0/dd/d14/d18/f30 0
2026-03-10T06:22:21.145 INFO:tasks.workunit.client.1.vm06.stdout:8/435: truncate d1/df/f71 2782024 0
2026-03-10T06:22:21.146 INFO:tasks.workunit.client.1.vm06.stdout:3/496: rename d6/dc/d13/d9d/f78 to d6/fa8 0
2026-03-10T06:22:21.150 INFO:tasks.workunit.client.1.vm06.stdout:8/436: dwrite d1/df/d20/d21/d5e/f73 [0,4194304] 0
2026-03-10T06:22:21.153 INFO:tasks.workunit.client.1.vm06.stdout:0/486: symlink d0/d3c/d42/d88/d35/d74/l9f 0
2026-03-10T06:22:21.155 INFO:tasks.workunit.client.1.vm06.stdout:5/404: write d8/db/d54/d67/d46/f77 [161001,71037] 0
2026-03-10T06:22:21.155 INFO:tasks.workunit.client.1.vm06.stdout:7/556: dread d19/f30 [0,4194304] 0
2026-03-10T06:22:21.155 INFO:tasks.workunit.client.1.vm06.stdout:0/487: write d0/d3c/d42/f54 [1835708,11023] 0
2026-03-10T06:22:21.159 INFO:tasks.workunit.client.1.vm06.stdout:0/488: readlink d0/d3c/d42/d88/d47/d4d/l87 0
2026-03-10T06:22:21.160 INFO:tasks.workunit.client.1.vm06.stdout:7/557: chown d19/d3b/d41/d4c/f6e 2499066 1
2026-03-10T06:22:21.160 INFO:tasks.workunit.client.1.vm06.stdout:5/405: mknod d8/db/d54/d67/d46/c82 0
2026-03-10T06:22:21.162 INFO:tasks.workunit.client.1.vm06.stdout:4/476: getdents dd/d33 0
2026-03-10T06:22:21.165 INFO:tasks.workunit.client.1.vm06.stdout:5/406: truncate d8/d9/f47 17958 0
2026-03-10T06:22:21.165 INFO:tasks.workunit.client.1.vm06.stdout:8/437: write d1/df/d58/f6e [87799,96048] 0
2026-03-10T06:22:21.168 INFO:tasks.workunit.client.1.vm06.stdout:0/489: sync
2026-03-10T06:22:21.169 INFO:tasks.workunit.client.1.vm06.stdout:4/477: dwrite dd/d33/d47/f93 [0,4194304] 0
2026-03-10T06:22:21.171 INFO:tasks.workunit.client.1.vm06.stdout:4/478: fsync dd/d24/d5e/f67 0
2026-03-10T06:22:21.173 INFO:tasks.workunit.client.1.vm06.stdout:5/407: mkdir d8/db/d57/d83 0
2026-03-10T06:22:21.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:22:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:22:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:20 vm04.local ceph-mon[51058]: pgmap v9: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 37 MiB/s rd, 127 MiB/s wr, 261 op/s
2026-03-10T06:22:21.179 INFO:tasks.workunit.client.1.vm06.stdout:4/479: dwrite f8 [12582912,4194304] 0
2026-03-10T06:22:21.180 INFO:tasks.workunit.client.1.vm06.stdout:9/538: fdatasync f9 0
2026-03-10T06:22:21.180 INFO:tasks.workunit.client.1.vm06.stdout:8/438: symlink d1/d3b/l97 0
2026-03-10T06:22:21.184 INFO:tasks.workunit.client.1.vm06.stdout:5/408: mknod d8/db/d54/d67/d46/d6e/c84 0
2026-03-10T06:22:21.184 INFO:tasks.workunit.client.1.vm06.stdout:7/558: link d19/d3b/l5e d19/d3b/d41/d42/d62/d80/da1/lb6 0
2026-03-10T06:22:21.185 INFO:tasks.workunit.client.1.vm06.stdout:4/480: dwrite dd/f80 [0,4194304] 0
2026-03-10T06:22:21.188 INFO:tasks.workunit.client.1.vm06.stdout:7/559: fsync d19/d3b/d41/d72/d97/fb3 0
2026-03-10T06:22:21.189 INFO:tasks.workunit.client.1.vm06.stdout:8/439: truncate d1/df/d11/f74 4220626 0
2026-03-10T06:22:21.189 INFO:tasks.workunit.client.1.vm06.stdout:8/440: dread - d1/df/d20/d21/d5e/f65 zero size
2026-03-10T06:22:21.189 INFO:tasks.workunit.client.1.vm06.stdout:0/490: creat d0/d3c/d42/d88/fa0 x:0 0 0
2026-03-10T06:22:21.195 INFO:tasks.workunit.client.1.vm06.stdout:0/491: write d0/dd/f49 [670227,10502] 0
2026-03-10T06:22:21.196 INFO:tasks.workunit.client.1.vm06.stdout:3/497: read d6/d21/d38/f50 [239318,82572] 0
2026-03-10T06:22:21.196 INFO:tasks.workunit.client.1.vm06.stdout:4/481: creat dd/d41/f95 x:0 0 0
2026-03-10T06:22:21.205 INFO:tasks.workunit.client.1.vm06.stdout:7/560: link d19/d3b/d41/d4c/f85 d19/fb7 0
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:4/482: chown dd/d18/l54 376 1
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:8/441: dwrite d1/df/f77 [0,4194304] 0
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:8/442: read - d1/df/d20/f51 zero size
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:4/483: write dd/d24/d2d/d2f/d39/d71/f90 [638928,61517] 0
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:7/561: mknod d19/d3b/d5b/cb8 0
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:0/492: dwrite d0/dd/f24 [0,4194304] 0
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:8/443: dread - d1/f3a zero size
2026-03-10T06:22:21.218 INFO:tasks.workunit.client.1.vm06.stdout:4/484: write dd/d24/d2d/d2f/d39/d71/f90 [1499276,53338] 0
2026-03-10T06:22:21.222 INFO:tasks.workunit.client.1.vm06.stdout:7/562: dread f15 [0,4194304] 0
2026-03-10T06:22:21.226 INFO:tasks.workunit.client.1.vm06.stdout:8/444: rename d1/df/d20/f19 to d1/d3b/f98 0
2026-03-10T06:22:21.226 INFO:tasks.workunit.client.1.vm06.stdout:4/485: write dd/d33/d36/f8d [941838,44305] 0
2026-03-10T06:22:21.227 INFO:tasks.workunit.client.1.vm06.stdout:8/445: stat d1/df/d20/d35/l3f 0
2026-03-10T06:22:21.228 INFO:tasks.workunit.client.1.vm06.stdout:1/510: write d9/d1b/d20/f24 [3758438,23051] 0
2026-03-10T06:22:21.230 INFO:tasks.workunit.client.1.vm06.stdout:2/473: truncate da/d13/d1c/d1d/d44/d48/d56/f85 629765 0
2026-03-10T06:22:21.231 INFO:tasks.workunit.client.1.vm06.stdout:8/446: mkdir d1/d2c/d99 0
2026-03-10T06:22:21.233 INFO:tasks.workunit.client.1.vm06.stdout:8/447: readlink d1/l1f 0
2026-03-10T06:22:21.233 INFO:tasks.workunit.client.1.vm06.stdout:1/511: creat d9/d35/d46/f88 x:0 0 0
2026-03-10T06:22:21.234 INFO:tasks.workunit.client.1.vm06.stdout:5/409: dread d8/db/d54/d55/f60 [0,4194304] 0
2026-03-10T06:22:21.235 INFO:tasks.workunit.client.1.vm06.stdout:3/498: dread d6/f1b [0,4194304] 0
2026-03-10T06:22:21.235 INFO:tasks.workunit.client.1.vm06.stdout:4/486: getdents dd/d24 0
2026-03-10T06:22:21.235 INFO:tasks.workunit.client.1.vm06.stdout:1/512: write d9/d1b/d20/f24 [2856010,21857] 0
2026-03-10T06:22:21.236 INFO:tasks.workunit.client.1.vm06.stdout:8/448: symlink d1/df/l9a 0
2026-03-10T06:22:21.236 INFO:tasks.workunit.client.1.vm06.stdout:5/410: stat d8/db/d57/c65 0
2026-03-10T06:22:21.236 INFO:tasks.workunit.client.1.vm06.stdout:4/487: readlink dd/l85 0
2026-03-10T06:22:21.237 INFO:tasks.workunit.client.1.vm06.stdout:1/513: fdatasync d9/f1a 0
2026-03-10T06:22:21.237 INFO:tasks.workunit.client.1.vm06.stdout:6/550: dwrite d6/d7/f1a [0,4194304] 0
2026-03-10T06:22:21.240 INFO:tasks.workunit.client.1.vm06.stdout:8/449: chown d1/df/f71 38704357 1
2026-03-10T06:22:21.241 INFO:tasks.workunit.client.1.vm06.stdout:3/499: write d6/fa8 [462963,23278] 0
2026-03-10T06:22:21.242 INFO:tasks.workunit.client.1.vm06.stdout:6/551: fdatasync d6/dd/f5b 0
2026-03-10T06:22:21.247 INFO:tasks.workunit.client.1.vm06.stdout:8/450: write d1/df/d11/f81 [996571,25449] 0
2026-03-10T06:22:21.247 INFO:tasks.workunit.client.1.vm06.stdout:7/563: sync
2026-03-10T06:22:21.248 INFO:tasks.workunit.client.1.vm06.stdout:3/500: rmdir d6/dc/d41/d6d 39
2026-03-10T06:22:21.250 INFO:tasks.workunit.client.1.vm06.stdout:6/552: unlink d6/df/cb0 0
2026-03-10T06:22:21.250 INFO:tasks.workunit.client.1.vm06.stdout:7/564: readlink d19/l3e 0
2026-03-10T06:22:21.252 INFO:tasks.workunit.client.1.vm06.stdout:1/514: rename d9/df to d9/d35/d89 0
2026-03-10T06:22:21.252 INFO:tasks.workunit.client.1.vm06.stdout:3/501: dwrite d6/d21/f7b [0,4194304] 0
2026-03-10T06:22:21.261 INFO:tasks.workunit.client.1.vm06.stdout:7/565: sync
2026-03-10T06:22:21.262 INFO:tasks.workunit.client.1.vm06.stdout:8/451: mknod d1/df/d20/d21/d5e/d79/c9b 0
2026-03-10T06:22:21.266 INFO:tasks.workunit.client.1.vm06.stdout:1/515: dread d9/d62/f76 [0,4194304] 0
2026-03-10T06:22:21.269 INFO:tasks.workunit.client.1.vm06.stdout:8/452: link d1/df/d20/f43 d1/df/d20/d21/d7e/d8d/f9c 0
2026-03-10T06:22:21.273 INFO:tasks.workunit.client.1.vm06.stdout:3/502: link d6/f1b d6/dc/d13/d51/fa9 0
2026-03-10T06:22:21.273 INFO:tasks.workunit.client.1.vm06.stdout:3/503: fdatasync d6/dc/d13/f9f 0
2026-03-10T06:22:21.273 INFO:tasks.workunit.client.1.vm06.stdout:8/453: link d1/df/d20/d21/d7e/d8d/l8e d1/d2c/l9d 0
2026-03-10T06:22:21.273 INFO:tasks.workunit.client.1.vm06.stdout:8/454: fsync d1/df/d20/f64 0
2026-03-10T06:22:21.280 INFO:tasks.workunit.client.1.vm06.stdout:7/566: sync
2026-03-10T06:22:21.281 INFO:tasks.workunit.client.1.vm06.stdout:7/567: dread - d19/d3b/d41/fb2 zero size
2026-03-10T06:22:21.287 INFO:tasks.workunit.client.1.vm06.stdout:7/568: creat d19/d3b/d41/d72/fb9 x:0 0 0
2026-03-10T06:22:21.352 INFO:tasks.workunit.client.1.vm06.stdout:4/488: fsync dd/d24/d2d/d2f/d39/d71/f90 0
2026-03-10T06:22:21.352 INFO:tasks.workunit.client.1.vm06.stdout:4/489: truncate dd/d24/f69 904665 0
2026-03-10T06:22:21.407 INFO:tasks.workunit.client.1.vm06.stdout:9/539: write d21/f3e [4643135,130616] 0
2026-03-10T06:22:21.419 INFO:tasks.workunit.client.1.vm06.stdout:9/540: symlink d21/d27/d50/d57/db2/d80/lba 0
2026-03-10T06:22:21.419 INFO:tasks.workunit.client.1.vm06.stdout:0/493: write d0/dd/f67 [456955,93840] 0
2026-03-10T06:22:21.419 INFO:tasks.workunit.client.1.vm06.stdout:8/455: getdents d1/df/d20 0
2026-03-10T06:22:21.426 INFO:tasks.workunit.client.1.vm06.stdout:2/474: truncate da/d13/d1c/f7e 805822 0
2026-03-10T06:22:21.426 INFO:tasks.workunit.client.1.vm06.stdout:0/494: truncate d0/d3c/d42/d88/d98/f9a 263132 0
2026-03-10T06:22:21.430 INFO:tasks.workunit.client.1.vm06.stdout:2/475: fsync da/d13/d1c/d43/f79 0
2026-03-10T06:22:21.435 INFO:tasks.workunit.client.1.vm06.stdout:5/411: truncate d8/db/d54/d67/d22/f4d 2860461 0
2026-03-10T06:22:21.435 INFO:tasks.workunit.client.1.vm06.stdout:3/504: truncate d6/dc/d13/d9d/f57 598470 0
2026-03-10T06:22:21.435 INFO:tasks.workunit.client.1.vm06.stdout:1/516: write d9/f2f [1413243,51470] 0
2026-03-10T06:22:21.436 INFO:tasks.workunit.client.1.vm06.stdout:7/569: write d19/d3b/d41/d42/d52/d83/f8f [1662232,82004] 0
2026-03-10T06:22:21.436 INFO:tasks.workunit.client.1.vm06.stdout:5/412: chown d8/db/f48 3866492 1
2026-03-10T06:22:21.444 INFO:tasks.workunit.client.1.vm06.stdout:5/413: sync
2026-03-10T06:22:21.448 INFO:tasks.workunit.client.1.vm06.stdout:1/517: rename d9/d35/d46/f74 to d9/d62/f8a 0
2026-03-10T06:22:21.449 INFO:tasks.workunit.client.1.vm06.stdout:4/490: dwrite dd/d18/f1d [0,4194304] 0
2026-03-10T06:22:21.450 INFO:tasks.workunit.client.1.vm06.stdout:5/414: write d8/db/d54/d67/d46/f77 [614392,22435] 0
2026-03-10T06:22:21.451 INFO:tasks.workunit.client.1.vm06.stdout:6/553: dwrite d6/df/f6f [0,4194304] 0
2026-03-10T06:22:21.461 INFO:tasks.workunit.client.1.vm06.stdout:2/476: rename da/d13/d1c/d1d to da/d13/d1c/d1d/d44/d53/d90 22
2026-03-10T06:22:21.465 INFO:tasks.workunit.client.1.vm06.stdout:7/570: dwrite f4 [4194304,4194304] 0
2026-03-10T06:22:21.470 INFO:tasks.workunit.client.1.vm06.stdout:2/477: truncate da/d13/d1c/d7d/f81 744200 0
2026-03-10T06:22:21.470
INFO:tasks.workunit.client.1.vm06.stdout:7/571: truncate d19/d3b/d41/d42/d52/fa0 4545698 0 2026-03-10T06:22:21.470 INFO:tasks.workunit.client.1.vm06.stdout:3/505: dwrite d6/d21/f31 [0,4194304] 0 2026-03-10T06:22:21.470 INFO:tasks.workunit.client.1.vm06.stdout:7/572: write d19/d3b/d41/f77 [967871,37389] 0 2026-03-10T06:22:21.475 INFO:tasks.workunit.client.1.vm06.stdout:7/573: truncate d19/f99 1025242 0 2026-03-10T06:22:21.483 INFO:tasks.workunit.client.1.vm06.stdout:1/518: unlink d9/d1b/f7b 0 2026-03-10T06:22:21.491 INFO:tasks.workunit.client.1.vm06.stdout:6/554: rmdir d6/dd/d25/d2c 39 2026-03-10T06:22:21.491 INFO:tasks.workunit.client.1.vm06.stdout:2/478: creat da/d13/d1c/d43/f91 x:0 0 0 2026-03-10T06:22:21.491 INFO:tasks.workunit.client.1.vm06.stdout:6/555: dwrite d6/df/d70/f90 [0,4194304] 0 2026-03-10T06:22:21.491 INFO:tasks.workunit.client.1.vm06.stdout:3/506: dread d6/dc/d13/d51/fa9 [0,4194304] 0 2026-03-10T06:22:21.492 INFO:tasks.workunit.client.1.vm06.stdout:8/456: dwrite d1/f13 [0,4194304] 0 2026-03-10T06:22:21.492 INFO:tasks.workunit.client.1.vm06.stdout:7/574: mknod d19/d3b/d41/d42/d52/d83/d9d/da8/cba 0 2026-03-10T06:22:21.496 INFO:tasks.workunit.client.1.vm06.stdout:8/457: read - d1/df/d20/d21/d5e/f65 zero size 2026-03-10T06:22:21.502 INFO:tasks.workunit.client.1.vm06.stdout:1/519: chown d9/c2e 71510 1 2026-03-10T06:22:21.502 INFO:tasks.workunit.client.1.vm06.stdout:8/458: dwrite d1/f18 [0,4194304] 0 2026-03-10T06:22:21.504 INFO:tasks.workunit.client.1.vm06.stdout:6/556: dwrite d6/dd/d35/f2d [0,4194304] 0 2026-03-10T06:22:21.512 INFO:tasks.workunit.client.1.vm06.stdout:3/507: dwrite d6/d8/d7f/f8c [0,4194304] 0 2026-03-10T06:22:21.516 INFO:tasks.workunit.client.1.vm06.stdout:9/541: dread f20 [0,4194304] 0 2026-03-10T06:22:21.521 INFO:tasks.workunit.client.1.vm06.stdout:6/557: read d6/d7/f2a [1621152,19469] 0 2026-03-10T06:22:21.522 INFO:tasks.workunit.client.1.vm06.stdout:6/558: write d6/d7/f87 [2530160,1520] 0 2026-03-10T06:22:21.528 
INFO:tasks.workunit.client.1.vm06.stdout:8/459: mkdir d1/d3b/d9e 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:9/542: unlink d21/d27/c4a 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:4/491: getdents dd/d24 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:6/559: fdatasync d6/dd/d25/d2c/f9c 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:4/492: fsync dd/d24/d5e/f67 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:9/543: readlink d21/d32/d4d/l8f 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:6/560: read - d6/df/f82 zero size 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:7/575: link d19/d3b/d41/d4c/f4e d19/d3b/d41/d42/d62/fbb 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:7/576: chown d19/d3b/d5b/cb8 1275 1 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:6/561: readlink d6/d7/d37/d43/l88 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:1/520: symlink d9/d35/d46/d38/l8b 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:3/508: mknod d6/d8/d7f/da1/caa 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:8/460: symlink d1/df/d20/d35/l9f 0 2026-03-10T06:22:21.539 INFO:tasks.workunit.client.1.vm06.stdout:4/493: unlink dd/d24/d2d/c27 0 2026-03-10T06:22:21.540 INFO:tasks.workunit.client.1.vm06.stdout:8/461: dread - d1/d3b/d5c/f62 zero size 2026-03-10T06:22:21.540 INFO:tasks.workunit.client.1.vm06.stdout:4/494: write dd/f5c [476847,68589] 0 2026-03-10T06:22:21.542 INFO:tasks.workunit.client.1.vm06.stdout:6/562: chown d6/dd/d35/cb7 2083879 1 2026-03-10T06:22:21.548 INFO:tasks.workunit.client.1.vm06.stdout:3/509: write d6/d21/f99 [678447,59761] 0 2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:6/563: fdatasync d6/dd/d25/d4e/f83 0 2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:9/544: creat d21/d27/d3a/fbb x:0 0 0 
2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:9/545: dread - d21/d32/f8b zero size 2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:2/479: dread da/d13/d1a/d39/f70 [0,4194304] 0 2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:1/521: mkdir d9/d35/d46/d38/d8c 0 2026-03-10T06:22:21.549 INFO:tasks.workunit.client.1.vm06.stdout:9/546: creat d21/da2/da7/fbc x:0 0 0 2026-03-10T06:22:21.550 INFO:tasks.workunit.client.1.vm06.stdout:2/480: dread - da/d13/d1c/d43/d6e/f71 zero size 2026-03-10T06:22:21.551 INFO:tasks.workunit.client.1.vm06.stdout:8/462: creat d1/df/fa0 x:0 0 0 2026-03-10T06:22:21.552 INFO:tasks.workunit.client.1.vm06.stdout:7/577: dwrite d19/d3b/d41/d42/d62/d80/da1/fac [0,4194304] 0 2026-03-10T06:22:21.557 INFO:tasks.workunit.client.1.vm06.stdout:8/463: rename d1/df/d11/d84 to d1/df/d11/da1 0 2026-03-10T06:22:21.559 INFO:tasks.workunit.client.1.vm06.stdout:6/564: dwrite d6/dd/d25/d33/d4d/f8c [0,4194304] 0 2026-03-10T06:22:21.560 INFO:tasks.workunit.client.1.vm06.stdout:5/415: dread d8/db/d54/d67/d46/f77 [0,4194304] 0 2026-03-10T06:22:21.561 INFO:tasks.workunit.client.1.vm06.stdout:3/510: sync 2026-03-10T06:22:21.565 INFO:tasks.workunit.client.1.vm06.stdout:3/511: chown d6/f5c 3808 1 2026-03-10T06:22:21.565 INFO:tasks.workunit.client.1.vm06.stdout:5/416: write d8/db/d54/d67/d22/d39/f69 [3141910,103806] 0 2026-03-10T06:22:21.566 INFO:tasks.workunit.client.1.vm06.stdout:5/417: stat f7 0 2026-03-10T06:22:21.566 INFO:tasks.workunit.client.1.vm06.stdout:2/481: dwrite da/d13/d5e/f64 [0,4194304] 0 2026-03-10T06:22:21.568 INFO:tasks.workunit.client.1.vm06.stdout:9/547: creat d21/d32/d4d/fbd x:0 0 0 2026-03-10T06:22:21.581 INFO:tasks.workunit.client.1.vm06.stdout:6/565: creat d6/d79/d95/db4/fbd x:0 0 0 2026-03-10T06:22:21.581 INFO:tasks.workunit.client.1.vm06.stdout:1/522: dread d9/d35/d89/f4f [0,4194304] 0 2026-03-10T06:22:21.581 INFO:tasks.workunit.client.1.vm06.stdout:0/495: write d0/fa [1782523,65117] 0 
2026-03-10T06:22:21.582 INFO:tasks.workunit.client.1.vm06.stdout:6/566: dread - d6/df/d40/fa7 zero size 2026-03-10T06:22:21.583 INFO:tasks.workunit.client.1.vm06.stdout:2/482: symlink da/d13/d1a/l92 0 2026-03-10T06:22:21.583 INFO:tasks.workunit.client.1.vm06.stdout:2/483: chown da/d13/d1a 0 1 2026-03-10T06:22:21.585 INFO:tasks.workunit.client.1.vm06.stdout:3/512: rename d6/d21/d38/d88/fa3 to d6/fab 0 2026-03-10T06:22:21.587 INFO:tasks.workunit.client.1.vm06.stdout:0/496: creat d0/d3c/d42/fa1 x:0 0 0 2026-03-10T06:22:21.589 INFO:tasks.workunit.client.1.vm06.stdout:5/418: creat d8/db/d54/d67/d22/d74/f85 x:0 0 0 2026-03-10T06:22:21.591 INFO:tasks.workunit.client.1.vm06.stdout:0/497: truncate d0/dd/f95 418943 0 2026-03-10T06:22:21.591 INFO:tasks.workunit.client.1.vm06.stdout:1/523: symlink d9/d35/d46/d38/d63/l8d 0 2026-03-10T06:22:21.592 INFO:tasks.workunit.client.1.vm06.stdout:2/484: mknod da/d13/d1c/d1d/d44/d53/c93 0 2026-03-10T06:22:21.593 INFO:tasks.workunit.client.1.vm06.stdout:9/548: rename d21/d46/f4e to d21/d32/d4d/d51/fbe 0 2026-03-10T06:22:21.596 INFO:tasks.workunit.client.1.vm06.stdout:5/419: symlink d8/d9/l86 0 2026-03-10T06:22:21.598 INFO:tasks.workunit.client.1.vm06.stdout:0/498: mkdir d0/dd/d1c/da2 0 2026-03-10T06:22:21.601 INFO:tasks.workunit.client.1.vm06.stdout:3/513: symlink d6/lac 0 2026-03-10T06:22:21.602 INFO:tasks.workunit.client.1.vm06.stdout:3/514: chown l2 484848508 1 2026-03-10T06:22:21.602 INFO:tasks.workunit.client.1.vm06.stdout:9/549: dread d21/d32/d4d/f9d [0,4194304] 0 2026-03-10T06:22:21.604 INFO:tasks.workunit.client.1.vm06.stdout:7/578: dread d19/d3b/f68 [0,4194304] 0 2026-03-10T06:22:21.604 INFO:tasks.workunit.client.1.vm06.stdout:6/567: dread d6/dd/d2b/f93 [0,4194304] 0 2026-03-10T06:22:21.605 INFO:tasks.workunit.client.1.vm06.stdout:3/515: dread d6/d21/d38/f6c [0,4194304] 0 2026-03-10T06:22:21.607 INFO:tasks.workunit.client.1.vm06.stdout:9/550: readlink d21/d32/d6e/l53 0 2026-03-10T06:22:21.607 
INFO:tasks.workunit.client.1.vm06.stdout:0/499: fdatasync d0/dd/d14/d1d/f53 0 2026-03-10T06:22:21.607 INFO:tasks.workunit.client.1.vm06.stdout:6/568: dread d6/dd/d25/d4e/f8a [0,4194304] 0 2026-03-10T06:22:21.609 INFO:tasks.workunit.client.1.vm06.stdout:3/516: mknod d6/d21/cad 0 2026-03-10T06:22:21.609 INFO:tasks.workunit.client.1.vm06.stdout:3/517: readlink d6/d1a/l2f 0 2026-03-10T06:22:21.609 INFO:tasks.workunit.client.1.vm06.stdout:0/500: mkdir d0/da3 0 2026-03-10T06:22:21.610 INFO:tasks.workunit.client.1.vm06.stdout:7/579: rename d19/d3b/d41/d42/d62/d80/da1/fac to d19/d3b/d41/d72/d97/fbc 0 2026-03-10T06:22:21.612 INFO:tasks.workunit.client.1.vm06.stdout:1/524: sync 2026-03-10T06:22:21.613 INFO:tasks.workunit.client.1.vm06.stdout:3/518: mkdir d6/d21/d38/d88/dae 0 2026-03-10T06:22:21.617 INFO:tasks.workunit.client.1.vm06.stdout:9/551: rename d21/d32/d4d/d51/c9c to d21/d27/cbf 0 2026-03-10T06:22:21.617 INFO:tasks.workunit.client.1.vm06.stdout:7/580: dwrite d19/f35 [0,4194304] 0 2026-03-10T06:22:21.619 INFO:tasks.workunit.client.1.vm06.stdout:7/581: write f15 [6589843,121749] 0 2026-03-10T06:22:21.623 INFO:tasks.workunit.client.1.vm06.stdout:7/582: dwrite d19/d3b/d41/d42/f6d [0,4194304] 0 2026-03-10T06:22:21.629 INFO:tasks.workunit.client.1.vm06.stdout:3/519: mknod d6/d21/d38/caf 0 2026-03-10T06:22:21.629 INFO:tasks.workunit.client.1.vm06.stdout:1/525: creat d9/d1b/d20/f8e x:0 0 0 2026-03-10T06:22:21.634 INFO:tasks.workunit.client.1.vm06.stdout:7/583: mkdir d19/d3b/d41/da9/dbd 0 2026-03-10T06:22:21.641 INFO:tasks.workunit.client.1.vm06.stdout:1/526: symlink d9/d35/d89/l8f 0 2026-03-10T06:22:21.641 INFO:tasks.workunit.client.1.vm06.stdout:2/485: fsync da/d13/d1c/f7e 0 2026-03-10T06:22:21.641 INFO:tasks.workunit.client.1.vm06.stdout:0/501: dread d0/dd/f4c [0,4194304] 0 2026-03-10T06:22:21.642 INFO:tasks.workunit.client.1.vm06.stdout:3/520: readlink d6/dc/d41/d6d/l77 0 2026-03-10T06:22:21.642 INFO:tasks.workunit.client.1.vm06.stdout:9/552: dread d21/d32/f52 [0,4194304] 
0 2026-03-10T06:22:21.643 INFO:tasks.workunit.client.1.vm06.stdout:0/502: readlink d0/d3c/d42/d88/d35/d74/l9f 0 2026-03-10T06:22:21.643 INFO:tasks.workunit.client.1.vm06.stdout:3/521: fdatasync d6/dc/d13/d35/f3b 0 2026-03-10T06:22:21.644 INFO:tasks.workunit.client.1.vm06.stdout:2/486: chown da/d13/d1c/d1d/d44/d48/d56 1441383440 1 2026-03-10T06:22:21.644 INFO:tasks.workunit.client.1.vm06.stdout:7/584: symlink d19/d3b/d41/d72/d97/lbe 0 2026-03-10T06:22:21.645 INFO:tasks.workunit.client.1.vm06.stdout:3/522: chown d6/d8/f62 466173 1 2026-03-10T06:22:21.648 INFO:tasks.workunit.client.1.vm06.stdout:2/487: mknod da/d13/d1a/d39/d4b/d86/c94 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:0/503: dwrite d0/f5 [0,4194304] 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:7/585: unlink d19/d3b/d41/d42/d52/d9f/fad 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:3/523: rename d6/d1a/f4a to d6/d21/fb0 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:2/488: symlink da/d13/d1c/d1d/d44/d46/l95 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:2/489: write da/f75 [2878162,34412] 0 2026-03-10T06:22:21.660 INFO:tasks.workunit.client.1.vm06.stdout:0/504: dwrite d0/dd/f67 [0,4194304] 0 2026-03-10T06:22:21.666 INFO:tasks.workunit.client.1.vm06.stdout:3/524: creat d6/dc/d13/d51/fb1 x:0 0 0 2026-03-10T06:22:21.667 INFO:tasks.workunit.client.1.vm06.stdout:3/525: readlink d6/d4f/l6a 0 2026-03-10T06:22:21.668 INFO:tasks.workunit.client.1.vm06.stdout:3/526: chown d6/d21/fb0 219 1 2026-03-10T06:22:21.672 INFO:tasks.workunit.client.1.vm06.stdout:0/505: dwrite d0/d3c/d42/f60 [0,4194304] 0 2026-03-10T06:22:21.688 INFO:tasks.workunit.client.1.vm06.stdout:7/586: link d19/d3b/d41/d42/d62/f7c d19/d3b/d41/d42/d52/d83/d9d/fbf 0 2026-03-10T06:22:21.689 INFO:tasks.workunit.client.1.vm06.stdout:1/527: dread d9/d35/f53 [0,4194304] 0 2026-03-10T06:22:21.689 INFO:tasks.workunit.client.1.vm06.stdout:1/528: chown 
d9/d1b/d20/d44/f6f 327905 1 2026-03-10T06:22:21.689 INFO:tasks.workunit.client.1.vm06.stdout:1/529: fsync d9/f2f 0 2026-03-10T06:22:21.690 INFO:tasks.workunit.client.1.vm06.stdout:3/527: fdatasync d6/f91 0 2026-03-10T06:22:21.693 INFO:tasks.workunit.client.1.vm06.stdout:8/464: truncate d1/df/d11/f1d 2719708 0 2026-03-10T06:22:21.700 INFO:tasks.workunit.client.1.vm06.stdout:8/465: write d1/df/d20/d21/f69 [988315,7888] 0 2026-03-10T06:22:21.702 INFO:tasks.workunit.client.1.vm06.stdout:1/530: fsync d9/d35/d46/d38/d63/f80 0 2026-03-10T06:22:21.703 INFO:tasks.workunit.client.1.vm06.stdout:4/495: dwrite dd/d33/f3f [0,4194304] 0 2026-03-10T06:22:21.712 INFO:tasks.workunit.client.1.vm06.stdout:1/531: fdatasync d9/d35/d46/f71 0 2026-03-10T06:22:21.727 INFO:tasks.workunit.client.1.vm06.stdout:7/587: creat d19/fc0 x:0 0 0 2026-03-10T06:22:21.728 INFO:tasks.workunit.client.1.vm06.stdout:0/506: rename d0/dd/d1b/d3d/f40 to d0/dd/fa4 0 2026-03-10T06:22:21.730 INFO:tasks.workunit.client.1.vm06.stdout:0/507: write d0/dd/f49 [770371,9010] 0 2026-03-10T06:22:21.730 INFO:tasks.workunit.client.1.vm06.stdout:4/496: dwrite dd/d24/d5e/f6a [0,4194304] 0 2026-03-10T06:22:21.732 INFO:tasks.workunit.client.1.vm06.stdout:1/532: creat d9/d62/f90 x:0 0 0 2026-03-10T06:22:21.737 INFO:tasks.workunit.client.1.vm06.stdout:3/528: dread d6/f29 [0,4194304] 0 2026-03-10T06:22:21.743 INFO:tasks.workunit.client.1.vm06.stdout:8/466: rename d1/df/f77 to d1/df/d11/da1/fa2 0 2026-03-10T06:22:21.743 INFO:tasks.workunit.client.1.vm06.stdout:1/533: truncate d9/d1b/f75 145954 0 2026-03-10T06:22:21.745 INFO:tasks.workunit.client.1.vm06.stdout:7/588: mknod d19/cc1 0 2026-03-10T06:22:21.746 INFO:tasks.workunit.client.1.vm06.stdout:0/508: unlink d0/dd/f44 0 2026-03-10T06:22:21.749 INFO:tasks.workunit.client.1.vm06.stdout:4/497: mknod dd/d33/d36/c96 0 2026-03-10T06:22:21.749 INFO:tasks.workunit.client.1.vm06.stdout:8/467: write d1/d3b/f98 [9196021,16596] 0 2026-03-10T06:22:21.750 
INFO:tasks.workunit.client.1.vm06.stdout:6/569: dread d6/d7/f16 [0,4194304] 0 2026-03-10T06:22:21.755 INFO:tasks.workunit.client.1.vm06.stdout:7/589: mkdir d19/d3b/d41/d42/d52/d9f/dc2 0 2026-03-10T06:22:21.760 INFO:tasks.workunit.client.1.vm06.stdout:4/498: dwrite dd/d18/f5f [0,4194304] 0 2026-03-10T06:22:21.762 INFO:tasks.workunit.client.1.vm06.stdout:6/570: mkdir d6/d79/d95/db4/dbe 0 2026-03-10T06:22:21.762 INFO:tasks.workunit.client.1.vm06.stdout:1/534: link d9/d35/f7e d9/f91 0 2026-03-10T06:22:21.766 INFO:tasks.workunit.client.1.vm06.stdout:6/571: mknod d6/dd/d2b/cbf 0 2026-03-10T06:22:21.766 INFO:tasks.workunit.client.1.vm06.stdout:1/535: dread - d9/d1b/d20/f8e zero size 2026-03-10T06:22:21.770 INFO:tasks.workunit.client.1.vm06.stdout:7/590: symlink d19/d3b/d41/lc3 0 2026-03-10T06:22:21.771 INFO:tasks.workunit.client.1.vm06.stdout:6/572: dread d6/d7/f2a [0,4194304] 0 2026-03-10T06:22:21.773 INFO:tasks.workunit.client.1.vm06.stdout:3/529: creat d6/d21/d38/d88/dae/fb2 x:0 0 0 2026-03-10T06:22:21.774 INFO:tasks.workunit.client.1.vm06.stdout:4/499: unlink dd/d24/d2d/c3c 0 2026-03-10T06:22:21.775 INFO:tasks.workunit.client.1.vm06.stdout:4/500: readlink dd/d24/d2d/d2f/d39/l86 0 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:7/591: fsync d19/d3b/d41/d42/d62/d80/d82/f90 0 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:7/592: chown f4 2457 1 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:9/553: write d21/da2/da7/d93/f94 [413425,105958] 0 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:9/554: truncate d21/d27/fb3 308665 0 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:9/555: chown d21/d27/d3a/c42 20468 1 2026-03-10T06:22:21.779 INFO:tasks.workunit.client.1.vm06.stdout:0/509: getdents d0/dd/d14/d18/d66 0 2026-03-10T06:22:21.780 INFO:tasks.workunit.client.1.vm06.stdout:0/510: readlink d0/d3c/d42/lc 0 2026-03-10T06:22:21.782 INFO:tasks.workunit.client.1.vm06.stdout:0/511: truncate 
d0/d3c/d42/d88/fa0 1034738 0 2026-03-10T06:22:21.782 INFO:tasks.workunit.client.1.vm06.stdout:4/501: dwrite dd/d41/f95 [0,4194304] 0 2026-03-10T06:22:21.784 INFO:tasks.workunit.client.1.vm06.stdout:1/536: creat d9/d35/d46/d38/d8c/f92 x:0 0 0 2026-03-10T06:22:21.785 INFO:tasks.workunit.client.1.vm06.stdout:6/573: creat d6/dd/d25/d33/d4d/fc0 x:0 0 0 2026-03-10T06:22:21.795 INFO:tasks.workunit.client.1.vm06.stdout:7/593: mkdir d19/d3b/d41/d42/d62/dc4 0 2026-03-10T06:22:21.796 INFO:tasks.workunit.client.1.vm06.stdout:4/502: unlink dd/ff 0 2026-03-10T06:22:21.796 INFO:tasks.workunit.client.1.vm06.stdout:9/556: chown c1d 128342 1 2026-03-10T06:22:21.805 INFO:tasks.workunit.client.1.vm06.stdout:1/537: dwrite d9/d1b/d20/d44/f54 [0,4194304] 0 2026-03-10T06:22:21.808 INFO:tasks.workunit.client.1.vm06.stdout:9/557: dwrite d21/d27/d50/d57/fa9 [0,4194304] 0 2026-03-10T06:22:21.817 INFO:tasks.workunit.client.1.vm06.stdout:0/512: rename d0/l64 to d0/d3c/d42/d99/la5 0 2026-03-10T06:22:21.822 INFO:tasks.workunit.client.1.vm06.stdout:6/574: creat d6/dd/d35/fc1 x:0 0 0 2026-03-10T06:22:21.822 INFO:tasks.workunit.client.1.vm06.stdout:4/503: mkdir dd/d33/d47/d97 0 2026-03-10T06:22:21.822 INFO:tasks.workunit.client.1.vm06.stdout:0/513: chown d0/dd/d14/d1d/l23 205631915 1 2026-03-10T06:22:21.823 INFO:tasks.workunit.client.1.vm06.stdout:9/558: dread d21/d27/d50/d57/fa9 [0,4194304] 0 2026-03-10T06:22:21.823 INFO:tasks.workunit.client.1.vm06.stdout:0/514: fsync d0/dd/f49 0 2026-03-10T06:22:21.826 INFO:tasks.workunit.client.1.vm06.stdout:4/504: dwrite dd/d33/f56 [0,4194304] 0 2026-03-10T06:22:21.834 INFO:tasks.workunit.client.1.vm06.stdout:6/575: mkdir d6/dd/dc2 0 2026-03-10T06:22:21.840 INFO:tasks.workunit.client.1.vm06.stdout:1/538: mkdir d9/d35/d46/d38/d63/d83/d93 0 2026-03-10T06:22:21.840 INFO:tasks.workunit.client.1.vm06.stdout:1/539: creat d9/d62/f94 x:0 0 0 2026-03-10T06:22:21.840 INFO:tasks.workunit.client.1.vm06.stdout:6/576: rename d6/df/d40/fa7 to d6/dd/d25/d2c/fc3 0 
2026-03-10T06:22:21.841 INFO:tasks.workunit.client.1.vm06.stdout:0/515: symlink d0/dd/d14/d6b/la6 0 2026-03-10T06:22:21.841 INFO:tasks.workunit.client.1.vm06.stdout:6/577: write d6/d7/f87 [2051510,16508] 0 2026-03-10T06:22:21.841 INFO:tasks.workunit.client.1.vm06.stdout:7/594: sync 2026-03-10T06:22:21.850 INFO:tasks.workunit.client.1.vm06.stdout:0/516: dwrite d0/fa [0,4194304] 0 2026-03-10T06:22:21.851 INFO:tasks.workunit.client.1.vm06.stdout:4/505: dread fc [0,4194304] 0 2026-03-10T06:22:21.851 INFO:tasks.workunit.client.1.vm06.stdout:7/595: creat d19/d3b/d41/d42/d52/fc5 x:0 0 0 2026-03-10T06:22:21.854 INFO:tasks.workunit.client.1.vm06.stdout:7/596: stat d19/d3b/f47 0 2026-03-10T06:22:21.854 INFO:tasks.workunit.client.1.vm06.stdout:6/578: symlink d6/dd/d25/d33/d5a/lc4 0 2026-03-10T06:22:21.855 INFO:tasks.workunit.client.1.vm06.stdout:1/540: symlink d9/l95 0 2026-03-10T06:22:21.858 INFO:tasks.workunit.client.1.vm06.stdout:6/579: write d6/df/f82 [344943,33063] 0 2026-03-10T06:22:21.860 INFO:tasks.workunit.client.1.vm06.stdout:7/597: creat d19/d3b/d41/d42/d52/d9f/fc6 x:0 0 0 2026-03-10T06:22:21.860 INFO:tasks.workunit.client.1.vm06.stdout:4/506: creat dd/d24/d2d/d2f/f98 x:0 0 0 2026-03-10T06:22:21.862 INFO:tasks.workunit.client.1.vm06.stdout:0/517: mkdir d0/dd/d1b/d3d/d50/d91/da7 0 2026-03-10T06:22:21.862 INFO:tasks.workunit.client.1.vm06.stdout:6/580: mkdir d6/dd/d2b/dc5 0 2026-03-10T06:22:21.864 INFO:tasks.workunit.client.1.vm06.stdout:7/598: mknod d19/db0/cc7 0 2026-03-10T06:22:21.868 INFO:tasks.workunit.client.1.vm06.stdout:7/599: rename d19/d3b/d41/d4c/c79 to d19/d3b/d41/d4c/cc8 0 2026-03-10T06:22:21.870 INFO:tasks.workunit.client.1.vm06.stdout:6/581: unlink d6/d79/laf 0 2026-03-10T06:22:21.872 INFO:tasks.workunit.client.1.vm06.stdout:7/600: symlink d19/d3b/d41/d42/d52/d9f/dc2/lc9 0 2026-03-10T06:22:21.875 INFO:tasks.workunit.client.1.vm06.stdout:4/507: dwrite dd/f81 [0,4194304] 0 2026-03-10T06:22:21.876 INFO:tasks.workunit.client.1.vm06.stdout:7/601: chown 
d19/c38 14 1 2026-03-10T06:22:21.878 INFO:tasks.workunit.client.1.vm06.stdout:7/602: mknod d19/d3b/d41/d42/d52/d83/d9d/cca 0 2026-03-10T06:22:21.880 INFO:tasks.workunit.client.1.vm06.stdout:4/508: unlink dd/d33/f58 0 2026-03-10T06:22:21.881 INFO:tasks.workunit.client.1.vm06.stdout:7/603: symlink d19/d3b/d41/d42/d62/d80/d82/lcb 0 2026-03-10T06:22:21.895 INFO:tasks.workunit.client.1.vm06.stdout:4/509: dwrite dd/d41/f52 [0,4194304] 0 2026-03-10T06:22:21.895 INFO:tasks.workunit.client.1.vm06.stdout:7/604: mknod d19/d3b/d41/d42/d62/d80/ccc 0 2026-03-10T06:22:21.896 INFO:tasks.workunit.client.1.vm06.stdout:6/582: dread d6/dd/d25/d4e/f5f [0,4194304] 0 2026-03-10T06:22:21.901 INFO:tasks.workunit.client.1.vm06.stdout:7/605: mknod d19/d3b/d41/d42/d52/d83/ccd 0 2026-03-10T06:22:21.903 INFO:tasks.workunit.client.1.vm06.stdout:7/606: symlink d19/d3b/d41/da9/da5/lce 0 2026-03-10T06:22:21.904 INFO:tasks.workunit.client.1.vm06.stdout:4/510: dread dd/d24/d2d/f28 [0,4194304] 0 2026-03-10T06:22:21.905 INFO:tasks.workunit.client.1.vm06.stdout:6/583: getdents d6 0 2026-03-10T06:22:21.905 INFO:tasks.workunit.client.1.vm06.stdout:6/584: chown d6/df/d40/d99 71537 1 2026-03-10T06:22:21.907 INFO:tasks.workunit.client.1.vm06.stdout:9/559: dread d21/d32/d4d/f64 [0,4194304] 0 2026-03-10T06:22:21.909 INFO:tasks.workunit.client.1.vm06.stdout:4/511: dwrite dd/d33/f53 [0,4194304] 0 2026-03-10T06:22:21.911 INFO:tasks.workunit.client.1.vm06.stdout:6/585: creat d6/d79/fc6 x:0 0 0 2026-03-10T06:22:21.911 INFO:tasks.workunit.client.1.vm06.stdout:4/512: chown dd/d24/f8f 6298 1 2026-03-10T06:22:21.916 INFO:tasks.workunit.client.1.vm06.stdout:9/560: rename d21/d7d to d21/d27/d56/dc0 0 2026-03-10T06:22:21.916 INFO:tasks.workunit.client.1.vm06.stdout:6/586: mkdir d6/dd/dc7 0 2026-03-10T06:22:21.920 INFO:tasks.workunit.client.1.vm06.stdout:6/587: mknod d6/df/d70/cc8 0 2026-03-10T06:22:21.920 INFO:tasks.workunit.client.1.vm06.stdout:9/561: creat d21/d32/d6e/fc1 x:0 0 0 2026-03-10T06:22:21.923 
INFO:tasks.workunit.client.1.vm06.stdout:6/588: write d6/d79/fc6 [484414,15272] 0 2026-03-10T06:22:21.927 INFO:tasks.workunit.client.1.vm06.stdout:6/589: creat d6/df/d40/fc9 x:0 0 0 2026-03-10T06:22:21.930 INFO:tasks.workunit.client.1.vm06.stdout:6/590: write d6/dd/d25/d4e/f60 [4990251,130586] 0 2026-03-10T06:22:21.931 INFO:tasks.workunit.client.1.vm06.stdout:6/591: fdatasync d6/dd/d35/f97 0 2026-03-10T06:22:21.932 INFO:tasks.workunit.client.1.vm06.stdout:6/592: creat d6/dd/d35/fca x:0 0 0 2026-03-10T06:22:21.934 INFO:tasks.workunit.client.1.vm06.stdout:6/593: write d6/d7/d37/d43/f77 [200258,110472] 0 2026-03-10T06:22:21.937 INFO:tasks.workunit.client.1.vm06.stdout:0/518: sync 2026-03-10T06:22:21.939 INFO:tasks.workunit.client.1.vm06.stdout:6/594: read - d6/df/d40/fc9 zero size 2026-03-10T06:22:21.942 INFO:tasks.workunit.client.1.vm06.stdout:0/519: symlink d0/d3c/d42/d88/d9e/la8 0 2026-03-10T06:22:21.943 INFO:tasks.workunit.client.1.vm06.stdout:6/595: creat d6/dd/d25/d33/d5a/dae/fcb x:0 0 0 2026-03-10T06:22:21.949 INFO:tasks.workunit.client.1.vm06.stdout:0/520: dwrite d0/fa [0,4194304] 0 2026-03-10T06:22:21.952 INFO:tasks.workunit.client.1.vm06.stdout:0/521: dread d0/d3c/d42/d88/d35/f51 [0,4194304] 0 2026-03-10T06:22:21.961 INFO:tasks.workunit.client.1.vm06.stdout:0/522: dread d0/d3c/d42/d88/d47/d4d/f57 [0,4194304] 0 2026-03-10T06:22:21.964 INFO:tasks.workunit.client.1.vm06.stdout:0/523: dwrite d0/fa [0,4194304] 0 2026-03-10T06:22:21.969 INFO:tasks.workunit.client.1.vm06.stdout:0/524: rename d0/dd/d14/d18/d66/c7b to d0/dd/d14/d18/d7e/ca9 0 2026-03-10T06:22:21.971 INFO:tasks.workunit.client.1.vm06.stdout:0/525: write d0/dd/f32 [7083851,1166] 0 2026-03-10T06:22:21.973 INFO:tasks.workunit.client.1.vm06.stdout:0/526: symlink d0/dd/d14/d1d/laa 0 2026-03-10T06:22:21.974 INFO:tasks.workunit.client.1.vm06.stdout:0/527: chown d0/d3c/d42/d5e/f76 77953596 1 2026-03-10T06:22:21.975 INFO:tasks.workunit.client.1.vm06.stdout:0/528: mkdir d0/d3c/d42/dab 0 2026-03-10T06:22:21.983 
INFO:tasks.workunit.client.1.vm06.stdout:0/529: mknod d0/d3c/d42/d88/d98/cac 0 2026-03-10T06:22:21.988 INFO:tasks.workunit.client.1.vm06.stdout:0/530: creat d0/d3c/d42/dab/fad x:0 0 0 2026-03-10T06:22:21.990 INFO:tasks.workunit.client.1.vm06.stdout:0/531: unlink d0/dd/d1c/l78 0 2026-03-10T06:22:21.991 INFO:tasks.workunit.client.1.vm06.stdout:0/532: dread - d0/d3c/d42/dab/fad zero size 2026-03-10T06:22:21.991 INFO:tasks.workunit.client.1.vm06.stdout:0/533: chown d0/dd/d14/d18/d85 37603 1 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: Upgrade: Updating mgr.vm06.wwotdr 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:22.002 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:21 vm06.local ceph-mon[58974]: Deploying daemon mgr.vm06.wwotdr on vm06 2026-03-10T06:22:22.037 INFO:tasks.workunit.client.1.vm06.stdout:2/490: write da/d13/f1f [2925831,8269] 0 
2026-03-10T06:22:22.048 INFO:tasks.workunit.client.1.vm06.stdout:2/491: symlink da/d13/d1c/d1d/d44/d53/l96 0
2026-03-10T06:22:22.048 INFO:tasks.workunit.client.1.vm06.stdout:5/420: truncate d8/db/d54/d67/d22/f4d 3095803 0
2026-03-10T06:22:22.050 INFO:tasks.workunit.client.1.vm06.stdout:2/492: symlink da/d13/d1c/d1d/d44/d53/l97 0
2026-03-10T06:22:22.052 INFO:tasks.workunit.client.1.vm06.stdout:5/421: creat d8/db/d54/d55/f87 x:0 0 0
2026-03-10T06:22:22.052 INFO:tasks.workunit.client.1.vm06.stdout:5/422: readlink d8/d9/l6b 0
2026-03-10T06:22:22.054 INFO:tasks.workunit.client.1.vm06.stdout:2/493: link da/f84 da/d13/d1c/d1d/d44/d53/f98 0
2026-03-10T06:22:22.056 INFO:tasks.workunit.client.1.vm06.stdout:5/423: link d8/db/d54/f5c d8/db/d54/f88 0
2026-03-10T06:22:22.057 INFO:tasks.workunit.client.1.vm06.stdout:8/468: dwrite d1/df/d11/f45 [0,4194304] 0
2026-03-10T06:22:22.058 INFO:tasks.workunit.client.1.vm06.stdout:2/494: mknod da/d13/d1c/d1d/c99 0
2026-03-10T06:22:22.059 INFO:tasks.workunit.client.1.vm06.stdout:5/424: creat d8/db/d54/d67/d22/d39/d72/f89 x:0 0 0
2026-03-10T06:22:22.060 INFO:tasks.workunit.client.1.vm06.stdout:2/495: creat da/d13/d5e/f9a x:0 0 0
2026-03-10T06:22:22.067 INFO:tasks.workunit.client.1.vm06.stdout:8/469: dwrite d1/df/fa0 [0,4194304] 0
2026-03-10T06:22:22.068 INFO:tasks.workunit.client.1.vm06.stdout:2/496: unlink da/f75 0
2026-03-10T06:22:22.069 INFO:tasks.workunit.client.1.vm06.stdout:2/497: chown da/d13/d1c/f76 1791475 1
2026-03-10T06:22:22.069 INFO:tasks.workunit.client.1.vm06.stdout:2/498: chown da/ld 28049 1
2026-03-10T06:22:22.073 INFO:tasks.workunit.client.1.vm06.stdout:2/499: mkdir da/d13/d1c/d43/d6e/d9b 0
2026-03-10T06:22:22.078 INFO:tasks.workunit.client.1.vm06.stdout:3/530: dwrite d6/dc/d13/d35/f95 [4194304,4194304] 0
2026-03-10T06:22:22.084 INFO:tasks.workunit.client.1.vm06.stdout:2/500: dwrite da/d13/d1c/f42 [0,4194304] 0
2026-03-10T06:22:22.086 INFO:tasks.workunit.client.1.vm06.stdout:2/501: write da/d13/d1c/d1d/d44/d53/d61/f89 [292942,105446] 0
2026-03-10T06:22:22.104 INFO:tasks.workunit.client.1.vm06.stdout:2/502: mknod da/d13/d1c/d43/c9c 0
2026-03-10T06:22:22.104 INFO:tasks.workunit.client.1.vm06.stdout:3/531: chown d6/dc/d72/c79 35 1
2026-03-10T06:22:22.107 INFO:tasks.workunit.client.1.vm06.stdout:1/541: dread d9/d62/f87 [0,4194304] 0
2026-03-10T06:22:22.109 INFO:tasks.workunit.client.1.vm06.stdout:2/503: dwrite da/d13/d1c/d43/f7a [0,4194304] 0
2026-03-10T06:22:22.110 INFO:tasks.workunit.client.1.vm06.stdout:2/504: readlink da/d13/d1c/d1d/d44/d46/l73 0
2026-03-10T06:22:22.116 INFO:tasks.workunit.client.1.vm06.stdout:1/542: symlink d9/d35/d46/d38/d63/l96 0
2026-03-10T06:22:22.124 INFO:tasks.workunit.client.1.vm06.stdout:4/513: dread dd/f43 [0,4194304] 0
2026-03-10T06:22:22.128 INFO:tasks.workunit.client.1.vm06.stdout:4/514: fdatasync dd/d24/d5e/f6a 0
2026-03-10T06:22:22.128 INFO:tasks.workunit.client.1.vm06.stdout:4/515: dread - dd/d24/d5d/f94 zero size
2026-03-10T06:22:22.128 INFO:tasks.workunit.client.1.vm06.stdout:1/543: dwrite d9/d35/f53 [0,4194304] 0
2026-03-10T06:22:22.128 INFO:tasks.workunit.client.1.vm06.stdout:4/516: truncate dd/d18/f55 4245208 0
2026-03-10T06:22:22.130 INFO:tasks.workunit.client.1.vm06.stdout:4/517: chown f2 28778528 1
2026-03-10T06:22:22.134 INFO:tasks.workunit.client.1.vm06.stdout:4/518: read dd/d33/f56 [1039952,118319] 0
2026-03-10T06:22:22.135 INFO:tasks.workunit.client.1.vm06.stdout:8/470: dread d1/fa [0,4194304] 0
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: Upgrade: Updating mgr.vm06.wwotdr
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:22:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:21 vm04.local ceph-mon[51058]: Deploying daemon mgr.vm06.wwotdr on vm06
2026-03-10T06:22:22.178 INFO:tasks.workunit.client.1.vm06.stdout:8/471: sync
2026-03-10T06:22:22.184 INFO:tasks.workunit.client.1.vm06.stdout:8/472: symlink d1/df/d58/la3 0
2026-03-10T06:22:22.186 INFO:tasks.workunit.client.1.vm06.stdout:8/473: creat d1/df/d20/d21/d5e/fa4 x:0 0 0
2026-03-10T06:22:22.187 INFO:tasks.workunit.client.1.vm06.stdout:8/474: write d1/df/d20/f51 [540756,36413] 0
2026-03-10T06:22:22.188 INFO:tasks.workunit.client.1.vm06.stdout:8/475: chown d1/d2c/d5b/l68 49 1
2026-03-10T06:22:22.216 INFO:tasks.workunit.client.1.vm06.stdout:7/607: dwrite f13 [0,4194304] 0
2026-03-10T06:22:22.216 INFO:tasks.workunit.client.1.vm06.stdout:9/562: truncate d21/d32/d4d/f64 1396900 0
2026-03-10T06:22:22.217 INFO:tasks.workunit.client.1.vm06.stdout:9/563: write d21/d32/d4d/fbd [379714,87718] 0
2026-03-10T06:22:22.219 INFO:tasks.workunit.client.1.vm06.stdout:6/596: truncate d6/dd/d25/d4e/f60 243320 0
2026-03-10T06:22:22.221 INFO:tasks.workunit.client.1.vm06.stdout:9/564: mkdir d21/da2/dc2 0
2026-03-10T06:22:22.222 INFO:tasks.workunit.client.1.vm06.stdout:7/608: dread d19/f24 [0,4194304] 0
2026-03-10T06:22:22.224 INFO:tasks.workunit.client.1.vm06.stdout:0/534: truncate d0/dd/d1b/f2f 2614370 0
2026-03-10T06:22:22.226 INFO:tasks.workunit.client.1.vm06.stdout:9/565: getdents d21/d32/d4d/d51/d67 0
2026-03-10T06:22:22.227 INFO:tasks.workunit.client.1.vm06.stdout:7/609: dwrite d19/d3b/f68 [0,4194304] 0
2026-03-10T06:22:22.229 INFO:tasks.workunit.client.1.vm06.stdout:0/535: getdents d0/dd/d14/d1d/d5d 0
2026-03-10T06:22:22.230 INFO:tasks.workunit.client.1.vm06.stdout:0/536: chown d0/d3c/d42/d88/d35/f51 2 1
2026-03-10T06:22:22.236 INFO:tasks.workunit.client.1.vm06.stdout:9/566: write d21/f49 [4287584,29597] 0
2026-03-10T06:22:22.236 INFO:tasks.workunit.client.1.vm06.stdout:9/567: write d21/da2/da7/f96 [3607614,77351] 0
2026-03-10T06:22:22.236 INFO:tasks.workunit.client.1.vm06.stdout:5/425: rename d8/db/d54/d67/d22 to d8/db/d54/d8a 0
2026-03-10T06:22:22.243 INFO:tasks.workunit.client.1.vm06.stdout:7/610: creat d19/d3b/d41/d4c/fcf x:0 0 0
2026-03-10T06:22:22.247 INFO:tasks.workunit.client.1.vm06.stdout:1/544: rename d9/d35/d46/d38/d63/l96 to d9/d35/d46/d38/d63/d83/l97 0
2026-03-10T06:22:22.247 INFO:tasks.workunit.client.1.vm06.stdout:1/545: dread - d9/d62/f90 zero size
2026-03-10T06:22:22.247 INFO:tasks.workunit.client.1.vm06.stdout:5/426: creat d8/db/d54/d8a/d39/d72/f8b x:0 0 0
2026-03-10T06:22:22.247 INFO:tasks.workunit.client.1.vm06.stdout:5/427: truncate d8/db/d54/d67/d46/f76 975332 0
2026-03-10T06:22:22.248 INFO:tasks.workunit.client.1.vm06.stdout:5/428: write d8/db/d54/d8a/d74/f71 [747521,87707] 0
2026-03-10T06:22:22.248 INFO:tasks.workunit.client.1.vm06.stdout:1/546: dwrite d9/f1f [4194304,4194304] 0
2026-03-10T06:22:22.249 INFO:tasks.workunit.client.1.vm06.stdout:4/519: rename dd/d24/d2d/f28 to dd/d24/d2d/d2f/d34/d40/f99 0
2026-03-10T06:22:22.249 INFO:tasks.workunit.client.1.vm06.stdout:1/547: chown d9/d35/d46 1725 1
2026-03-10T06:22:22.254 INFO:tasks.workunit.client.1.vm06.stdout:4/520: dwrite dd/d24/f69 [0,4194304] 0
2026-03-10T06:22:22.256 INFO:tasks.workunit.client.1.vm06.stdout:0/537: creat d0/d3c/d42/d88/fae x:0 0 0
2026-03-10T06:22:22.257 INFO:tasks.workunit.client.1.vm06.stdout:0/538: write d0/d3c/d42/d88/d35/f7f [299785,25106] 0
2026-03-10T06:22:22.265 INFO:tasks.workunit.client.1.vm06.stdout:1/548: link d9/d35/d46/l5b d9/d35/d46/l98 0
2026-03-10T06:22:22.265 INFO:tasks.workunit.client.1.vm06.stdout:5/429: rename d8/c34 to d8/db/d54/c8c 0
2026-03-10T06:22:22.266 INFO:tasks.workunit.client.1.vm06.stdout:1/549: fsync d9/d35/d89/f3d 0
2026-03-10T06:22:22.267 INFO:tasks.workunit.client.1.vm06.stdout:1/550: creat d9/d62/f99 x:0 0 0
2026-03-10T06:22:22.268 INFO:tasks.workunit.client.1.vm06.stdout:1/551: readlink d9/la 0
2026-03-10T06:22:22.268 INFO:tasks.workunit.client.1.vm06.stdout:1/552: write d9/d35/d89/f3d [873483,45387] 0
2026-03-10T06:22:22.271 INFO:tasks.workunit.client.1.vm06.stdout:1/553: rename d9/d1b/d20/f43 to d9/d35/d46/d38/d8c/f9a 0
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/554: chown d9/d35/d46/d38/d8c/f9a 6 1
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/555: creat d9/d35/d89/f9b x:0 0 0
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/556: creat d9/d35/d46/d38/d63/d83/d93/f9c x:0 0 0
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/557: write d9/d1b/d20/d44/f54 [2267618,90019] 0
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/558: write d9/f1f [4839039,82601] 0
2026-03-10T06:22:22.280 INFO:tasks.workunit.client.1.vm06.stdout:1/559: mknod d9/d35/d46/d38/d63/c9d 0
2026-03-10T06:22:22.281 INFO:tasks.workunit.client.1.vm06.stdout:4/521: read f8 [14919370,106438] 0
2026-03-10T06:22:22.283 INFO:tasks.workunit.client.1.vm06.stdout:1/560: dwrite d9/d35/d46/d38/d8c/f9a [0,4194304] 0
2026-03-10T06:22:22.285 INFO:tasks.workunit.client.1.vm06.stdout:4/522: truncate dd/d24/d2d/d2f/d34/d83/f87 990623 0
2026-03-10T06:22:22.286 INFO:tasks.workunit.client.1.vm06.stdout:6/597: sync
2026-03-10T06:22:22.287 INFO:tasks.workunit.client.1.vm06.stdout:1/561: symlink d9/d35/d46/d38/d63/d83/d93/l9e 0
2026-03-10T06:22:22.295 INFO:tasks.workunit.client.1.vm06.stdout:6/598: mkdir d6/dd/d25/d33/d5a/dcc 0
2026-03-10T06:22:22.295 INFO:tasks.workunit.client.1.vm06.stdout:4/523: creat dd/d33/d47/d97/f9a x:0 0 0
2026-03-10T06:22:22.295 INFO:tasks.workunit.client.1.vm06.stdout:1/562: dread d9/d35/d89/f14 [0,4194304] 0
2026-03-10T06:22:22.295 INFO:tasks.workunit.client.1.vm06.stdout:6/599: mknod d6/dd/dc7/ccd 0
2026-03-10T06:22:22.296 INFO:tasks.workunit.client.1.vm06.stdout:1/563: chown d9/d1b/c6a 9631524 1
2026-03-10T06:22:22.296 INFO:tasks.workunit.client.1.vm06.stdout:6/600: symlink d6/d79/d95/db4/dbe/lce 0
2026-03-10T06:22:22.296 INFO:tasks.workunit.client.1.vm06.stdout:0/539: sync
2026-03-10T06:22:22.296 INFO:tasks.workunit.client.1.vm06.stdout:9/568: sync
2026-03-10T06:22:22.297 INFO:tasks.workunit.client.1.vm06.stdout:1/564: dwrite d9/d35/d46/f88 [0,4194304] 0
2026-03-10T06:22:22.297 INFO:tasks.workunit.client.1.vm06.stdout:9/569: truncate d21/f33 4750592 0
2026-03-10T06:22:22.301 INFO:tasks.workunit.client.1.vm06.stdout:9/570: dwrite d21/d27/d3a/fbb [0,4194304] 0
2026-03-10T06:22:22.301 INFO:tasks.workunit.client.1.vm06.stdout:6/601: link d6/d7/d37/d43/l88 d6/dd/d25/d33/lcf 0
2026-03-10T06:22:22.303 INFO:tasks.workunit.client.1.vm06.stdout:9/571: readlink d21/d32/d6e/l3b 0
2026-03-10T06:22:22.311 INFO:tasks.workunit.client.1.vm06.stdout:6/602: rename d6/dd/d2b to d6/dd/d25/d33/d5a/d78/dd0 0
2026-03-10T06:22:22.313 INFO:tasks.workunit.client.1.vm06.stdout:9/572: symlink d21/d32/lc3 0
2026-03-10T06:22:22.314 INFO:tasks.workunit.client.1.vm06.stdout:1/565: mkdir d9/d9f 0
2026-03-10T06:22:22.322 INFO:tasks.workunit.client.1.vm06.stdout:1/566: dwrite d9/f1a [0,4194304] 0
2026-03-10T06:22:22.327 INFO:tasks.workunit.client.1.vm06.stdout:1/567: unlink d9/d35/d46/d38/d63/f6b 0
2026-03-10T06:22:22.327 INFO:tasks.workunit.client.1.vm06.stdout:1/568: chown d9/d62/f94 2365208 1
2026-03-10T06:22:22.329 INFO:tasks.workunit.client.1.vm06.stdout:1/569: rename d9/d9f to d9/d35/d46/d38/da0 0
2026-03-10T06:22:22.329 INFO:tasks.workunit.client.1.vm06.stdout:0/540: sync
2026-03-10T06:22:22.330 INFO:tasks.workunit.client.1.vm06.stdout:0/541: stat d0/dd/d1b/d7d 0
2026-03-10T06:22:22.331 INFO:tasks.workunit.client.1.vm06.stdout:1/570: creat d9/d35/d46/d38/d63/d83/fa1 x:0 0 0
2026-03-10T06:22:22.333 INFO:tasks.workunit.client.1.vm06.stdout:1/571: write d9/d35/d46/d38/d8c/f9a [2033243,125361] 0
2026-03-10T06:22:22.334 INFO:tasks.workunit.client.1.vm06.stdout:0/542: dwrite d0/dd/d14/f70 [0,4194304] 0
2026-03-10T06:22:22.336 INFO:tasks.workunit.client.1.vm06.stdout:0/543: chown d0/dd/d1b/f72 136 1
2026-03-10T06:22:22.336 INFO:tasks.workunit.client.1.vm06.stdout:1/572: creat d9/d1b/d20/d44/fa2 x:0 0 0
2026-03-10T06:22:22.338 INFO:tasks.workunit.client.1.vm06.stdout:1/573: creat d9/d62/fa3 x:0 0 0
2026-03-10T06:22:22.349 INFO:tasks.workunit.client.1.vm06.stdout:1/574: dwrite d9/d35/d89/f9b [0,4194304] 0
2026-03-10T06:22:22.350 INFO:tasks.workunit.client.1.vm06.stdout:9/573: dread ff [0,4194304] 0
2026-03-10T06:22:22.350 INFO:tasks.workunit.client.1.vm06.stdout:0/544: dread d0/dd/f49 [0,4194304] 0
2026-03-10T06:22:22.351 INFO:tasks.workunit.client.1.vm06.stdout:0/545: dread - d0/d3c/d42/d88/fae zero size
2026-03-10T06:22:22.352 INFO:tasks.workunit.client.1.vm06.stdout:1/575: write d9/d35/d46/d38/d63/f80 [1071346,62039] 0
2026-03-10T06:22:22.352 INFO:tasks.workunit.client.1.vm06.stdout:9/574: fsync d21/d32/d4d/f6b 0
2026-03-10T06:22:22.356 INFO:tasks.workunit.client.1.vm06.stdout:9/575: rename d21/d27/faf to d21/d27/d50/d57/db2/d80/d95/fc4 0
2026-03-10T06:22:22.356 INFO:tasks.workunit.client.1.vm06.stdout:0/546: creat d0/d3c/d42/d88/d98/faf x:0 0 0
2026-03-10T06:22:22.357 INFO:tasks.workunit.client.1.vm06.stdout:9/576: read - d21/da2/da7/fbc zero size
2026-03-10T06:22:22.360 INFO:tasks.workunit.client.1.vm06.stdout:0/547: write d0/dd/f10 [1222778,13442] 0
2026-03-10T06:22:22.363 INFO:tasks.workunit.client.1.vm06.stdout:0/548: truncate d0/dd/fa4 1054139 0
2026-03-10T06:22:22.365 INFO:tasks.workunit.client.1.vm06.stdout:0/549: mknod d0/dd/d1b/d3d/d50/cb0 0
2026-03-10T06:22:22.367 INFO:tasks.workunit.client.1.vm06.stdout:0/550: creat d0/d3c/d42/d88/d98/fb1 x:0 0 0
2026-03-10T06:22:22.376 INFO:tasks.workunit.client.1.vm06.stdout:1/576: dwrite d9/d35/d89/f4d [0,4194304] 0
2026-03-10T06:22:22.376 INFO:tasks.workunit.client.1.vm06.stdout:1/577: readlink d9/d35/d46/l5b 0
2026-03-10T06:22:22.376 INFO:tasks.workunit.client.1.vm06.stdout:1/578: stat d9/c69 0
2026-03-10T06:22:22.376 INFO:tasks.workunit.client.1.vm06.stdout:0/551: getdents d0/d3c/d42/d88/d47 0
2026-03-10T06:22:22.376 INFO:tasks.workunit.client.1.vm06.stdout:1/579: symlink d9/d35/d46/d38/d63/d83/la4 0
2026-03-10T06:22:22.378 INFO:tasks.workunit.client.1.vm06.stdout:1/580: rmdir d9/d35 39
2026-03-10T06:22:22.379 INFO:tasks.workunit.client.1.vm06.stdout:0/552: link d0/d3c/d42/d88/d9e/la8 d0/dd/lb2 0
2026-03-10T06:22:22.382 INFO:tasks.workunit.client.1.vm06.stdout:0/553: dread d0/d3c/d42/f60 [0,4194304] 0
2026-03-10T06:22:22.385 INFO:tasks.workunit.client.1.vm06.stdout:4/524: read fa [3850863,113772] 0
2026-03-10T06:22:22.390 INFO:tasks.workunit.client.1.vm06.stdout:4/525: rename dd/d18/l4f to dd/d72/l9b 0
2026-03-10T06:22:22.397 INFO:tasks.workunit.client.1.vm06.stdout:1/581: dwrite d9/d1b/d20/d44/f54 [0,4194304] 0
2026-03-10T06:22:22.397 INFO:tasks.workunit.client.1.vm06.stdout:3/532: dwrite d6/f1c [0,4194304] 0
2026-03-10T06:22:22.398 INFO:tasks.workunit.client.1.vm06.stdout:0/554: dwrite d0/ff [4194304,4194304] 0
2026-03-10T06:22:22.400 INFO:tasks.workunit.client.1.vm06.stdout:4/526: dwrite dd/d24/f69 [0,4194304] 0
2026-03-10T06:22:22.400 INFO:tasks.workunit.client.1.vm06.stdout:3/533: read d6/f29 [975413,78777] 0
2026-03-10T06:22:22.404 INFO:tasks.workunit.client.1.vm06.stdout:3/534: write d6/d8/f49 [853007,116001] 0
2026-03-10T06:22:22.405 INFO:tasks.workunit.client.1.vm06.stdout:3/535: write d6/d21/f31 [2846137,120488] 0
2026-03-10T06:22:22.413 INFO:tasks.workunit.client.1.vm06.stdout:0/555: chown d0/dd/d14/l4f 0 1
2026-03-10T06:22:22.414 INFO:tasks.workunit.client.1.vm06.stdout:2/505: dwrite da/d13/d1c/f2d [0,4194304] 0
2026-03-10T06:22:22.417 INFO:tasks.workunit.client.1.vm06.stdout:0/556: dread - d0/d3c/d42/d99/f9c zero size
2026-03-10T06:22:22.418 INFO:tasks.workunit.client.1.vm06.stdout:4/527: dwrite dd/d24/d5e/f6a [0,4194304] 0
2026-03-10T06:22:22.420 INFO:tasks.workunit.client.1.vm06.stdout:2/506: rename da/d13/d1a/d39/f70 to da/d13/d1a/d39/d4b/f9d 0
2026-03-10T06:22:22.426 INFO:tasks.workunit.client.1.vm06.stdout:3/536: dwrite d6/d21/f30 [0,4194304] 0
2026-03-10T06:22:22.429 INFO:tasks.workunit.client.1.vm06.stdout:3/537: write d6/fab [354044,79862] 0
2026-03-10T06:22:22.431 INFO:tasks.workunit.client.1.vm06.stdout:2/507: creat da/d13/d5e/f9e x:0 0 0
2026-03-10T06:22:22.434 INFO:tasks.workunit.client.1.vm06.stdout:3/538: dwrite d6/dc/d13/f8d [0,4194304] 0
2026-03-10T06:22:22.445 INFO:tasks.workunit.client.1.vm06.stdout:4/528: dwrite dd/d24/d5e/f6a [0,4194304] 0
2026-03-10T06:22:22.448 INFO:tasks.workunit.client.1.vm06.stdout:0/557: getdents d0/d3c/d42/d88/d47/d4d 0
2026-03-10T06:22:22.449 INFO:tasks.workunit.client.1.vm06.stdout:0/558: write d0/d3c/d42/d88/d98/fb1 [88974,27213] 0
2026-03-10T06:22:22.449 INFO:tasks.workunit.client.1.vm06.stdout:3/539: mkdir d6/dc/d13/db3 0
2026-03-10T06:22:22.450 INFO:tasks.workunit.client.1.vm06.stdout:4/529: write dd/d18/f1d [1373098,114487] 0
2026-03-10T06:22:22.456 INFO:tasks.workunit.client.1.vm06.stdout:0/559: unlink d0/d3c/d42/d88/d35/d74/l97 0
2026-03-10T06:22:22.467 INFO:tasks.workunit.client.1.vm06.stdout:0/560: dread - d0/d3c/d42/fa1 zero size
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/530: mkdir dd/d24/d9c 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/540: creat d6/d21/fb4 x:0 0 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:0/561: write d0/f46 [2583082,121403] 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/531: creat dd/d24/d5d/f9d x:0 0 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/541: mknod d6/d21/d38/d39/d90/cb5 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/532: creat dd/d24/f9e x:0 0 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/542: dread d6/dc/d13/d9d/f86 [4194304,4194304] 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/533: dread dd/d24/d2d/d2f/d34/d83/f87 [0,4194304] 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/534: chown dd/d24/d2d/d2f/d39 107669 1
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/543: mknod d6/d21/d38/d88/cb6 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:4/535: chown dd/fe 28485067 1
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/544: creat d6/dc/d41/fb7 x:0 0 0
2026-03-10T06:22:22.468 INFO:tasks.workunit.client.1.vm06.stdout:3/545: chown d6/d1a/l2f 5 1
2026-03-10T06:22:22.469 INFO:tasks.workunit.client.1.vm06.stdout:3/546: stat d6/dc/d13/l98 0
2026-03-10T06:22:22.474 INFO:tasks.workunit.client.1.vm06.stdout:4/536: dwrite dd/d24/d2d/d2f/d39/d71/f90 [0,4194304] 0
2026-03-10T06:22:22.478 INFO:tasks.workunit.client.1.vm06.stdout:3/547: dwrite d6/d21/f99 [0,4194304] 0
2026-03-10T06:22:22.483 INFO:tasks.workunit.client.1.vm06.stdout:3/548: chown d6/dc/d13/l3c 686296 1
2026-03-10T06:22:22.491 INFO:tasks.workunit.client.1.vm06.stdout:3/549: chown d6/dc/d13/d9d 1697 1
2026-03-10T06:22:22.491 INFO:tasks.workunit.client.1.vm06.stdout:3/550: creat d6/dc/d13/d9d/d54/fb8 x:0 0 0
2026-03-10T06:22:22.491 INFO:tasks.workunit.client.1.vm06.stdout:3/551: creat d6/d1a/fb9 x:0 0 0
2026-03-10T06:22:22.499 INFO:tasks.workunit.client.1.vm06.stdout:8/476: dwrite d1/df/d11/f47 [0,4194304] 0
2026-03-10T06:22:22.501 INFO:tasks.workunit.client.1.vm06.stdout:8/477: read d1/df/d20/f51 [64611,24102] 0
2026-03-10T06:22:22.565 INFO:tasks.workunit.client.1.vm06.stdout:8/478: dread f0 [0,4194304] 0
2026-03-10T06:22:22.577 INFO:tasks.workunit.client.1.vm06.stdout:3/552: dread d6/f84 [0,4194304] 0
2026-03-10T06:22:22.579 INFO:tasks.workunit.client.1.vm06.stdout:3/553: creat d6/dc/d13/fba x:0 0 0
2026-03-10T06:22:22.580 INFO:tasks.workunit.client.1.vm06.stdout:3/554: dread - d6/d21/fa4 zero size
2026-03-10T06:22:22.580 INFO:tasks.workunit.client.1.vm06.stdout:3/555: write d6/d21/fa4 [491722,96657] 0
2026-03-10T06:22:22.583 INFO:tasks.workunit.client.1.vm06.stdout:3/556: creat d6/dc/d13/d9d/fbb x:0 0 0
2026-03-10T06:22:22.585 INFO:tasks.workunit.client.1.vm06.stdout:3/557: rename d6/dc/d13/db3 to d6/d21/dbc 0
2026-03-10T06:22:22.628 INFO:tasks.workunit.client.1.vm06.stdout:7/611: dwrite d19/d3b/d41/da9/da5/fa6 [0,4194304] 0
2026-03-10T06:22:22.633 INFO:tasks.workunit.client.1.vm06.stdout:7/612: rename d19/d3b/d41/da9/da5/lce to d19/db0/ld0 0
2026-03-10T06:22:22.634 INFO:tasks.workunit.client.1.vm06.stdout:5/430: rename d8/db/d54 to d8/db/d54/d67/d46/d68/d8d 22
2026-03-10T06:22:22.634 INFO:tasks.workunit.client.1.vm06.stdout:7/613: write d19/d3b/d41/d72/fb9 [324529,64133] 0
2026-03-10T06:22:22.639 INFO:tasks.workunit.client.1.vm06.stdout:5/431: creat d8/d9/f8e x:0 0 0
2026-03-10T06:22:22.641 INFO:tasks.workunit.client.1.vm06.stdout:5/432: symlink d8/db/d57/l8f 0
2026-03-10T06:22:22.643 INFO:tasks.workunit.client.1.vm06.stdout:5/433: fdatasync d8/db/d54/d8a/d39/f44 0
2026-03-10T06:22:22.644 INFO:tasks.workunit.client.1.vm06.stdout:5/434: read d8/db/d54/f5c [3842771,127517] 0
2026-03-10T06:22:22.645 INFO:tasks.workunit.client.1.vm06.stdout:1/582: rmdir d9/d62 39
2026-03-10T06:22:22.647 INFO:tasks.workunit.client.1.vm06.stdout:5/435: mkdir d8/db/d54/d8a/d74/d90 0
2026-03-10T06:22:22.648 INFO:tasks.workunit.client.1.vm06.stdout:1/583: mknod d9/d35/d46/d38/d63/ca5 0
2026-03-10T06:22:22.648 INFO:tasks.workunit.client.1.vm06.stdout:5/436: fdatasync d8/db/d54/d8a/d74/f66 0
2026-03-10T06:22:22.649 INFO:tasks.workunit.client.1.vm06.stdout:1/584: mknod d9/d1b/d20/ca6 0
2026-03-10T06:22:22.650 INFO:tasks.workunit.client.1.vm06.stdout:5/437: rmdir d8/db/d54/d67/d46/d6e 39
2026-03-10T06:22:22.650 INFO:tasks.workunit.client.1.vm06.stdout:5/438: truncate d8/db/f45 1490423 0
2026-03-10T06:22:22.651 INFO:tasks.workunit.client.1.vm06.stdout:1/585: read d9/f34 [2305326,712] 0
2026-03-10T06:22:22.652 INFO:tasks.workunit.client.1.vm06.stdout:5/439: creat d8/db/d54/d8a/d39/d6c/f91 x:0 0 0
2026-03-10T06:22:22.653 INFO:tasks.workunit.client.1.vm06.stdout:5/440: fdatasync d8/f3f 0
2026-03-10T06:22:22.662 INFO:tasks.workunit.client.1.vm06.stdout:2/508: dread da/d13/d1c/d1d/d44/d53/f67 [0,4194304] 0
2026-03-10T06:22:22.663 INFO:tasks.workunit.client.1.vm06.stdout:2/509: fdatasync da/d13/d1c/d1d/d44/d48/f57 0
2026-03-10T06:22:22.664 INFO:tasks.workunit.client.1.vm06.stdout:2/510: truncate da/d13/d5e/f9e 1014004 0
2026-03-10T06:22:22.666 INFO:tasks.workunit.client.1.vm06.stdout:2/511: dread da/d13/d1c/f2d [0,4194304] 0
2026-03-10T06:22:22.666 INFO:tasks.workunit.client.1.vm06.stdout:2/512: chown da 10436265 1
2026-03-10T06:22:22.667 INFO:tasks.workunit.client.1.vm06.stdout:2/513: stat da/d13/d1a/l34 0
2026-03-10T06:22:22.668 INFO:tasks.workunit.client.1.vm06.stdout:2/514: symlink da/d13/d1c/d1d/d44/d46/l9f 0
2026-03-10T06:22:22.670 INFO:tasks.workunit.client.1.vm06.stdout:2/515: symlink da/d13/d1c/la0 0
2026-03-10T06:22:22.671 INFO:tasks.workunit.client.1.vm06.stdout:2/516: mkdir da/d13/d1a/d39/d35/da1 0
2026-03-10T06:22:22.679 INFO:tasks.workunit.client.1.vm06.stdout:6/603: dread d6/f62 [0,4194304] 0
2026-03-10T06:22:22.679 INFO:tasks.workunit.client.1.vm06.stdout:6/604: chown d6/df 844 1
2026-03-10T06:22:22.691 INFO:tasks.workunit.client.1.vm06.stdout:6/605: chown d6/dd/d25/d33/d5a/d78/dd0/cac 256971 1
2026-03-10T06:22:22.694 INFO:tasks.workunit.client.1.vm06.stdout:6/606: dwrite d6/d79/fc6 [0,4194304] 0
2026-03-10T06:22:22.707 INFO:tasks.workunit.client.1.vm06.stdout:6/607: rename d6/df/d40/c66 to d6/d7/d37/d43/cd1 0
2026-03-10T06:22:22.708 INFO:tasks.workunit.client.1.vm06.stdout:6/608: unlink d6/dd/d25/d33/l3a 0
2026-03-10T06:22:22.709 INFO:tasks.workunit.client.1.vm06.stdout:6/609: stat d6/d7/d37/l50 0
2026-03-10T06:22:22.710 INFO:tasks.workunit.client.1.vm06.stdout:6/610: unlink d6/df/f6f 0
2026-03-10T06:22:22.715 INFO:tasks.workunit.client.1.vm06.stdout:4/537: dwrite fc [0,4194304] 0
2026-03-10T06:22:22.722 INFO:tasks.workunit.client.1.vm06.stdout:4/538: rename dd/d33/f8b to dd/f9f 0
2026-03-10T06:22:22.722 INFO:tasks.workunit.client.1.vm06.stdout:4/539: readlink dd/d33/l73 0
2026-03-10T06:22:22.722 INFO:tasks.workunit.client.1.vm06.stdout:4/540: chown cb 8 1
2026-03-10T06:22:22.723 INFO:tasks.workunit.client.1.vm06.stdout:4/541: fsync dd/d33/f56 0
2026-03-10T06:22:22.724 INFO:tasks.workunit.client.1.vm06.stdout:4/542: chown dd/d33/f84 19 1
2026-03-10T06:22:22.725 INFO:tasks.workunit.client.1.vm06.stdout:4/543: mknod dd/d24/d2d/ca0 0
2026-03-10T06:22:22.725 INFO:tasks.workunit.client.1.vm06.stdout:4/544: write dd/f5c [451887,119650] 0
2026-03-10T06:22:22.729 INFO:tasks.workunit.client.1.vm06.stdout:4/545: dwrite dd/d24/d2d/d2f/f98 [0,4194304] 0
2026-03-10T06:22:22.732 INFO:tasks.workunit.client.1.vm06.stdout:4/546: symlink dd/d24/d2d/d2f/d39/d71/la1 0
2026-03-10T06:22:22.733 INFO:tasks.workunit.client.1.vm06.stdout:4/547: fsync dd/d18/f32 0
2026-03-10T06:22:22.738 INFO:tasks.workunit.client.1.vm06.stdout:4/548: dwrite dd/d33/d36/f74 [0,4194304] 0
2026-03-10T06:22:22.738 INFO:tasks.workunit.client.1.vm06.stdout:4/549: dread - dd/d24/f8f zero size
2026-03-10T06:22:22.738 INFO:tasks.workunit.client.1.vm06.stdout:4/550: dread - dd/d18/d75/f91 zero size
2026-03-10T06:22:22.743 INFO:tasks.workunit.client.1.vm06.stdout:4/551: read dd/d18/f1d [203586,50990] 0
2026-03-10T06:22:22.754 INFO:tasks.workunit.client.1.vm06.stdout:8/479: truncate d1/df/f6d 1084717 0
2026-03-10T06:22:22.755 INFO:tasks.workunit.client.1.vm06.stdout:0/562: dread d0/ff [0,4194304] 0
2026-03-10T06:22:22.757 INFO:tasks.workunit.client.1.vm06.stdout:0/563: link d0/f61 d0/dd/d14/d1d/d5d/fb3 0
2026-03-10T06:22:22.759 INFO:tasks.workunit.client.1.vm06.stdout:0/564: dwrite d0/fa [0,4194304] 0
2026-03-10T06:22:22.770 INFO:tasks.workunit.client.1.vm06.stdout:8/480: dread d1/d7/f4f [0,4194304] 0
2026-03-10T06:22:22.771 INFO:tasks.workunit.client.1.vm06.stdout:8/481: write d1/df/d20/d21/d7e/d8d/f95 [4264963,12347] 0
2026-03-10T06:22:22.772 INFO:tasks.workunit.client.1.vm06.stdout:8/482: write d1/d3b/d5c/f7a [839196,26299] 0
2026-03-10T06:22:22.774 INFO:tasks.workunit.client.1.vm06.stdout:8/483: creat d1/df/d11/da1/fa5 x:0 0 0
2026-03-10T06:22:22.775 INFO:tasks.workunit.client.1.vm06.stdout:8/484: write d1/f1c [2618397,130097] 0
2026-03-10T06:22:22.782 INFO:tasks.workunit.client.1.vm06.stdout:6/611: dread d6/dd/f5b [0,4194304] 0
2026-03-10T06:22:22.786 INFO:tasks.workunit.client.1.vm06.stdout:6/612: symlink d6/dd/d25/d33/d5a/d78/dd0/dc5/ld2 0
2026-03-10T06:22:22.786 INFO:tasks.workunit.client.1.vm06.stdout:6/613: dread - d6/d79/d95/db4/fbd zero size
2026-03-10T06:22:22.786 INFO:tasks.workunit.client.1.vm06.stdout:6/614: rmdir d6/d79/d95 39
2026-03-10T06:22:22.786 INFO:tasks.workunit.client.1.vm06.stdout:6/615: symlink d6/dd/dc7/ld3 0
2026-03-10T06:22:22.789 INFO:tasks.workunit.client.1.vm06.stdout:6/616: mkdir d6/d79/d95/db4/dd4 0
2026-03-10T06:22:22.791 INFO:tasks.workunit.client.1.vm06.stdout:6/617: truncate d6/dd/d25/d33/d5a/fa1 1000060 0
2026-03-10T06:22:22.793 INFO:tasks.workunit.client.1.vm06.stdout:6/618: rename d6/fa8 to d6/d79/d95/db4/dd4/fd5 0
2026-03-10T06:22:22.797 INFO:tasks.workunit.client.1.vm06.stdout:3/558: rmdir d6/dc/d13 39
2026-03-10T06:22:22.798 INFO:tasks.workunit.client.1.vm06.stdout:3/559: mkdir d6/d1a/d5b/dbd 0
2026-03-10T06:22:22.798 INFO:tasks.workunit.client.1.vm06.stdout:3/560: stat d6/c76 0
2026-03-10T06:22:22.801 INFO:tasks.workunit.client.1.vm06.stdout:5/441: write d8/db/d54/f5c [80665,78007] 0
2026-03-10T06:22:22.802 INFO:tasks.workunit.client.1.vm06.stdout:3/561: rename d6/dc/d41/d6d/f70 to d6/d4f/fbe 0
2026-03-10T06:22:22.803 INFO:tasks.workunit.client.1.vm06.stdout:5/442: chown d8/f3f 6607002 1
2026-03-10T06:22:22.804 INFO:tasks.workunit.client.1.vm06.stdout:3/562: mknod d6/d1a/cbf 0
2026-03-10T06:22:22.807 INFO:tasks.workunit.client.1.vm06.stdout:5/443: write d8/db/d54/d8a/d39/f69 [4647087,125550] 0
2026-03-10T06:22:22.808 INFO:tasks.workunit.client.1.vm06.stdout:3/563: fsync d6/dc/d13/d51/fb1 0
2026-03-10T06:22:22.814 INFO:tasks.workunit.client.1.vm06.stdout:5/444: readlink d8/db/d54/d67/d46/d6e/l7c 0
2026-03-10T06:22:22.817 INFO:tasks.workunit.client.1.vm06.stdout:7/614: dwrite d19/d3b/d41/f66 [0,4194304] 0
2026-03-10T06:22:22.819 INFO:tasks.workunit.client.1.vm06.stdout:3/564: creat d6/dc/fc0 x:0 0 0
2026-03-10T06:22:22.819 INFO:tasks.workunit.client.1.vm06.stdout:7/615: truncate d19/fc0 641330 0
2026-03-10T06:22:22.820 INFO:tasks.workunit.client.1.vm06.stdout:3/565: readlink d6/d21/d38/l9c 0
2026-03-10T06:22:22.826 INFO:tasks.workunit.client.1.vm06.stdout:7/616: dwrite d19/d3b/d41/f49 [0,4194304] 0
2026-03-10T06:22:22.839 INFO:tasks.workunit.client.1.vm06.stdout:7/617: chown d19/l31 19 1
2026-03-10T06:22:22.840 INFO:tasks.workunit.client.1.vm06.stdout:7/618: mknod d19/cd1 0
2026-03-10T06:22:22.843 INFO:tasks.workunit.client.1.vm06.stdout:7/619: chown d19/d3b/d41/d4c/c88 71634 1
2026-03-10T06:22:22.844 INFO:tasks.workunit.client.1.vm06.stdout:7/620: dread d19/fc0 [0,4194304] 0
2026-03-10T06:22:22.848 INFO:tasks.workunit.client.1.vm06.stdout:7/621: mkdir d19/d3b/d41/da9/dbd/dd2 0
2026-03-10T06:22:22.852 INFO:tasks.workunit.client.1.vm06.stdout:7/622: creat d19/d3b/d41/d42/d52/d9f/dc2/fd3 x:0 0 0
2026-03-10T06:22:22.854 INFO:tasks.workunit.client.1.vm06.stdout:1/586: write d9/f58 [736862,33241] 0
2026-03-10T06:22:22.857 INFO:tasks.workunit.client.1.vm06.stdout:1/587: creat d9/d1b/d20/fa7 x:0 0 0
2026-03-10T06:22:22.872 INFO:tasks.workunit.client.1.vm06.stdout:2/517: truncate da/d13/d1c/d43/d6e/f77 1681490 0
2026-03-10T06:22:22.896 INFO:tasks.workunit.client.1.vm06.stdout:4/552: truncate dd/d18/f32 1961436 0
2026-03-10T06:22:22.898 INFO:tasks.workunit.client.1.vm06.stdout:4/553: symlink dd/d33/d36/la2 0
2026-03-10T06:22:22.922 INFO:tasks.workunit.client.1.vm06.stdout:0/565: fsync d0/dd/d14/d1d/d5d/fb3 0
2026-03-10T06:22:22.923 INFO:tasks.workunit.client.1.vm06.stdout:0/566: creat d0/d3c/d42/d88/d35/d74/fb4 x:0 0 0
2026-03-10T06:22:22.924 INFO:tasks.workunit.client.1.vm06.stdout:0/567: write d0/d3c/d42/f54 [776742,111238] 0
2026-03-10T06:22:22.928 INFO:tasks.workunit.client.1.vm06.stdout:0/568: dread d0/fa [0,4194304] 0
2026-03-10T06:22:22.929 INFO:tasks.workunit.client.1.vm06.stdout:0/569: chown d0/dd/c69 41 1
2026-03-10T06:22:22.929 INFO:tasks.workunit.client.1.vm06.stdout:0/570: chown d0/d3c 9 1
2026-03-10T06:22:22.929 INFO:tasks.workunit.client.1.vm06.stdout:0/571: stat d0/c15 0
2026-03-10T06:22:22.930 INFO:tasks.workunit.client.1.vm06.stdout:8/485: truncate d1/df/fa0 3536986 0
2026-03-10T06:22:22.931 INFO:tasks.workunit.client.1.vm06.stdout:8/486: chown d1/df/d58/f6a 120238497 1
2026-03-10T06:22:22.937 INFO:tasks.workunit.client.1.vm06.stdout:5/445: truncate d8/d9/f11 1676509 0
2026-03-10T06:22:22.937 INFO:tasks.workunit.client.1.vm06.stdout:5/446: fsync d8/db/d54/d8a/d74/f71 0
2026-03-10T06:22:22.939 INFO:tasks.workunit.client.1.vm06.stdout:5/447: fdatasync d8/ff 0
2026-03-10T06:22:22.940 INFO:tasks.workunit.client.1.vm06.stdout:4/554: sync
2026-03-10T06:22:22.942 INFO:tasks.workunit.client.1.vm06.stdout:5/448: sync
2026-03-10T06:22:22.942 INFO:tasks.workunit.client.1.vm06.stdout:0/572: sync
2026-03-10T06:22:22.943 INFO:tasks.workunit.client.1.vm06.stdout:5/449: write d8/db/d54/d8a/d39/d6c/f91 [813490,15653] 0
2026-03-10T06:22:22.944 INFO:tasks.workunit.client.1.vm06.stdout:0/573: write d0/dd/d14/d18/f90 [2466118,97095] 0
2026-03-10T06:22:22.949 INFO:tasks.workunit.client.1.vm06.stdout:0/574: mknod d0/d3c/d42/d5e/cb5 0
2026-03-10T06:22:22.950 INFO:tasks.workunit.client.1.vm06.stdout:5/450: fdatasync d8/db/f1f 0
2026-03-10T06:22:22.952 INFO:tasks.workunit.client.1.vm06.stdout:0/575: dread d0/d3c/d42/d88/d35/f7f [0,4194304] 0
2026-03-10T06:22:22.953 INFO:tasks.workunit.client.1.vm06.stdout:5/451: unlink d8/db/d54/d8a/d39/d72/f7f 0
2026-03-10T06:22:22.954 INFO:tasks.workunit.client.1.vm06.stdout:3/566: truncate d6/d1a/f1f 718639 0
2026-03-10T06:22:22.955 INFO:tasks.workunit.client.1.vm06.stdout:7/623: getdents d19/d3b/d41/da9/dbd 0
2026-03-10T06:22:22.957 INFO:tasks.workunit.client.1.vm06.stdout:5/452: write d8/db/d54/d67/d46/f77 [1314594,21595] 0
2026-03-10T06:22:22.957 INFO:tasks.workunit.client.1.vm06.stdout:0/576: rename d0/dd/d14/d1d/d5d/fb3 to d0/d3c/d42/fb6 0
2026-03-10T06:22:22.959 INFO:tasks.workunit.client.1.vm06.stdout:1/588: write d9/d62/f8a [727021,96531] 0
2026-03-10T06:22:22.960 INFO:tasks.workunit.client.1.vm06.stdout:5/453: chown d8/db/d54/d8a/d74/f3b 6573 1
2026-03-10T06:22:22.965 INFO:tasks.workunit.client.1.vm06.stdout:3/567: dwrite d6/dc/f94 [0,4194304] 0
2026-03-10T06:22:22.966 INFO:tasks.workunit.client.1.vm06.stdout:0/577: fsync d0/d3c/d42/d88/f8a 0
2026-03-10T06:22:22.966 INFO:tasks.workunit.client.1.vm06.stdout:7/624: creat d19/d3b/d41/d42/fd4 x:0 0 0
2026-03-10T06:22:22.967 INFO:tasks.workunit.client.1.vm06.stdout:0/578: chown d0/dd/f24 56 1
2026-03-10T06:22:22.967 INFO:tasks.workunit.client.1.vm06.stdout:7/625: stat d19/d3b/f7b 0
2026-03-10T06:22:22.967 INFO:tasks.workunit.client.1.vm06.stdout:1/589: mknod d9/d35/d46/d38/d63/d83/ca8 0
2026-03-10T06:22:22.968 INFO:tasks.workunit.client.1.vm06.stdout:7/626: fdatasync d19/d3b/d5b/f69 0
2026-03-10T06:22:22.968 INFO:tasks.workunit.client.1.vm06.stdout:3/568: fsync d6/d21/d38/d39/d90/f9e 0
2026-03-10T06:22:22.968 INFO:tasks.workunit.client.1.vm06.stdout:1/590: write d9/d35/d46/d38/f82 [919411,109533] 0
2026-03-10T06:22:22.969 INFO:tasks.workunit.client.1.vm06.stdout:7/627: write d19/d3b/f6b [611034,20295] 0
2026-03-10T06:22:22.969 INFO:tasks.workunit.client.1.vm06.stdout:3/569: chown d6/d8/d7f/da1 173027 1
2026-03-10T06:22:22.970 INFO:tasks.workunit.client.1.vm06.stdout:1/591: write d9/d35/d46/f7a [330881,50235] 0
2026-03-10T06:22:22.970 INFO:tasks.workunit.client.1.vm06.stdout:0/579: mkdir d0/d3c/d42/db7 0
2026-03-10T06:22:22.974 INFO:tasks.workunit.client.1.vm06.stdout:0/580: truncate d0/f5 4838693 0
2026-03-10T06:22:22.983 INFO:tasks.workunit.client.1.vm06.stdout:1/592: dread d9/d35/d46/d38/d8c/f9a [0,4194304] 0
2026-03-10T06:22:22.984 INFO:tasks.workunit.client.1.vm06.stdout:1/593: write d9/d1b/d20/d44/f85 [758696,37091] 0
2026-03-10T06:22:22.985 INFO:tasks.workunit.client.1.vm06.stdout:7/628: link d19/d3b/d41/d42/d52/d83/f94 d19/d3b/d41/d42/d62/d80/fd5 0
2026-03-10T06:22:22.986 INFO:tasks.workunit.client.1.vm06.stdout:7/629: chown d19/d3b/d41/d42/d62/f7c 304170545 1
2026-03-10T06:22:22.989 INFO:tasks.workunit.client.1.vm06.stdout:7/630: dwrite f15 [0,4194304] 0
2026-03-10T06:22:22.994 INFO:tasks.workunit.client.1.vm06.stdout:1/594: sync
2026-03-10T06:22:22.994 INFO:tasks.workunit.client.1.vm06.stdout:1/595: fsync d9/d1b/f7c 0
2026-03-10T06:22:22.995 INFO:tasks.workunit.client.1.vm06.stdout:1/596: chown d9/d1b/d20/ca6 145 1
2026-03-10T06:22:22.995 INFO:tasks.workunit.client.1.vm06.stdout:7/631: mkdir d19/d3b/d41/d42/d52/d83/d9d/da8/dd6 0
2026-03-10T06:22:22.997 INFO:tasks.workunit.client.1.vm06.stdout:0/581: link d0/d3c/d42/d88/l4b d0/d3c/d42/d88/lb8 0
2026-03-10T06:22:22.999 INFO:tasks.workunit.client.1.vm06.stdout:1/597: dwrite d9/d1b/d20/fa7 [0,4194304] 0
2026-03-10T06:23:23.003 INFO:tasks.workunit.client.1.vm06.stdout:0/582: creat d0/dd/d1c/da2/fb9 x:0 0 0
2026-03-10T06:22:23.013 INFO:tasks.workunit.client.1.vm06.stdout:2/518: write da/d13/d1c/d1d/d44/d48/f57 [1067269,17743] 0
2026-03-10T06:22:23.015 INFO:tasks.workunit.client.1.vm06.stdout:1/598: dwrite d9/d35/d46/f7a [0,4194304] 0
2026-03-10T06:22:23.019 INFO:tasks.workunit.client.1.vm06.stdout:0/583: rename d0/dd/l96 to d0/dd/d14/d1d/d5d/lba 0
2026-03-10T06:22:23.020 INFO:tasks.workunit.client.1.vm06.stdout:1/599: creat d9/d35/d46/d38/d8c/fa9 x:0 0 0
2026-03-10T06:22:23.021 INFO:tasks.workunit.client.1.vm06.stdout:0/584: truncate d0/d3c/d42/d99/f9c 730193 0
2026-03-10T06:22:23.023 INFO:tasks.workunit.client.1.vm06.stdout:2/519: getdents da/d13/d1a/d39/d4b/d86 0
2026-03-10T06:22:23.025 INFO:tasks.workunit.client.1.vm06.stdout:6/619: write d6/dd/d25/f69 [183161,28258] 0
2026-03-10T06:22:23.026 INFO:tasks.workunit.client.1.vm06.stdout:8/487: write d1/d7/fd [231417,64179] 0
2026-03-10T06:22:23.033 INFO:tasks.workunit.client.1.vm06.stdout:2/520: symlink da/d13/d1a/d39/d4b/d86/la2 0
2026-03-10T06:22:23.038 INFO:tasks.workunit.client.1.vm06.stdout:4/555: dwrite dd/d24/f45 [0,4194304] 0
2026-03-10T06:22:23.038 INFO:tasks.workunit.client.1.vm06.stdout:2/521: write da/d13/f1f [1036724,86878] 0
2026-03-10T06:22:23.038 INFO:tasks.workunit.client.1.vm06.stdout:6/620: chown d6/d7/c8e 1154 1
2026-03-10T06:22:23.038 INFO:tasks.workunit.client.1.vm06.stdout:6/621: chown d6/dd/d25/d2c/l63 30165 1
2026-03-10T06:22:23.039 INFO:tasks.workunit.client.1.vm06.stdout:6/622: truncate d6/dd/d25/d33/d4d/fc0 775303 0
2026-03-10T06:22:23.042 INFO:tasks.workunit.client.1.vm06.stdout:4/556: write dd/d24/d2d/d2f/f82 [1230433,19818] 0
2026-03-10T06:22:23.048 INFO:tasks.workunit.client.1.vm06.stdout:6/623: creat d6/dd/d25/d2c/fd6 x:0 0 0
2026-03-10T06:22:23.048 INFO:tasks.workunit.client.1.vm06.stdout:7/632: dread d19/f20 [0,4194304] 0
2026-03-10T06:22:23.049 INFO:tasks.workunit.client.1.vm06.stdout:7/633: write d19/d3b/d41/d72/d97/fa3 [660652,28228] 0
2026-03-10T06:22:23.049 INFO:tasks.workunit.client.1.vm06.stdout:4/557: mknod dd/d41/ca3 0
2026-03-10T06:22:23.053 INFO:tasks.workunit.client.1.vm06.stdout:2/522: mknod da/d13/d1c/d43/d6e/d9b/ca3 0
2026-03-10T06:22:23.053 INFO:tasks.workunit.client.1.vm06.stdout:6/624: symlink d6/dd/d25/d4e/ld7 0
2026-03-10T06:22:23.054 INFO:tasks.workunit.client.1.vm06.stdout:2/523: stat da/d13/d1c/d1d 0
2026-03-10T06:22:23.060 INFO:tasks.workunit.client.1.vm06.stdout:7/634: fdatasync d19/d3b/d41/d42/d52/d83/f94 0
2026-03-10T06:22:23.060 INFO:tasks.workunit.client.1.vm06.stdout:4/558: unlink dd/f80 0
2026-03-10T06:22:23.061 INFO:tasks.workunit.client.1.vm06.stdout:4/559: chown dd/d24/d2d/d2f/d34/d83 104471645 1
2026-03-10T06:22:23.061 INFO:tasks.workunit.client.1.vm06.stdout:7/635: creat d19/d3b/d41/d42/d62/d80/fd7 x:0 0 0
2026-03-10T06:22:23.063 INFO:tasks.workunit.client.1.vm06.stdout:7/636: dread - d19/d3b/d41/d42/d52/d9f/dc2/fd3 zero size
2026-03-10T06:22:23.064 INFO:tasks.workunit.client.1.vm06.stdout:2/524: dwrite da/f19 [8388608,4194304] 0
2026-03-10T06:22:23.064 INFO:tasks.workunit.client.1.vm06.stdout:7/637: truncate d19/d3b/f68 4654422 0
2026-03-10T06:22:23.069 INFO:tasks.workunit.client.1.vm06.stdout:2/525: creat da/d13/d1c/d7d/fa4 x:0 0 0
2026-03-10T06:22:23.069 INFO:tasks.workunit.client.1.vm06.stdout:7/638: creat d19/d3b/d41/d42/d52/d83/fd8 x:0 0 0
2026-03-10T06:22:23.071 INFO:tasks.workunit.client.1.vm06.stdout:4/560: creat dd/d18/d8e/fa4 x:0 0 0
2026-03-10T06:22:23.074 INFO:tasks.workunit.client.1.vm06.stdout:4/561: truncate dd/d18/d75/f76 450282 0
2026-03-10T06:22:23.074 INFO:tasks.workunit.client.1.vm06.stdout:7/639: mknod d19/d3b/d41/d42/d52/cd9 0
2026-03-10T06:22:23.076 INFO:tasks.workunit.client.1.vm06.stdout:7/640: write d19/d3b/d41/d42/d52/d83/f94 [1001762,126486] 0
2026-03-10T06:22:23.076 INFO:tasks.workunit.client.1.vm06.stdout:4/562: symlink dd/d33/la5 0
2026-03-10T06:22:23.080 INFO:tasks.workunit.client.1.vm06.stdout:4/563: mkdir dd/d33/da6 0
2026-03-10T06:22:23.093 INFO:tasks.workunit.client.1.vm06.stdout:7/641: dread - d19/d3b/d41/d72/d97/fb3 zero size
2026-03-10T06:22:23.093 INFO:tasks.workunit.client.1.vm06.stdout:4/564: creat dd/fa7 x:0 0 0
2026-03-10T06:22:23.093 INFO:tasks.workunit.client.1.vm06.stdout:2/526: dwrite da/d13/d5e/f8f [0,4194304] 0
2026-03-10T06:22:23.093 INFO:tasks.workunit.client.1.vm06.stdout:7/642: symlink d19/d3b/d41/d42/d52/d83/d9d/lda 0
2026-03-10T06:22:23.093 INFO:tasks.workunit.client.1.vm06.stdout:2/527: dread da/d13/d1c/d1d/d44/d53/f67 [0,4194304] 0
2026-03-10T06:22:23.096 INFO:tasks.workunit.client.1.vm06.stdout:7/643: dwrite fa [0,4194304] 0
2026-03-10T06:22:23.097 INFO:tasks.workunit.client.1.vm06.stdout:2/528: getdents da/d13/d5e 0
2026-03-10T06:22:23.098 INFO:tasks.workunit.client.1.vm06.stdout:2/529: write da/d13/d1c/d43/f7a [3399960,34851] 0
2026-03-10T06:22:23.106 INFO:tasks.workunit.client.1.vm06.stdout:2/530: mknod da/d13/d1c/d1d/d44/d53/ca5 0
2026-03-10T06:22:23.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:22 vm06.local ceph-mon[58974]: pgmap v10: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 33 MiB/s rd, 116 MiB/s wr, 237 op/s
2026-03-10T06:22:23.120 INFO:tasks.workunit.client.1.vm06.stdout:2/531: dwrite da/f19 [8388608,4194304] 0
2026-03-10T06:22:23.121 INFO:tasks.workunit.client.1.vm06.stdout:1/600: read d9/d62/f76 [3216772,83074] 0
2026-03-10T06:22:23.122 INFO:tasks.workunit.client.1.vm06.stdout:5/454: dread d8/db/d54/d8a/d74/f66 [0,4194304] 0
2026-03-10T06:22:23.123 INFO:tasks.workunit.client.1.vm06.stdout:2/532: fdatasync da/d13/d1c/d1d/d44/d53/f98 0
2026-03-10T06:22:23.126 INFO:tasks.workunit.client.1.vm06.stdout:5/455: readlink d8/db/d54/d8a/d74/l7d 0
2026-03-10T06:22:23.126 INFO:tasks.workunit.client.1.vm06.stdout:1/601: symlink d9/d35/d46/d38/laa 0
2026-03-10T06:22:23.127 INFO:tasks.workunit.client.1.vm06.stdout:2/533: chown da/d13/d1a/d39/f2f 55368264 1
2026-03-10T06:22:23.127 INFO:tasks.workunit.client.1.vm06.stdout:5/456: write d8/d9/f47 [999202,54307] 0
2026-03-10T06:22:23.133
INFO:tasks.workunit.client.1.vm06.stdout:2/534: rename da/d13/d1c/d1d/d44/d48 to da/d13/d1c/d43/da6 0 2026-03-10T06:22:23.138 INFO:tasks.workunit.client.1.vm06.stdout:1/602: creat d9/d35/d46/d38/fab x:0 0 0 2026-03-10T06:22:23.138 INFO:tasks.workunit.client.1.vm06.stdout:1/603: chown d9/f1a 43640298 1 2026-03-10T06:22:23.138 INFO:tasks.workunit.client.1.vm06.stdout:1/604: write d9/d35/d89/f4d [2740344,95313] 0 2026-03-10T06:22:23.139 INFO:tasks.workunit.client.1.vm06.stdout:5/457: rename d8/ld to d8/db/d54/d8a/d74/l92 0 2026-03-10T06:22:23.141 INFO:tasks.workunit.client.1.vm06.stdout:1/605: stat d9/d1b/l26 0 2026-03-10T06:22:23.145 INFO:tasks.workunit.client.1.vm06.stdout:1/606: creat d9/fac x:0 0 0 2026-03-10T06:22:23.147 INFO:tasks.workunit.client.1.vm06.stdout:1/607: unlink d9/d35/d46/d38/d63/c9d 0 2026-03-10T06:22:23.160 INFO:tasks.workunit.client.1.vm06.stdout:1/608: write d9/d1b/d20/d44/f54 [4803946,51561] 0 2026-03-10T06:22:23.164 INFO:tasks.workunit.client.1.vm06.stdout:5/458: sync 2026-03-10T06:22:23.176 INFO:tasks.workunit.client.1.vm06.stdout:1/609: read d9/f34 [1025466,97893] 0 2026-03-10T06:22:23.177 INFO:tasks.workunit.client.1.vm06.stdout:1/610: chown d9/d35/d46/d38/l61 244518 1 2026-03-10T06:22:23.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:22 vm04.local ceph-mon[51058]: pgmap v10: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 33 MiB/s rd, 116 MiB/s wr, 237 op/s 2026-03-10T06:22:23.177 INFO:tasks.workunit.client.1.vm06.stdout:1/611: read - d9/d35/f7e zero size 2026-03-10T06:22:23.179 INFO:tasks.workunit.client.1.vm06.stdout:1/612: mknod d9/d62/cad 0 2026-03-10T06:22:23.183 INFO:tasks.workunit.client.1.vm06.stdout:1/613: dwrite d9/d1b/d20/f8e [0,4194304] 0 2026-03-10T06:22:23.195 INFO:tasks.workunit.client.1.vm06.stdout:1/614: dread - d9/d62/fa3 zero size 2026-03-10T06:22:23.195 INFO:tasks.workunit.client.1.vm06.stdout:1/615: rename d9/d1b/f75 to d9/d35/d46/d38/fae 0 2026-03-10T06:22:23.195 
INFO:tasks.workunit.client.1.vm06.stdout:1/616: creat d9/d35/faf x:0 0 0 2026-03-10T06:22:23.195 INFO:tasks.workunit.client.1.vm06.stdout:1/617: truncate d9/f86 4991085 0 2026-03-10T06:22:23.195 INFO:tasks.workunit.client.1.vm06.stdout:1/618: mkdir d9/d35/d46/db0 0 2026-03-10T06:22:23.195 INFO:tasks.workunit.client.1.vm06.stdout:1/619: mknod d9/d1b/d20/d44/cb1 0 2026-03-10T06:22:23.208 INFO:tasks.workunit.client.1.vm06.stdout:2/535: dread da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:23.209 INFO:tasks.workunit.client.1.vm06.stdout:1/620: sync 2026-03-10T06:22:23.210 INFO:tasks.workunit.client.1.vm06.stdout:1/621: chown d9/d35/d89/f4f 54 1 2026-03-10T06:22:23.214 INFO:tasks.workunit.client.1.vm06.stdout:1/622: dwrite d9/d35/d89/f3d [0,4194304] 0 2026-03-10T06:22:23.267 INFO:tasks.workunit.client.1.vm06.stdout:3/570: truncate d6/d21/f58 2943424 0 2026-03-10T06:22:23.274 INFO:tasks.workunit.client.1.vm06.stdout:0/585: truncate d0/dd/f5b 3244822 0 2026-03-10T06:22:23.276 INFO:tasks.workunit.client.1.vm06.stdout:8/488: dwrite d1/f26 [0,4194304] 0 2026-03-10T06:22:23.277 INFO:tasks.workunit.client.1.vm06.stdout:0/586: sync 2026-03-10T06:22:23.277 INFO:tasks.workunit.client.1.vm06.stdout:8/489: chown d1/l8 40643 1 2026-03-10T06:22:23.278 INFO:tasks.workunit.client.1.vm06.stdout:0/587: chown d0/dd/d1c/c71 69 1 2026-03-10T06:22:23.278 INFO:tasks.workunit.client.1.vm06.stdout:3/571: dread d6/d21/d38/f6c [0,4194304] 0 2026-03-10T06:22:23.279 INFO:tasks.workunit.client.1.vm06.stdout:0/588: mkdir d0/d3c/d42/d5e/dbb 0 2026-03-10T06:22:23.284 INFO:tasks.workunit.client.1.vm06.stdout:8/490: dwrite d1/f4 [0,4194304] 0 2026-03-10T06:22:23.286 INFO:tasks.workunit.client.1.vm06.stdout:8/491: stat d1/d2c/l96 0 2026-03-10T06:22:23.289 INFO:tasks.workunit.client.1.vm06.stdout:8/492: mknod d1/df/d20/d21/d7e/ca6 0 2026-03-10T06:22:23.293 INFO:tasks.workunit.client.1.vm06.stdout:0/589: dwrite d0/f9 [0,4194304] 0 2026-03-10T06:22:23.296 INFO:tasks.workunit.client.1.vm06.stdout:0/590: 
creat d0/d3c/d42/d88/d98/fbc x:0 0 0 2026-03-10T06:22:23.306 INFO:tasks.workunit.client.1.vm06.stdout:4/565: dwrite dd/d33/f56 [0,4194304] 0 2026-03-10T06:22:23.310 INFO:tasks.workunit.client.1.vm06.stdout:4/566: dwrite dd/d24/d2d/d2f/d39/f4a [0,4194304] 0 2026-03-10T06:22:23.316 INFO:tasks.workunit.client.1.vm06.stdout:4/567: unlink dd/d41/ca3 0 2026-03-10T06:22:23.317 INFO:tasks.workunit.client.1.vm06.stdout:4/568: creat dd/d24/fa8 x:0 0 0 2026-03-10T06:22:23.318 INFO:tasks.workunit.client.1.vm06.stdout:4/569: write dd/d24/d5e/f67 [693418,88160] 0 2026-03-10T06:22:23.322 INFO:tasks.workunit.client.1.vm06.stdout:4/570: dwrite dd/d33/f37 [0,4194304] 0 2026-03-10T06:22:23.328 INFO:tasks.workunit.client.1.vm06.stdout:9/577: dread f11 [4194304,4194304] 0 2026-03-10T06:22:23.330 INFO:tasks.workunit.client.1.vm06.stdout:4/571: fdatasync dd/d24/d2d/d2f/d34/d40/f99 0 2026-03-10T06:22:23.339 INFO:tasks.workunit.client.1.vm06.stdout:9/578: dread d21/d32/d4d/f6b [0,4194304] 0 2026-03-10T06:22:23.339 INFO:tasks.workunit.client.1.vm06.stdout:4/572: dwrite dd/d18/f55 [4194304,4194304] 0 2026-03-10T06:22:23.340 INFO:tasks.workunit.client.1.vm06.stdout:9/579: write d21/d27/d50/d57/db2/faa [968855,125996] 0 2026-03-10T06:22:23.344 INFO:tasks.workunit.client.1.vm06.stdout:4/573: mkdir dd/d41/da9 0 2026-03-10T06:22:23.347 INFO:tasks.workunit.client.1.vm06.stdout:9/580: rename c13 to d21/d27/d50/d57/db2/d80/cc5 0 2026-03-10T06:22:23.348 INFO:tasks.workunit.client.1.vm06.stdout:4/574: mkdir dd/d18/d8e/daa 0 2026-03-10T06:22:23.349 INFO:tasks.workunit.client.1.vm06.stdout:9/581: creat d21/d27/d50/d57/db2/d7f/fc6 x:0 0 0 2026-03-10T06:22:23.349 INFO:tasks.workunit.client.1.vm06.stdout:4/575: mknod dd/d24/d2d/d2f/d34/d83/cab 0 2026-03-10T06:22:23.351 INFO:tasks.workunit.client.1.vm06.stdout:9/582: truncate d21/d27/f65 770037 0 2026-03-10T06:22:23.353 INFO:tasks.workunit.client.1.vm06.stdout:4/576: dwrite dd/d33/d36/f74 [0,4194304] 0 2026-03-10T06:22:23.354 
INFO:tasks.workunit.client.1.vm06.stdout:4/577: chown dd/d24/d2d/d2f/d34/d40/f89 33401085 1 2026-03-10T06:22:23.358 INFO:tasks.workunit.client.1.vm06.stdout:4/578: creat dd/d18/fac x:0 0 0 2026-03-10T06:22:23.358 INFO:tasks.workunit.client.1.vm06.stdout:4/579: write dd/d24/d5d/f9d [901997,85270] 0 2026-03-10T06:22:23.358 INFO:tasks.workunit.client.1.vm06.stdout:4/580: write dd/d33/f3f [329491,56031] 0 2026-03-10T06:22:23.362 INFO:tasks.workunit.client.1.vm06.stdout:4/581: dwrite dd/d24/f45 [0,4194304] 0 2026-03-10T06:22:23.371 INFO:tasks.workunit.client.1.vm06.stdout:4/582: read dd/d41/f52 [3664353,5586] 0 2026-03-10T06:22:23.374 INFO:tasks.workunit.client.1.vm06.stdout:4/583: getdents dd/d24/d5e 0 2026-03-10T06:22:23.374 INFO:tasks.workunit.client.1.vm06.stdout:4/584: read dd/d24/d5e/f6a [3486924,58790] 0 2026-03-10T06:22:23.375 INFO:tasks.workunit.client.1.vm06.stdout:4/585: chown dd/d24/d2d/d2f/d34/c63 5241 1 2026-03-10T06:22:23.378 INFO:tasks.workunit.client.1.vm06.stdout:9/583: sync 2026-03-10T06:22:23.381 INFO:tasks.workunit.client.1.vm06.stdout:9/584: dwrite d21/d27/d56/f74 [0,4194304] 0 2026-03-10T06:22:23.382 INFO:tasks.workunit.client.1.vm06.stdout:9/585: truncate d21/d27/f9a 540812 0 2026-03-10T06:22:23.382 INFO:tasks.workunit.client.1.vm06.stdout:4/586: dwrite dd/d24/d2d/f5a [0,4194304] 0 2026-03-10T06:22:23.390 INFO:tasks.workunit.client.1.vm06.stdout:4/587: creat dd/d24/d2d/d2f/d39/fad x:0 0 0 2026-03-10T06:22:23.390 INFO:tasks.workunit.client.1.vm06.stdout:9/586: write f14 [8213429,9864] 0 2026-03-10T06:22:23.392 INFO:tasks.workunit.client.1.vm06.stdout:4/588: chown dd/d24/d2d/d2f/d34/d40/f8a 3931573 1 2026-03-10T06:22:23.393 INFO:tasks.workunit.client.1.vm06.stdout:9/587: readlink d21/d27/d56/lab 0 2026-03-10T06:22:23.393 INFO:tasks.workunit.client.1.vm06.stdout:4/589: symlink dd/d33/d47/d97/lae 0 2026-03-10T06:22:23.394 INFO:tasks.workunit.client.1.vm06.stdout:9/588: write d21/d27/d50/d57/db2/d80/d95/fc4 [637443,112484] 0 2026-03-10T06:22:23.396 
INFO:tasks.workunit.client.1.vm06.stdout:4/590: creat dd/d24/d2d/d2f/d34/faf x:0 0 0 2026-03-10T06:22:23.396 INFO:tasks.workunit.client.1.vm06.stdout:4/591: fsync dd/d24/d2d/d2f/d34/d40/f89 0 2026-03-10T06:22:23.397 INFO:tasks.workunit.client.1.vm06.stdout:4/592: chown dd/d24/c2c 41 1 2026-03-10T06:22:23.398 INFO:tasks.workunit.client.1.vm06.stdout:9/589: mknod d21/d27/d50/d57/cc7 0 2026-03-10T06:22:23.399 INFO:tasks.workunit.client.1.vm06.stdout:4/593: mkdir dd/d24/d5e/db0 0 2026-03-10T06:22:23.399 INFO:tasks.workunit.client.1.vm06.stdout:4/594: read dd/d24/f45 [410905,73985] 0 2026-03-10T06:22:23.400 INFO:tasks.workunit.client.1.vm06.stdout:4/595: rename dd/d33/d47/d97 to dd/d33/d47/d97/db1 22 2026-03-10T06:22:23.402 INFO:tasks.workunit.client.1.vm06.stdout:9/590: symlink d21/d32/d4d/lc8 0 2026-03-10T06:22:23.404 INFO:tasks.workunit.client.1.vm06.stdout:4/596: rename dd/d33/c50 to dd/d72/cb2 0 2026-03-10T06:22:23.416 INFO:tasks.workunit.client.1.vm06.stdout:9/591: creat d21/da2/dc2/fc9 x:0 0 0 2026-03-10T06:22:23.419 INFO:tasks.workunit.client.1.vm06.stdout:9/592: unlink d21/d46/c5f 0 2026-03-10T06:22:23.449 INFO:tasks.workunit.client.1.vm06.stdout:6/625: dread d6/df/d70/fa6 [0,4194304] 0 2026-03-10T06:22:23.455 INFO:tasks.workunit.client.1.vm06.stdout:6/626: rmdir d6/d79/d95/db4/dbe 39 2026-03-10T06:22:23.458 INFO:tasks.workunit.client.1.vm06.stdout:6/627: rename d6/d79/d95/db4/dbe to d6/dd/d25/d33/d5a/dd8 0 2026-03-10T06:22:23.460 INFO:tasks.workunit.client.1.vm06.stdout:6/628: creat d6/df/d70/fd9 x:0 0 0 2026-03-10T06:22:23.461 INFO:tasks.workunit.client.1.vm06.stdout:6/629: write f3 [276929,34325] 0 2026-03-10T06:22:23.463 INFO:tasks.workunit.client.1.vm06.stdout:6/630: symlink d6/dd/d25/d33/d5a/lda 0 2026-03-10T06:22:23.465 INFO:tasks.workunit.client.1.vm06.stdout:6/631: symlink d6/dd/d25/d33/d5a/d78/dd0/ldb 0 2026-03-10T06:22:23.475 INFO:tasks.workunit.client.1.vm06.stdout:6/632: dread d6/dd/d25/d33/f5d [0,4194304] 0 2026-03-10T06:22:23.477 
INFO:tasks.workunit.client.1.vm06.stdout:6/633: rename d6/dd/d25/d2c/cad to d6/dd/d25/d33/d5a/d78/dd0/dc5/cdc 0 2026-03-10T06:22:23.499 INFO:tasks.workunit.client.1.vm06.stdout:7/644: truncate d19/d3b/d41/d72/fb9 374768 0 2026-03-10T06:22:23.521 INFO:tasks.workunit.client.1.vm06.stdout:7/645: dread d19/d3b/d5b/fa4 [0,4194304] 0 2026-03-10T06:22:23.523 INFO:tasks.workunit.client.1.vm06.stdout:7/646: symlink d19/d3b/d41/d42/d52/d9f/dc2/ldb 0 2026-03-10T06:22:23.523 INFO:tasks.workunit.client.1.vm06.stdout:7/647: write f15 [1882590,44308] 0 2026-03-10T06:22:23.551 INFO:tasks.workunit.client.1.vm06.stdout:5/459: fdatasync d8/d9/f14 0 2026-03-10T06:22:23.558 INFO:tasks.workunit.client.1.vm06.stdout:5/460: dwrite d8/db/d54/d8a/d74/f42 [0,4194304] 0 2026-03-10T06:22:23.565 INFO:tasks.workunit.client.1.vm06.stdout:5/461: symlink d8/db/d54/d67/d46/d6e/l93 0 2026-03-10T06:22:23.574 INFO:tasks.workunit.client.1.vm06.stdout:5/462: mknod d8/db/d54/d67/d46/c94 0 2026-03-10T06:22:23.575 INFO:tasks.workunit.client.1.vm06.stdout:1/623: write d9/d1b/f1d [2710999,124602] 0 2026-03-10T06:22:23.579 INFO:tasks.workunit.client.1.vm06.stdout:2/536: dwrite da/d13/d1a/d39/f3c [0,4194304] 0 2026-03-10T06:22:23.582 INFO:tasks.workunit.client.1.vm06.stdout:1/624: dwrite d9/d1b/d20/d44/fa2 [0,4194304] 0 2026-03-10T06:22:23.589 INFO:tasks.workunit.client.1.vm06.stdout:5/463: dread d8/db/d57/f5f [0,4194304] 0 2026-03-10T06:22:23.613 INFO:tasks.workunit.client.1.vm06.stdout:3/572: dwrite d6/f5c [0,4194304] 0 2026-03-10T06:22:23.619 INFO:tasks.workunit.client.1.vm06.stdout:0/591: write d0/dd/d1b/f72 [967362,40613] 0 2026-03-10T06:22:23.620 INFO:tasks.workunit.client.1.vm06.stdout:8/493: dwrite d1/df/d11/f48 [0,4194304] 0 2026-03-10T06:22:23.623 INFO:tasks.workunit.client.1.vm06.stdout:8/494: truncate d1/d2c/d5b/f7c 330226 0 2026-03-10T06:22:23.632 INFO:tasks.workunit.client.1.vm06.stdout:2/537: creat da/fa7 x:0 0 0 2026-03-10T06:22:23.639 INFO:tasks.workunit.client.1.vm06.stdout:0/592: symlink 
d0/dd/lbd 0 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:3/573: creat d6/dc/d13/fc1 x:0 0 0 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:3/574: chown d6/d21/d38/caf 1232 1 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:8/495: rmdir d1/d3b/d9e 0 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:3/575: truncate d6/dc/f3f 365418 0 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:0/593: dwrite d0/d3c/d42/d88/fa0 [0,4194304] 0 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:8/496: rmdir d1/df/d20/d21/d7e 39 2026-03-10T06:22:23.660 INFO:tasks.workunit.client.1.vm06.stdout:3/576: unlink d6/d21/d38/f50 0 2026-03-10T06:22:23.665 INFO:tasks.workunit.client.1.vm06.stdout:0/594: getdents d0/dd/d14/d1d/d73 0 2026-03-10T06:22:23.669 INFO:tasks.workunit.client.1.vm06.stdout:0/595: creat d0/d3c/d42/fbe x:0 0 0 2026-03-10T06:22:23.674 INFO:tasks.workunit.client.1.vm06.stdout:3/577: dwrite d6/d21/d38/d39/f89 [0,4194304] 0 2026-03-10T06:22:23.674 INFO:tasks.workunit.client.1.vm06.stdout:8/497: dwrite d1/df/d20/d21/f37 [4194304,4194304] 0 2026-03-10T06:22:23.682 INFO:tasks.workunit.client.1.vm06.stdout:0/596: dwrite d0/dd/f32 [4194304,4194304] 0 2026-03-10T06:22:23.684 INFO:tasks.workunit.client.1.vm06.stdout:4/597: rmdir dd 39 2026-03-10T06:22:23.685 INFO:tasks.workunit.client.1.vm06.stdout:3/578: readlink d6/d21/d38/l3e 0 2026-03-10T06:22:23.686 INFO:tasks.workunit.client.1.vm06.stdout:4/598: chown f8 0 1 2026-03-10T06:22:23.694 INFO:tasks.workunit.client.1.vm06.stdout:8/498: dwrite d1/d3b/d5c/f7a [0,4194304] 0 2026-03-10T06:22:23.702 INFO:tasks.workunit.client.1.vm06.stdout:0/597: fsync d0/d3c/d42/f60 0 2026-03-10T06:22:23.704 INFO:tasks.workunit.client.1.vm06.stdout:0/598: write d0/dd/d14/f70 [1615257,122111] 0 2026-03-10T06:22:23.707 INFO:tasks.workunit.client.1.vm06.stdout:3/579: link d6/d21/f99 d6/d1a/d5b/dbd/fc2 0 2026-03-10T06:22:23.710 
INFO:tasks.workunit.client.1.vm06.stdout:3/580: mknod d6/d8/d7f/da1/cc3 0 2026-03-10T06:22:23.712 INFO:tasks.workunit.client.1.vm06.stdout:3/581: creat d6/d21/dbc/fc4 x:0 0 0 2026-03-10T06:22:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/499: dwrite f0 [0,4194304] 0 2026-03-10T06:22:23.723 INFO:tasks.workunit.client.1.vm06.stdout:3/582: dread d6/fab [0,4194304] 0 2026-03-10T06:22:23.726 INFO:tasks.workunit.client.1.vm06.stdout:8/500: dwrite d1/df/d20/f63 [0,4194304] 0 2026-03-10T06:22:23.727 INFO:tasks.workunit.client.1.vm06.stdout:4/599: sync 2026-03-10T06:22:23.731 INFO:tasks.workunit.client.1.vm06.stdout:3/583: mknod d6/d21/d38/d39/cc5 0 2026-03-10T06:22:23.733 INFO:tasks.workunit.client.1.vm06.stdout:8/501: creat d1/d7/fa7 x:0 0 0 2026-03-10T06:22:23.737 INFO:tasks.workunit.client.1.vm06.stdout:3/584: truncate d6/d21/f55 95685 0 2026-03-10T06:22:23.738 INFO:tasks.workunit.client.1.vm06.stdout:3/585: rename d6/d1a to d6/d1a/d5b/dc6 22 2026-03-10T06:22:23.738 INFO:tasks.workunit.client.1.vm06.stdout:4/600: dread - dd/d24/d2d/d2f/d39/f62 zero size 2026-03-10T06:22:23.739 INFO:tasks.workunit.client.1.vm06.stdout:3/586: write d6/dc/d13/fba [272113,121741] 0 2026-03-10T06:22:23.739 INFO:tasks.workunit.client.1.vm06.stdout:3/587: chown d6/l28 607207 1 2026-03-10T06:22:23.740 INFO:tasks.workunit.client.1.vm06.stdout:3/588: chown d6/dc/d13/f8d 20 1 2026-03-10T06:22:23.742 INFO:tasks.workunit.client.1.vm06.stdout:4/601: mknod dd/d24/cb3 0 2026-03-10T06:22:23.745 INFO:tasks.workunit.client.1.vm06.stdout:4/602: stat dd/d24/d2d/d2f 0 2026-03-10T06:22:23.745 INFO:tasks.workunit.client.1.vm06.stdout:0/599: dread d0/dd/f48 [0,4194304] 0 2026-03-10T06:22:23.745 INFO:tasks.workunit.client.1.vm06.stdout:4/603: truncate dd/d24/d5e/f67 936011 0 2026-03-10T06:22:23.745 INFO:tasks.workunit.client.1.vm06.stdout:0/600: chown d0/dd/d14/d18/d66/l79 3958 1 2026-03-10T06:22:23.746 INFO:tasks.workunit.client.1.vm06.stdout:0/601: truncate d0/dd/d1c/da2/fb9 891242 0 
2026-03-10T06:22:23.748 INFO:tasks.workunit.client.1.vm06.stdout:8/502: dwrite d1/df/d11/da1/fa2 [0,4194304] 0 2026-03-10T06:22:23.755 INFO:tasks.workunit.client.1.vm06.stdout:4/604: creat dd/d18/fb4 x:0 0 0 2026-03-10T06:22:23.758 INFO:tasks.workunit.client.1.vm06.stdout:8/503: dwrite d1/df/d11/f81 [0,4194304] 0 2026-03-10T06:22:23.760 INFO:tasks.workunit.client.1.vm06.stdout:8/504: truncate d1/f89 701443 0 2026-03-10T06:22:23.760 INFO:tasks.workunit.client.1.vm06.stdout:8/505: fsync d1/f1c 0 2026-03-10T06:22:23.769 INFO:tasks.workunit.client.1.vm06.stdout:4/605: read dd/d24/d5e/f6a [791829,76164] 0 2026-03-10T06:22:23.787 INFO:tasks.workunit.client.1.vm06.stdout:8/506: symlink d1/d2c/d5b/la8 0 2026-03-10T06:22:23.790 INFO:tasks.workunit.client.1.vm06.stdout:8/507: chown d1/d2c/l33 0 1 2026-03-10T06:22:23.799 INFO:tasks.workunit.client.1.vm06.stdout:4/606: unlink dd/d33/f56 0 2026-03-10T06:22:23.807 INFO:tasks.workunit.client.1.vm06.stdout:4/607: truncate dd/d24/d2d/d2f/d34/d40/f89 472552 0 2026-03-10T06:22:23.807 INFO:tasks.workunit.client.1.vm06.stdout:8/508: mkdir d1/d3b/da9 0 2026-03-10T06:22:23.807 INFO:tasks.workunit.client.1.vm06.stdout:8/509: truncate d1/d7/fd 1532720 0 2026-03-10T06:22:23.812 INFO:tasks.workunit.client.1.vm06.stdout:8/510: dread d1/df/d11/da1/fa2 [0,4194304] 0 2026-03-10T06:22:23.818 INFO:tasks.workunit.client.1.vm06.stdout:4/608: write dd/d24/f45 [4057882,81099] 0 2026-03-10T06:22:23.820 INFO:tasks.workunit.client.1.vm06.stdout:8/511: mknod d1/df/d58/caa 0 2026-03-10T06:22:23.824 INFO:tasks.workunit.client.1.vm06.stdout:4/609: chown dd/d18/f32 0 1 2026-03-10T06:22:23.830 INFO:tasks.workunit.client.1.vm06.stdout:4/610: link dd/d18/l54 dd/d18/lb5 0 2026-03-10T06:22:23.842 INFO:tasks.workunit.client.1.vm06.stdout:9/593: truncate d21/d32/f52 1763239 0 2026-03-10T06:22:23.843 INFO:tasks.workunit.client.1.vm06.stdout:6/634: write f3 [5128043,47166] 0 2026-03-10T06:22:23.844 INFO:tasks.workunit.client.1.vm06.stdout:6/635: chown 
d6/dd/d25/d2c/c61 5155822 1 2026-03-10T06:22:23.848 INFO:tasks.workunit.client.1.vm06.stdout:6/636: creat d6/d7/fdd x:0 0 0 2026-03-10T06:22:23.853 INFO:tasks.workunit.client.1.vm06.stdout:8/512: fdatasync d1/f1b 0 2026-03-10T06:22:23.853 INFO:tasks.workunit.client.1.vm06.stdout:6/637: getdents d6/dd/d25/d33 0 2026-03-10T06:22:23.856 INFO:tasks.workunit.client.1.vm06.stdout:6/638: unlink d6/dd/d25/d33/l91 0 2026-03-10T06:22:23.858 INFO:tasks.workunit.client.1.vm06.stdout:8/513: mkdir d1/d3b/da9/dab 0 2026-03-10T06:22:23.859 INFO:tasks.workunit.client.1.vm06.stdout:6/639: symlink d6/dd/lde 0 2026-03-10T06:22:23.859 INFO:tasks.workunit.client.1.vm06.stdout:6/640: chown d6/dd/cba 4 1 2026-03-10T06:22:23.861 INFO:tasks.workunit.client.1.vm06.stdout:6/641: unlink f3 0 2026-03-10T06:22:23.864 INFO:tasks.workunit.client.1.vm06.stdout:6/642: truncate d6/df/d40/d99/fb5 761314 0 2026-03-10T06:22:23.867 INFO:tasks.workunit.client.1.vm06.stdout:8/514: dwrite d1/df/d20/d21/f69 [0,4194304] 0 2026-03-10T06:22:23.868 INFO:tasks.workunit.client.1.vm06.stdout:6/643: dread d6/dd/d25/fa0 [0,4194304] 0 2026-03-10T06:22:23.874 INFO:tasks.workunit.client.1.vm06.stdout:8/515: unlink d1/df/f61 0 2026-03-10T06:22:23.912 INFO:tasks.workunit.client.1.vm06.stdout:7/648: truncate d19/d3b/d41/f66 2731373 0 2026-03-10T06:22:23.913 INFO:tasks.workunit.client.1.vm06.stdout:7/649: write d19/d3b/d41/d42/d62/d80/d82/fae [984037,130835] 0 2026-03-10T06:22:23.914 INFO:tasks.workunit.client.1.vm06.stdout:7/650: chown d19/d3b/c40 56 1 2026-03-10T06:22:23.917 INFO:tasks.workunit.client.1.vm06.stdout:7/651: creat d19/d3b/d41/da9/daa/fdc x:0 0 0 2026-03-10T06:22:23.919 INFO:tasks.workunit.client.1.vm06.stdout:7/652: mkdir d19/db0/ddd 0 2026-03-10T06:22:23.939 INFO:tasks.workunit.client.1.vm06.stdout:1/625: truncate d9/d35/d46/d38/d8c/f9a 1874967 0 2026-03-10T06:22:23.940 INFO:tasks.workunit.client.1.vm06.stdout:5/464: truncate d8/db/d54/d8a/d74/f62 3989040 0 2026-03-10T06:22:23.941 
INFO:tasks.workunit.client.1.vm06.stdout:5/465: write d8/db/d54/d55/f87 [949514,32922] 0 2026-03-10T06:22:23.944 INFO:tasks.workunit.client.1.vm06.stdout:1/626: creat d9/d35/d46/d38/d63/d83/fb2 x:0 0 0 2026-03-10T06:22:23.949 INFO:tasks.workunit.client.1.vm06.stdout:2/538: dwrite da/d13/d1c/d43/d6e/f77 [0,4194304] 0 2026-03-10T06:22:23.951 INFO:tasks.workunit.client.1.vm06.stdout:1/627: mkdir d9/d1b/d20/db3 0 2026-03-10T06:22:23.953 INFO:tasks.workunit.client.1.vm06.stdout:3/589: rmdir d6/d8 39 2026-03-10T06:22:23.953 INFO:tasks.workunit.client.1.vm06.stdout:0/602: dread d0/dd/f5b [0,4194304] 0 2026-03-10T06:22:23.954 INFO:tasks.workunit.client.1.vm06.stdout:0/603: chown d0/dd/d14/d18/d7e 13 1 2026-03-10T06:22:23.957 INFO:tasks.workunit.client.1.vm06.stdout:2/539: rmdir da/d13/d5e 39 2026-03-10T06:22:23.960 INFO:tasks.workunit.client.1.vm06.stdout:3/590: rmdir d6/d21/d38/d39 39 2026-03-10T06:22:23.966 INFO:tasks.workunit.client.1.vm06.stdout:2/540: readlink da/d13/d1a/l5c 0 2026-03-10T06:22:23.967 INFO:tasks.workunit.client.1.vm06.stdout:3/591: chown d6/dc/d13/d51/fb1 5521688 1 2026-03-10T06:22:23.967 INFO:tasks.workunit.client.1.vm06.stdout:0/604: mknod d0/dd/d1c/cbf 0 2026-03-10T06:22:23.968 INFO:tasks.workunit.client.1.vm06.stdout:0/605: chown d0/dd/d1b/d3d/d50/cb0 296776 1 2026-03-10T06:22:23.972 INFO:tasks.workunit.client.1.vm06.stdout:1/628: link d9/d35/d89/l28 d9/d35/d46/lb4 0 2026-03-10T06:22:23.973 INFO:tasks.workunit.client.1.vm06.stdout:0/606: fdatasync d0/d3c/d42/f54 0 2026-03-10T06:22:23.976 INFO:tasks.workunit.client.1.vm06.stdout:0/607: dread d0/d3c/d42/d88/d47/d4d/f57 [0,4194304] 0 2026-03-10T06:22:23.979 INFO:tasks.workunit.client.1.vm06.stdout:0/608: creat d0/d3c/d42/dab/fc0 x:0 0 0 2026-03-10T06:22:23.985 INFO:tasks.workunit.client.1.vm06.stdout:0/609: write d0/fa [3311060,43470] 0 2026-03-10T06:22:23.985 INFO:tasks.workunit.client.1.vm06.stdout:3/592: sync 2026-03-10T06:22:23.988 INFO:tasks.workunit.client.1.vm06.stdout:0/610: chown 
d0/d3c/d42/d88/d98/f9a 3810460 1 2026-03-10T06:22:23.989 INFO:tasks.workunit.client.1.vm06.stdout:0/611: fdatasync d0/dd/f67 0 2026-03-10T06:22:24.026 INFO:tasks.workunit.client.1.vm06.stdout:3/593: dread d6/d8/d7f/f8c [0,4194304] 0 2026-03-10T06:22:24.029 INFO:tasks.workunit.client.1.vm06.stdout:3/594: mkdir d6/d21/d38/d39/d90/dc7 0 2026-03-10T06:22:24.031 INFO:tasks.workunit.client.1.vm06.stdout:3/595: dread d6/d4f/fbe [0,4194304] 0 2026-03-10T06:22:24.032 INFO:tasks.workunit.client.1.vm06.stdout:3/596: write d6/d8/f49 [1576684,56022] 0 2026-03-10T06:22:24.035 INFO:tasks.workunit.client.1.vm06.stdout:3/597: unlink d6/dc/d13/d51/fa9 0 2026-03-10T06:22:24.035 INFO:tasks.workunit.client.1.vm06.stdout:3/598: fdatasync d6/d21/fa4 0 2026-03-10T06:22:24.036 INFO:tasks.workunit.client.1.vm06.stdout:3/599: dread - d6/d1a/fb9 zero size 2026-03-10T06:22:24.037 INFO:tasks.workunit.client.1.vm06.stdout:3/600: truncate d6/dc/d13/d9d/d54/fb8 956072 0 2026-03-10T06:22:24.037 INFO:tasks.workunit.client.1.vm06.stdout:3/601: chown d6/d21/d38/d88/dae 42983 1 2026-03-10T06:22:24.039 INFO:tasks.workunit.client.1.vm06.stdout:3/602: mknod d6/d21/d38/d39/d90/dc7/cc8 0 2026-03-10T06:22:24.039 INFO:tasks.workunit.client.1.vm06.stdout:3/603: truncate d6/dc/f69 4355510 0 2026-03-10T06:22:24.042 INFO:tasks.workunit.client.1.vm06.stdout:3/604: creat d6/d8/d7f/fc9 x:0 0 0 2026-03-10T06:22:24.047 INFO:tasks.workunit.client.1.vm06.stdout:3/605: sync 2026-03-10T06:22:24.048 INFO:tasks.workunit.client.1.vm06.stdout:3/606: chown d6/d21/dbc 423937962 1 2026-03-10T06:22:24.050 INFO:tasks.workunit.client.1.vm06.stdout:3/607: creat d6/dc/d13/fca x:0 0 0 2026-03-10T06:22:24.051 INFO:tasks.workunit.client.1.vm06.stdout:3/608: creat d6/d21/d38/d39/d90/dc7/fcb x:0 0 0 2026-03-10T06:22:24.055 INFO:tasks.workunit.client.1.vm06.stdout:3/609: dwrite d6/d21/fb0 [0,4194304] 0 2026-03-10T06:22:24.057 INFO:tasks.workunit.client.1.vm06.stdout:3/610: write d6/dc/f1d [3553077,130874] 0 2026-03-10T06:22:24.090 
INFO:tasks.workunit.client.1.vm06.stdout:7/653: rename d19/d3b/d41/da9/daa to d19/d3b/dde 0 2026-03-10T06:22:24.090 INFO:tasks.workunit.client.1.vm06.stdout:7/654: chown f15 95177293 1 2026-03-10T06:22:24.096 INFO:tasks.workunit.client.1.vm06.stdout:8/516: dwrite d1/f3a [0,4194304] 0 2026-03-10T06:22:24.097 INFO:tasks.workunit.client.1.vm06.stdout:6/644: mknod d6/dd/d25/d33/d5a/d78/dd0/dc5/cdf 0 2026-03-10T06:22:24.101 INFO:tasks.workunit.client.1.vm06.stdout:5/466: rename d8/db/d54/d67/d46/c94 to d8/db/d54/d55/d80/c95 0 2026-03-10T06:22:24.106 INFO:tasks.workunit.client.1.vm06.stdout:4/611: dread dd/d24/f3d [0,4194304] 0 2026-03-10T06:22:24.107 INFO:tasks.workunit.client.1.vm06.stdout:1/629: rmdir d9/d1b/d20 39 2026-03-10T06:22:24.110 INFO:tasks.workunit.client.1.vm06.stdout:4/612: rmdir dd/d24/d5e 39 2026-03-10T06:22:24.111 INFO:tasks.workunit.client.1.vm06.stdout:1/630: creat d9/d35/d46/d38/d63/d83/d93/fb5 x:0 0 0 2026-03-10T06:22:24.117 INFO:tasks.workunit.client.1.vm06.stdout:1/631: dwrite d9/d62/fa3 [0,4194304] 0 2026-03-10T06:22:24.118 INFO:tasks.workunit.client.1.vm06.stdout:8/517: unlink d1/df/d20/d21/d7e/d8d/l93 0 2026-03-10T06:22:24.121 INFO:tasks.workunit.client.1.vm06.stdout:5/467: link d8/db/d54/f88 d8/db/d54/d55/d80/f96 0 2026-03-10T06:22:24.126 INFO:tasks.workunit.client.1.vm06.stdout:4/613: readlink dd/d18/l54 0 2026-03-10T06:22:24.127 INFO:tasks.workunit.client.1.vm06.stdout:1/632: symlink d9/d35/d46/lb6 0 2026-03-10T06:22:24.129 INFO:tasks.workunit.client.1.vm06.stdout:1/633: truncate d9/d35/d46/d38/d63/d83/fb2 471371 0 2026-03-10T06:22:24.129 INFO:tasks.workunit.client.1.vm06.stdout:8/518: mkdir d1/df/d20/d35/dac 0 2026-03-10T06:22:24.130 INFO:tasks.workunit.client.1.vm06.stdout:2/541: rename da/d13/d8d to da/da8 0 2026-03-10T06:22:24.133 INFO:tasks.workunit.client.1.vm06.stdout:2/542: stat da/d13/f5b 0 2026-03-10T06:22:24.136 INFO:tasks.workunit.client.1.vm06.stdout:0/612: rename d0/dd/d1b to d0/d3c/dc1 0 2026-03-10T06:22:24.138 
INFO:tasks.workunit.client.1.vm06.stdout:2/543: rmdir da/d13/d1c/d1d/d44 39 2026-03-10T06:22:24.139 INFO:tasks.workunit.client.1.vm06.stdout:8/519: truncate d1/df/d20/d21/d7e/d8d/f9c 1125252 0 2026-03-10T06:22:24.140 INFO:tasks.workunit.client.1.vm06.stdout:8/520: write d1/df/d20/f64 [589272,80929] 0 2026-03-10T06:22:24.141 INFO:tasks.workunit.client.1.vm06.stdout:3/611: rename d6/d4f/fbe to d6/dc/d13/d9d/d54/fcc 0 2026-03-10T06:22:24.141 INFO:tasks.workunit.client.1.vm06.stdout:7/655: rename d19/d3b to d19/d3b/d41/d42/d52/d9f/ddf 22 2026-03-10T06:22:24.143 INFO:tasks.workunit.client.1.vm06.stdout:3/612: write d6/dc/d41/f82 [5105906,77723] 0 2026-03-10T06:22:24.143 INFO:tasks.workunit.client.1.vm06.stdout:8/521: chown d1/df/d20/fe 2080800 1 2026-03-10T06:22:24.144 INFO:tasks.workunit.client.1.vm06.stdout:7/656: write d19/d3b/dde/fdc [155086,21562] 0 2026-03-10T06:22:24.151 INFO:tasks.workunit.client.1.vm06.stdout:1/634: mknod d9/d1b/d20/cb7 0 2026-03-10T06:22:24.159 INFO:tasks.workunit.client.1.vm06.stdout:0/613: dwrite d0/d3c/d42/d88/fae [0,4194304] 0 2026-03-10T06:22:24.169 INFO:tasks.workunit.client.1.vm06.stdout:2/544: dwrite da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:24.170 INFO:tasks.workunit.client.1.vm06.stdout:0/614: read d0/d3c/d42/f54 [1211541,98324] 0 2026-03-10T06:22:24.173 INFO:tasks.workunit.client.1.vm06.stdout:9/594: dwrite d21/d32/f52 [0,4194304] 0 2026-03-10T06:22:24.184 INFO:tasks.workunit.client.1.vm06.stdout:4/614: dread dd/d18/f5f [0,4194304] 0 2026-03-10T06:22:24.186 INFO:tasks.workunit.client.1.vm06.stdout:4/615: fdatasync dd/d24/d2d/d2f/d39/d71/f90 0 2026-03-10T06:22:24.188 INFO:tasks.workunit.client.1.vm06.stdout:4/616: write dd/d24/f45 [4496618,87518] 0 2026-03-10T06:22:24.194 INFO:tasks.workunit.client.1.vm06.stdout:6/645: rename d6/d7/f36 to d6/dd/d25/d33/d5a/dae/fe0 0 2026-03-10T06:22:24.195 INFO:tasks.workunit.client.1.vm06.stdout:7/657: mkdir d19/d3b/d41/d72/de0 0 2026-03-10T06:22:24.196 
INFO:tasks.workunit.client.1.vm06.stdout:8/522: symlink d1/df/d20/d21/d5e/d79/lad 0 2026-03-10T06:22:24.197 INFO:tasks.workunit.client.1.vm06.stdout:8/523: read - d1/d7/fa7 zero size 2026-03-10T06:22:24.200 INFO:tasks.workunit.client.1.vm06.stdout:1/635: mkdir d9/d62/db8 0 2026-03-10T06:22:24.200 INFO:tasks.workunit.client.1.vm06.stdout:7/658: write d19/d3b/d41/d42/d62/d80/d82/fae [1131770,121252] 0 2026-03-10T06:22:24.201 INFO:tasks.workunit.client.1.vm06.stdout:7/659: dread - d19/d3b/d41/d4c/fcf zero size 2026-03-10T06:22:24.202 INFO:tasks.workunit.client.1.vm06.stdout:1/636: write d9/d35/d46/d38/d63/f80 [4840575,20681] 0 2026-03-10T06:22:24.209 INFO:tasks.workunit.client.1.vm06.stdout:2/545: mknod da/da8/ca9 0 2026-03-10T06:22:24.210 INFO:tasks.workunit.client.1.vm06.stdout:4/617: dwrite dd/d33/d36/f8d [0,4194304] 0 2026-03-10T06:22:24.218 INFO:tasks.workunit.client.1.vm06.stdout:5/468: rename d8/db/d54/d67/d46/f77 to d8/db/d57/f97 0 2026-03-10T06:22:24.219 INFO:tasks.workunit.client.1.vm06.stdout:3/613: mknod d6/dc/d13/ccd 0 2026-03-10T06:22:24.220 INFO:tasks.workunit.client.1.vm06.stdout:6/646: creat d6/dd/d25/d33/d5a/d78/dd0/dc5/fe1 x:0 0 0 2026-03-10T06:22:24.221 INFO:tasks.workunit.client.1.vm06.stdout:1/637: dwrite d9/d35/d46/d38/d63/f84 [0,4194304] 0 2026-03-10T06:22:24.221 INFO:tasks.workunit.client.1.vm06.stdout:7/660: stat d19/c59 0 2026-03-10T06:22:24.227 INFO:tasks.workunit.client.1.vm06.stdout:7/661: readlink d19/d3b/d41/d4c/la7 0 2026-03-10T06:22:24.231 INFO:tasks.workunit.client.1.vm06.stdout:4/618: dwrite dd/d33/d47/f93 [0,4194304] 0 2026-03-10T06:22:24.235 INFO:tasks.workunit.client.1.vm06.stdout:9/595: rename d21/d32/d4d/d51/fbe to d21/da2/da7/fca 0 2026-03-10T06:22:24.238 INFO:tasks.workunit.client.1.vm06.stdout:9/596: write d21/d32/d4d/fb4 [449287,96324] 0 2026-03-10T06:22:24.238 INFO:tasks.workunit.client.1.vm06.stdout:7/662: write d19/d3b/d41/d4c/f55 [3612139,60096] 0 2026-03-10T06:22:24.242 INFO:tasks.workunit.client.1.vm06.stdout:7/663: 
chown d19/d3b/d5b/f69 1 1 2026-03-10T06:22:24.242 INFO:tasks.workunit.client.1.vm06.stdout:4/619: mkdir dd/d33/d47/d97/db6 0 2026-03-10T06:22:24.242 INFO:tasks.workunit.client.1.vm06.stdout:9/597: write d21/da2/dc2/fc9 [548067,60861] 0 2026-03-10T06:22:24.242 INFO:tasks.workunit.client.1.vm06.stdout:5/469: dwrite d8/db/d54/d8a/d39/d72/f8b [0,4194304] 0 2026-03-10T06:22:24.244 INFO:tasks.workunit.client.1.vm06.stdout:8/524: dread d1/df/d58/f6a [0,4194304] 0 2026-03-10T06:22:24.252 INFO:tasks.workunit.client.1.vm06.stdout:4/620: read dd/d18/f1f [3616409,120745] 0 2026-03-10T06:22:24.267 INFO:tasks.workunit.client.1.vm06.stdout:2/546: rename da/d13/d1c/d1d/d44/d53/l97 to da/d13/d1c/d43/d6e/d9b/laa 0 2026-03-10T06:22:24.268 INFO:tasks.workunit.client.1.vm06.stdout:8/525: mknod d1/df/d11/cae 0 2026-03-10T06:22:24.269 INFO:tasks.workunit.client.1.vm06.stdout:5/470: readlink d8/l1b 0 2026-03-10T06:22:24.271 INFO:tasks.workunit.client.1.vm06.stdout:5/471: write d8/d9/f47 [267762,24492] 0 2026-03-10T06:22:24.278 INFO:tasks.workunit.client.1.vm06.stdout:8/526: fsync d1/d7/f4f 0 2026-03-10T06:22:24.283 INFO:tasks.workunit.client.1.vm06.stdout:9/598: dwrite d21/f49 [0,4194304] 0 2026-03-10T06:22:24.294 INFO:tasks.workunit.client.1.vm06.stdout:6/647: dread d6/dd/f5b [0,4194304] 0 2026-03-10T06:22:24.301 INFO:tasks.workunit.client.1.vm06.stdout:6/648: truncate d6/dd/d25/d2c/f9c 5051523 0 2026-03-10T06:22:24.301 INFO:tasks.workunit.client.1.vm06.stdout:6/649: dread - d6/d79/d95/db4/fbd zero size 2026-03-10T06:22:24.302 INFO:tasks.workunit.client.1.vm06.stdout:2/547: sync 2026-03-10T06:22:24.302 INFO:tasks.workunit.client.1.vm06.stdout:0/615: write d0/d3c/d42/d88/d47/d4d/f81 [1025966,59126] 0 2026-03-10T06:22:24.308 INFO:tasks.workunit.client.1.vm06.stdout:3/614: getdents d6 0 2026-03-10T06:22:24.309 INFO:tasks.workunit.client.1.vm06.stdout:8/527: symlink d1/df/d20/laf 0 2026-03-10T06:22:24.312 INFO:tasks.workunit.client.1.vm06.stdout:9/599: chown d21/d27/d50/d57/db2/d80/cc5 
124272 1 2026-03-10T06:22:24.313 INFO:tasks.workunit.client.1.vm06.stdout:7/664: creat d19/d3b/fe1 x:0 0 0 2026-03-10T06:22:24.317 INFO:tasks.workunit.client.1.vm06.stdout:7/665: dwrite d19/d3b/d41/d42/f6d [0,4194304] 0 2026-03-10T06:22:24.319 INFO:tasks.workunit.client.1.vm06.stdout:7/666: truncate d19/f99 2064193 0 2026-03-10T06:22:24.321 INFO:tasks.workunit.client.1.vm06.stdout:2/548: symlink da/d13/d1c/d1d/d44/d46/lab 0 2026-03-10T06:22:24.321 INFO:tasks.workunit.client.1.vm06.stdout:7/667: readlink d19/d3b/l50 0 2026-03-10T06:22:24.327 INFO:tasks.workunit.client.1.vm06.stdout:8/528: symlink d1/d3b/da9/lb0 0 2026-03-10T06:22:24.327 INFO:tasks.workunit.client.1.vm06.stdout:9/600: mkdir d21/d32/d4d/d51/dcb 0 2026-03-10T06:22:24.327 INFO:tasks.workunit.client.1.vm06.stdout:8/529: chown d1/d2c/d99 7546220 1 2026-03-10T06:22:24.335 INFO:tasks.workunit.client.1.vm06.stdout:4/621: getdents dd/d33/d47 0 2026-03-10T06:22:24.335 INFO:tasks.workunit.client.1.vm06.stdout:0/616: symlink d0/d3c/dc1/d3d/d50/d91/da7/lc2 0 2026-03-10T06:22:24.336 INFO:tasks.workunit.client.1.vm06.stdout:1/638: rename d9/d35/d46/l98 to d9/d1b/lb9 0 2026-03-10T06:22:24.336 INFO:tasks.workunit.client.1.vm06.stdout:1/639: stat d9/d1b/f81 0 2026-03-10T06:22:24.337 INFO:tasks.workunit.client.1.vm06.stdout:5/472: creat d8/db/d54/d67/d46/f98 x:0 0 0 2026-03-10T06:22:24.348 INFO:tasks.workunit.client.1.vm06.stdout:1/640: dwrite d9/d35/d46/d38/d63/d83/fa1 [0,4194304] 0 2026-03-10T06:22:24.351 INFO:tasks.workunit.client.1.vm06.stdout:1/641: readlink d9/l3f 0 2026-03-10T06:22:24.352 INFO:tasks.workunit.client.1.vm06.stdout:1/642: chown d9/d35/f7e 49228 1 2026-03-10T06:22:24.358 INFO:tasks.workunit.client.1.vm06.stdout:3/615: link d6/d21/fb4 d6/dc/d41/d6d/fce 0 2026-03-10T06:22:24.367 INFO:tasks.workunit.client.1.vm06.stdout:9/601: creat d21/d27/d50/d57/db2/d80/d95/d9b/fcc x:0 0 0 2026-03-10T06:22:24.367 INFO:tasks.workunit.client.1.vm06.stdout:3/616: readlink d6/d21/d38/d39/l75 0 2026-03-10T06:22:24.367 
INFO:tasks.workunit.client.1.vm06.stdout:1/643: dwrite d9/d35/d46/d38/f82 [0,4194304] 0 2026-03-10T06:22:24.368 INFO:tasks.workunit.client.1.vm06.stdout:3/617: fdatasync d6/dc/f1d 0 2026-03-10T06:22:24.368 INFO:tasks.workunit.client.1.vm06.stdout:2/549: symlink da/d13/d1a/lac 0 2026-03-10T06:22:24.370 INFO:tasks.workunit.client.1.vm06.stdout:2/550: readlink da/d13/d1c/d1d/d44/d53/l96 0 2026-03-10T06:22:24.388 INFO:tasks.workunit.client.1.vm06.stdout:4/622: mknod dd/d18/d8e/cb7 0 2026-03-10T06:22:24.389 INFO:tasks.workunit.client.1.vm06.stdout:4/623: read - dd/d18/d75/f91 zero size 2026-03-10T06:22:24.391 INFO:tasks.workunit.client.1.vm06.stdout:9/602: mkdir d21/d27/d50/d57/dcd 0 2026-03-10T06:22:24.392 INFO:tasks.workunit.client.1.vm06.stdout:3/618: mknod d6/d21/d38/d39/d90/dc7/ccf 0 2026-03-10T06:22:24.395 INFO:tasks.workunit.client.1.vm06.stdout:9/603: dwrite d21/da2/dc2/fc9 [0,4194304] 0 2026-03-10T06:22:24.406 INFO:tasks.workunit.client.1.vm06.stdout:4/624: rmdir dd/d72 39 2026-03-10T06:22:24.411 INFO:tasks.workunit.client.1.vm06.stdout:6/650: rename d6/dd/d25/d4e/f8a to d6/d79/fe2 0 2026-03-10T06:22:24.412 INFO:tasks.workunit.client.1.vm06.stdout:0/617: dread d0/dd/d14/d18/f90 [0,4194304] 0 2026-03-10T06:22:24.412 INFO:tasks.workunit.client.1.vm06.stdout:1/644: symlink d9/d35/d46/db0/lba 0 2026-03-10T06:22:24.413 INFO:tasks.workunit.client.1.vm06.stdout:6/651: readlink d6/dd/d25/d33/d5a/d78/dd0/ldb 0 2026-03-10T06:22:24.417 INFO:tasks.workunit.client.1.vm06.stdout:6/652: truncate d6/d79/d95/db4/dd4/fd5 815238 0 2026-03-10T06:22:24.423 INFO:tasks.workunit.client.1.vm06.stdout:5/473: getdents d8/db 0 2026-03-10T06:22:24.423 INFO:tasks.workunit.client.1.vm06.stdout:3/619: mkdir d6/d21/d38/dd0 0 2026-03-10T06:22:24.432 INFO:tasks.workunit.client.1.vm06.stdout:9/604: mknod d21/d27/d50/d57/dcd/cce 0 2026-03-10T06:22:24.433 INFO:tasks.workunit.client.1.vm06.stdout:1/645: symlink d9/d35/d89/lbb 0 2026-03-10T06:22:24.433 INFO:tasks.workunit.client.1.vm06.stdout:4/625: 
truncate dd/d24/d5e/f6a 1757300 0 2026-03-10T06:22:24.433 INFO:tasks.workunit.client.1.vm06.stdout:6/653: mknod d6/d7/d37/ce3 0 2026-03-10T06:22:24.434 INFO:tasks.workunit.client.1.vm06.stdout:0/618: creat d0/d3c/d42/d5e/dbb/fc3 x:0 0 0 2026-03-10T06:22:24.434 INFO:tasks.workunit.client.1.vm06.stdout:4/626: dread - dd/d24/fa8 zero size 2026-03-10T06:22:24.436 INFO:tasks.workunit.client.1.vm06.stdout:2/551: rename da/d13/l2c to da/d13/d1a/lad 0 2026-03-10T06:22:24.440 INFO:tasks.workunit.client.1.vm06.stdout:6/654: read - d6/fb1 zero size 2026-03-10T06:22:24.440 INFO:tasks.workunit.client.1.vm06.stdout:5/474: creat d8/db/d57/d83/f99 x:0 0 0 2026-03-10T06:22:24.442 INFO:tasks.workunit.client.1.vm06.stdout:0/619: mkdir d0/d3c/dc1/dc4 0 2026-03-10T06:22:24.444 INFO:tasks.workunit.client.1.vm06.stdout:3/620: dread d6/f29 [0,4194304] 0 2026-03-10T06:22:24.447 INFO:tasks.workunit.client.1.vm06.stdout:7/668: dwrite d19/d3b/d41/d42/f78 [4194304,4194304] 0 2026-03-10T06:22:24.449 INFO:tasks.workunit.client.1.vm06.stdout:2/552: sync 2026-03-10T06:22:24.450 INFO:tasks.workunit.client.1.vm06.stdout:6/655: rename d6/df/d70/l8d to d6/d79/d95/le4 0 2026-03-10T06:22:24.456 INFO:tasks.workunit.client.1.vm06.stdout:8/530: dwrite d1/df/d20/d21/f38 [0,4194304] 0 2026-03-10T06:22:24.460 INFO:tasks.workunit.client.1.vm06.stdout:1/646: dread d9/d1b/d20/f24 [0,4194304] 0 2026-03-10T06:22:24.466 INFO:tasks.workunit.client.1.vm06.stdout:9/605: creat d21/d32/fcf x:0 0 0 2026-03-10T06:22:24.470 INFO:tasks.workunit.client.1.vm06.stdout:4/627: dread dd/d24/d5e/f67 [0,4194304] 0 2026-03-10T06:22:24.471 INFO:tasks.workunit.client.1.vm06.stdout:5/475: rename d8/db/d57/f5f to d8/db/d54/d8a/d39/d72/f9a 0 2026-03-10T06:22:24.482 INFO:tasks.workunit.client.1.vm06.stdout:6/656: write d6/dd/d25/d33/d4d/fc0 [1119349,52586] 0 2026-03-10T06:22:24.485 INFO:tasks.workunit.client.1.vm06.stdout:6/657: write d6/dd/f96 [162094,76543] 0 2026-03-10T06:22:24.491 INFO:tasks.workunit.client.1.vm06.stdout:6/658: read 
d6/dd/d25/f69 [711258,27697] 0 2026-03-10T06:22:24.491 INFO:tasks.workunit.client.1.vm06.stdout:3/621: dread d6/d21/f99 [0,4194304] 0 2026-03-10T06:22:24.494 INFO:tasks.workunit.client.1.vm06.stdout:3/622: dread d6/f29 [0,4194304] 0 2026-03-10T06:22:24.502 INFO:tasks.workunit.client.1.vm06.stdout:8/531: dread d1/df/d20/d21/d7e/d8d/f9c [0,4194304] 0 2026-03-10T06:22:24.507 INFO:tasks.workunit.client.1.vm06.stdout:2/553: mknod da/d13/d1c/d1d/cae 0 2026-03-10T06:22:24.507 INFO:tasks.workunit.client.1.vm06.stdout:4/628: creat dd/d24/d2d/d2f/d34/d40/fb8 x:0 0 0 2026-03-10T06:22:24.511 INFO:tasks.workunit.client.1.vm06.stdout:5/476: mknod d8/db/c9b 0 2026-03-10T06:22:24.514 INFO:tasks.workunit.client.1.vm06.stdout:0/620: mkdir d0/d3c/dc1/dc4/dc5 0 2026-03-10T06:22:24.517 INFO:tasks.workunit.client.1.vm06.stdout:9/606: dread d21/d32/d4d/d51/f87 [0,4194304] 0 2026-03-10T06:22:24.519 INFO:tasks.workunit.client.1.vm06.stdout:6/659: readlink d6/l84 0 2026-03-10T06:22:24.522 INFO:tasks.workunit.client.1.vm06.stdout:9/607: write d21/d32/fcf [328615,89432] 0 2026-03-10T06:22:24.526 INFO:tasks.workunit.client.1.vm06.stdout:6/660: dread d6/d79/d95/db4/dd4/fd5 [0,4194304] 0 2026-03-10T06:22:24.529 INFO:tasks.workunit.client.1.vm06.stdout:6/661: chown d6/dd/d25/d33/d5a/d78/dd0/dc5 81509787 1 2026-03-10T06:22:24.532 INFO:tasks.workunit.client.1.vm06.stdout:3/623: rename d6/d21/d38/d39 to d6/d21/d38/dd0/dd1 0 2026-03-10T06:22:24.534 INFO:tasks.workunit.client.1.vm06.stdout:8/532: mknod d1/df/d20/cb1 0 2026-03-10T06:22:24.535 INFO:tasks.workunit.client.1.vm06.stdout:8/533: truncate d1/d7/fa7 588854 0 2026-03-10T06:22:24.536 INFO:tasks.workunit.client.1.vm06.stdout:4/629: fdatasync dd/d18/f32 0 2026-03-10T06:22:24.536 INFO:tasks.workunit.client.1.vm06.stdout:8/534: chown d1/df/d20/fe 3628400 1 2026-03-10T06:22:24.542 INFO:tasks.workunit.client.1.vm06.stdout:5/477: rmdir d8/db/d54/d8a/d74 39 2026-03-10T06:22:24.552 INFO:tasks.workunit.client.1.vm06.stdout:0/621: rename d0/dd/d14/d8f to 
d0/dd/d1c/da2/dc6 0 2026-03-10T06:22:24.555 INFO:tasks.workunit.client.1.vm06.stdout:3/624: creat d6/dc/d13/d51/fd2 x:0 0 0 2026-03-10T06:22:24.559 INFO:tasks.workunit.client.1.vm06.stdout:9/608: dwrite d21/d32/d4d/f9d [0,4194304] 0 2026-03-10T06:22:24.565 INFO:tasks.workunit.client.1.vm06.stdout:4/630: dwrite dd/d24/d2d/d2f/f42 [0,4194304] 0 2026-03-10T06:22:24.569 INFO:tasks.workunit.client.1.vm06.stdout:5/478: dwrite d8/db/d54/d8a/d39/d72/f89 [0,4194304] 0 2026-03-10T06:22:24.571 INFO:tasks.workunit.client.1.vm06.stdout:5/479: readlink d8/d9/l86 0 2026-03-10T06:22:24.578 INFO:tasks.workunit.client.1.vm06.stdout:5/480: chown d8/db/d54/d67/d46/d6e/l70 471891 1 2026-03-10T06:22:24.583 INFO:tasks.workunit.client.1.vm06.stdout:5/481: readlink d8/d9/l2e 0 2026-03-10T06:22:24.605 INFO:tasks.workunit.client.1.vm06.stdout:1/647: write d9/d1b/d20/f25 [3957727,99398] 0 2026-03-10T06:22:24.605 INFO:tasks.workunit.client.1.vm06.stdout:7/669: truncate d19/d3b/d5b/f81 1456396 0 2026-03-10T06:22:24.606 INFO:tasks.workunit.client.1.vm06.stdout:7/670: chown d19/d3b/d41/d42/d52/fa0 121306 1 2026-03-10T06:22:24.614 INFO:tasks.workunit.client.1.vm06.stdout:3/625: symlink d6/d21/d38/dd0/dd1/d90/dc7/ld3 0 2026-03-10T06:22:24.616 INFO:tasks.workunit.client.1.vm06.stdout:4/631: mknod dd/d33/d36/cb9 0 2026-03-10T06:22:24.616 INFO:tasks.workunit.client.1.vm06.stdout:4/632: write dd/d41/f60 [258222,104156] 0 2026-03-10T06:22:24.619 INFO:tasks.workunit.client.1.vm06.stdout:9/609: mkdir d21/d27/d50/d57/db2/d80/d95/d9b/dd0 0 2026-03-10T06:22:24.621 INFO:tasks.workunit.client.1.vm06.stdout:9/610: write d21/d32/d4d/d51/d67/f7c [713456,25372] 0 2026-03-10T06:22:24.621 INFO:tasks.workunit.client.1.vm06.stdout:1/648: write d9/d35/f57 [2096090,5062] 0 2026-03-10T06:22:24.621 INFO:tasks.workunit.client.1.vm06.stdout:8/535: creat d1/d3b/da9/dab/fb2 x:0 0 0 2026-03-10T06:22:24.622 INFO:tasks.workunit.client.1.vm06.stdout:8/536: readlink d1/df/d20/d35/l46 0 2026-03-10T06:22:24.630 
INFO:tasks.workunit.client.1.vm06.stdout:8/537: sync 2026-03-10T06:22:24.636 INFO:tasks.workunit.client.1.vm06.stdout:3/626: creat d6/d8/d7f/fd4 x:0 0 0 2026-03-10T06:22:24.640 INFO:tasks.workunit.client.1.vm06.stdout:4/633: unlink dd/d24/d2d/d2f/d39/f4a 0 2026-03-10T06:22:24.640 INFO:tasks.workunit.client.1.vm06.stdout:9/611: rmdir d21/d32/d4d/d51/d67 39 2026-03-10T06:22:24.640 INFO:tasks.workunit.client.1.vm06.stdout:9/612: chown d21/d32/fcf 11285997 1 2026-03-10T06:22:24.641 INFO:tasks.workunit.client.1.vm06.stdout:9/613: chown d21/d32/la1 90276709 1 2026-03-10T06:22:24.642 INFO:tasks.workunit.client.1.vm06.stdout:9/614: write d21/d27/d50/d57/fae [341617,95903] 0 2026-03-10T06:22:24.643 INFO:tasks.workunit.client.1.vm06.stdout:9/615: sync 2026-03-10T06:22:24.644 INFO:tasks.workunit.client.1.vm06.stdout:8/538: rmdir d1/df/d11/da1 39 2026-03-10T06:22:24.648 INFO:tasks.workunit.client.1.vm06.stdout:3/627: rename l2 to d6/d8/ld5 0 2026-03-10T06:22:24.649 INFO:tasks.workunit.client.1.vm06.stdout:1/649: mkdir d9/d35/dbc 0 2026-03-10T06:22:24.649 INFO:tasks.workunit.client.1.vm06.stdout:8/539: mkdir d1/d3b/db3 0 2026-03-10T06:22:24.650 INFO:tasks.workunit.client.1.vm06.stdout:1/650: write d9/d1b/d20/d44/f85 [329115,74142] 0 2026-03-10T06:22:24.651 INFO:tasks.workunit.client.1.vm06.stdout:1/651: readlink d9/d35/d46/d38/l8b 0 2026-03-10T06:22:24.651 INFO:tasks.workunit.client.1.vm06.stdout:1/652: fsync d9/d1b/d20/f8e 0 2026-03-10T06:22:24.652 INFO:tasks.workunit.client.1.vm06.stdout:1/653: chown d9/d1b/d20/l7d 12 1 2026-03-10T06:22:24.652 INFO:tasks.workunit.client.1.vm06.stdout:3/628: symlink d6/dc/d13/d35/ld6 0 2026-03-10T06:22:24.652 INFO:tasks.workunit.client.1.vm06.stdout:4/634: rename dd/fe to dd/d33/d36/fba 0 2026-03-10T06:22:24.653 INFO:tasks.workunit.client.1.vm06.stdout:4/635: chown dd/d24/f69 119211 1 2026-03-10T06:22:24.655 INFO:tasks.workunit.client.1.vm06.stdout:8/540: rename d1/df/d58/c55 to d1/df/d58/cb4 0 2026-03-10T06:22:24.656 
INFO:tasks.workunit.client.1.vm06.stdout:8/541: readlink d1/df/d58/l7b 0 2026-03-10T06:22:24.656 INFO:tasks.workunit.client.1.vm06.stdout:3/629: write d6/dc/d13/f8b [989702,99107] 0 2026-03-10T06:22:24.657 INFO:tasks.workunit.client.1.vm06.stdout:1/654: mknod d9/d1b/cbd 0 2026-03-10T06:22:24.658 INFO:tasks.workunit.client.1.vm06.stdout:3/630: chown d6/fa8 73812 1 2026-03-10T06:22:24.658 INFO:tasks.workunit.client.1.vm06.stdout:1/655: write d9/d1b/f81 [1024427,39284] 0 2026-03-10T06:22:24.662 INFO:tasks.workunit.client.1.vm06.stdout:8/542: mkdir d1/df/d58/db5 0 2026-03-10T06:22:24.664 INFO:tasks.workunit.client.1.vm06.stdout:3/631: symlink d6/d21/d38/d88/dae/ld7 0 2026-03-10T06:22:24.665 INFO:tasks.workunit.client.1.vm06.stdout:4/636: getdents dd/d24/d2d/d2f/d34/d83 0 2026-03-10T06:22:24.667 INFO:tasks.workunit.client.1.vm06.stdout:3/632: mknod d6/dc/d13/d9d/d54/cd8 0 2026-03-10T06:22:24.668 INFO:tasks.workunit.client.1.vm06.stdout:4/637: write dd/d24/d2d/d2f/d34/faf [107152,43593] 0 2026-03-10T06:22:24.670 INFO:tasks.workunit.client.1.vm06.stdout:3/633: creat d6/d21/dbc/fd9 x:0 0 0 2026-03-10T06:22:24.671 INFO:tasks.workunit.client.1.vm06.stdout:8/543: dwrite d1/d7/f92 [0,4194304] 0 2026-03-10T06:22:24.672 INFO:tasks.workunit.client.1.vm06.stdout:3/634: write d6/d21/d38/f6c [1522742,123764] 0 2026-03-10T06:22:24.673 INFO:tasks.workunit.client.1.vm06.stdout:4/638: write dd/d24/d2d/f3b [2339912,72737] 0 2026-03-10T06:22:24.676 INFO:tasks.workunit.client.1.vm06.stdout:4/639: chown dd/d24/d2d/d2f/d34/d40/l46 3326 1 2026-03-10T06:22:24.677 INFO:tasks.workunit.client.1.vm06.stdout:3/635: rmdir d6/d1a/d5b/dbd 39 2026-03-10T06:22:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:24 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:24 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist 
ls", "format": "json"}]: dispatch 2026-03-10T06:22:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:24 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:24 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:24.678 INFO:tasks.workunit.client.1.vm06.stdout:4/640: sync 2026-03-10T06:22:24.678 INFO:tasks.workunit.client.1.vm06.stdout:3/636: sync 2026-03-10T06:22:24.686 INFO:tasks.workunit.client.1.vm06.stdout:4/641: dwrite dd/d41/f52 [0,4194304] 0 2026-03-10T06:22:24.701 INFO:tasks.workunit.client.1.vm06.stdout:4/642: mkdir dd/d33/d47/d97/db6/dbb 0 2026-03-10T06:22:24.702 INFO:tasks.workunit.client.1.vm06.stdout:4/643: write dd/d24/d2d/d2f/d34/d40/f8a [346183,76784] 0 2026-03-10T06:22:24.708 INFO:tasks.workunit.client.1.vm06.stdout:4/644: creat dd/d24/d2d/d2f/d34/d40/fbc x:0 0 0 2026-03-10T06:22:24.709 INFO:tasks.workunit.client.1.vm06.stdout:4/645: chown dd/d24/d2d/d2f/d34/d83/f87 2917099 1 2026-03-10T06:22:24.711 INFO:tasks.workunit.client.1.vm06.stdout:4/646: mkdir dd/d24/d5e/d66/dbd 0 2026-03-10T06:22:24.711 INFO:tasks.workunit.client.1.vm06.stdout:4/647: dread - dd/d18/fac zero size 2026-03-10T06:22:24.713 INFO:tasks.workunit.client.1.vm06.stdout:4/648: creat dd/d24/d2d/fbe x:0 0 0 2026-03-10T06:22:24.714 INFO:tasks.workunit.client.1.vm06.stdout:4/649: truncate dd/d24/d2d/d2f/d34/d40/f89 1349758 0 2026-03-10T06:22:24.728 INFO:tasks.workunit.client.1.vm06.stdout:4/650: fsync f0 0 2026-03-10T06:22:24.728 INFO:tasks.workunit.client.1.vm06.stdout:4/651: readlink dd/d24/d2d/d2f/d39/d71/la1 0 2026-03-10T06:22:24.729 INFO:tasks.workunit.client.1.vm06.stdout:4/652: write dd/d24/d2d/d2f/d34/d40/f99 [3201745,80187] 0 2026-03-10T06:22:24.732 INFO:tasks.workunit.client.1.vm06.stdout:4/653: sync 2026-03-10T06:22:24.734 
INFO:tasks.workunit.client.1.vm06.stdout:4/654: rename dd/d24/d2d/ca0 to dd/d24/d5e/db0/cbf 0 2026-03-10T06:22:24.734 INFO:tasks.workunit.client.1.vm06.stdout:2/554: dwrite da/d13/d1c/d43/da6/d56/f85 [0,4194304] 0 2026-03-10T06:22:24.750 INFO:tasks.workunit.client.1.vm06.stdout:7/671: rmdir d19/d3b 39 2026-03-10T06:22:24.751 INFO:tasks.workunit.client.1.vm06.stdout:6/662: truncate d6/dd/d25/d33/d5a/fa1 180502 0 2026-03-10T06:22:24.760 INFO:tasks.workunit.client.1.vm06.stdout:1/656: rmdir d9/d35 39 2026-03-10T06:22:24.760 INFO:tasks.workunit.client.1.vm06.stdout:0/622: write d0/dd/fa4 [900992,3778] 0 2026-03-10T06:22:24.761 INFO:tasks.workunit.client.1.vm06.stdout:0/623: readlink d0/d3c/d42/d88/d35/d74/l77 0 2026-03-10T06:22:24.762 INFO:tasks.workunit.client.1.vm06.stdout:9/616: truncate d21/d27/d50/d57/fb7 3997302 0 2026-03-10T06:22:24.763 INFO:tasks.workunit.client.1.vm06.stdout:1/657: chown d9/d1b/f81 8779 1 2026-03-10T06:22:24.765 INFO:tasks.workunit.client.1.vm06.stdout:2/555: rename da/d13/d1c/d43/da6 to da/d13/d1a/d39/d4b/daf 0 2026-03-10T06:22:24.769 INFO:tasks.workunit.client.1.vm06.stdout:0/624: creat d0/d3c/d42/d99/fc7 x:0 0 0 2026-03-10T06:22:24.769 INFO:tasks.workunit.client.1.vm06.stdout:9/617: symlink d21/d32/d4d/d51/dcb/ld1 0 2026-03-10T06:22:24.770 INFO:tasks.workunit.client.1.vm06.stdout:6/663: chown d6/d79/ca4 10837754 1 2026-03-10T06:22:24.770 INFO:tasks.workunit.client.1.vm06.stdout:7/672: chown d19/d3b/d41/d42/d52/d83/c96 38038 1 2026-03-10T06:22:24.772 INFO:tasks.workunit.client.1.vm06.stdout:7/673: write d19/d3b/d41/d42/d52/d83/fd8 [358501,49702] 0 2026-03-10T06:22:24.772 INFO:tasks.workunit.client.1.vm06.stdout:0/625: fsync d0/d3c/d42/fbe 0 2026-03-10T06:22:24.773 INFO:tasks.workunit.client.1.vm06.stdout:5/482: dwrite d8/db/d54/d8a/f31 [0,4194304] 0 2026-03-10T06:22:24.773 INFO:tasks.workunit.client.1.vm06.stdout:4/655: dread dd/d18/f1d [0,4194304] 0 2026-03-10T06:22:24.777 INFO:tasks.workunit.client.1.vm06.stdout:9/618: chown 
d21/d32/d6e/c3c 3 1 2026-03-10T06:22:24.786 INFO:tasks.workunit.client.1.vm06.stdout:2/556: rename da/d13/d1a/d39/d35/l80 to da/d13/d1a/d39/d4b/lb0 0 2026-03-10T06:22:24.795 INFO:tasks.workunit.client.1.vm06.stdout:2/557: chown da/d13/d1c/f41 491538914 1 2026-03-10T06:22:24.797 INFO:tasks.workunit.client.1.vm06.stdout:0/626: creat d0/d3c/d42/fc8 x:0 0 0 2026-03-10T06:22:24.798 INFO:tasks.workunit.client.1.vm06.stdout:7/674: fsync d19/f30 0 2026-03-10T06:22:24.805 INFO:tasks.workunit.client.1.vm06.stdout:9/619: mkdir d21/d32/d4d/dd2 0 2026-03-10T06:22:24.805 INFO:tasks.workunit.client.1.vm06.stdout:9/620: fdatasync d21/d32/fcf 0 2026-03-10T06:22:24.806 INFO:tasks.workunit.client.1.vm06.stdout:9/621: write d21/d32/d4d/fbd [723688,128087] 0 2026-03-10T06:22:24.807 INFO:tasks.workunit.client.1.vm06.stdout:9/622: chown d21/d27/d50/d57/fae 28 1 2026-03-10T06:22:24.810 INFO:tasks.workunit.client.1.vm06.stdout:1/658: link d9/d35/d89/f3d d9/d35/d46/d38/fbe 0 2026-03-10T06:22:24.811 INFO:tasks.workunit.client.1.vm06.stdout:7/675: stat d19/l32 0 2026-03-10T06:22:24.811 INFO:tasks.workunit.client.1.vm06.stdout:4/656: link dd/d33/d36/c96 dd/d24/d5e/d66/cc0 0 2026-03-10T06:22:24.812 INFO:tasks.workunit.client.1.vm06.stdout:1/659: chown d9/d35/d46/d38/laa 121136955 1 2026-03-10T06:22:24.813 INFO:tasks.workunit.client.1.vm06.stdout:1/660: chown d9/d35/l73 1674407064 1 2026-03-10T06:22:24.814 INFO:tasks.workunit.client.1.vm06.stdout:1/661: chown d9/c1c 2382 1 2026-03-10T06:22:24.814 INFO:tasks.workunit.client.1.vm06.stdout:1/662: readlink l2 0 2026-03-10T06:22:24.815 INFO:tasks.workunit.client.1.vm06.stdout:7/676: sync 2026-03-10T06:22:24.816 INFO:tasks.workunit.client.1.vm06.stdout:7/677: stat d19/d3b/d5b/l76 0 2026-03-10T06:22:24.817 INFO:tasks.workunit.client.1.vm06.stdout:2/558: symlink da/d13/d1c/lb1 0 2026-03-10T06:22:24.818 INFO:tasks.workunit.client.1.vm06.stdout:8/544: write d1/df/d11/f12 [2439641,113727] 0 2026-03-10T06:22:24.824 
INFO:tasks.workunit.client.1.vm06.stdout:8/545: read d1/df/d11/f45 [1086532,59642] 0 2026-03-10T06:22:24.824 INFO:tasks.workunit.client.1.vm06.stdout:4/657: dwrite dd/d24/d2d/d2f/d39/d71/f90 [0,4194304] 0 2026-03-10T06:22:24.825 INFO:tasks.workunit.client.1.vm06.stdout:3/637: dwrite d6/dc/d41/d6d/fce [0,4194304] 0 2026-03-10T06:22:24.827 INFO:tasks.workunit.client.1.vm06.stdout:9/623: mknod d21/d32/d6e/cd3 0 2026-03-10T06:22:24.836 INFO:tasks.workunit.client.1.vm06.stdout:5/483: rename d8/db/d54/f5c to d8/f9c 0 2026-03-10T06:22:24.842 INFO:tasks.workunit.client.1.vm06.stdout:2/559: readlink da/d13/d1c/d1d/l5d 0 2026-03-10T06:22:24.842 INFO:tasks.workunit.client.1.vm06.stdout:6/664: dread d6/d7/f16 [0,4194304] 0 2026-03-10T06:22:24.847 INFO:tasks.workunit.client.1.vm06.stdout:6/665: truncate d6/dd/d25/d2c/fc3 256999 0 2026-03-10T06:22:24.861 INFO:tasks.workunit.client.1.vm06.stdout:0/627: rename d0/dd/d1c/c20 to d0/dd/d14/d1d/cc9 0 2026-03-10T06:22:24.863 INFO:tasks.workunit.client.1.vm06.stdout:0/628: write d0/d3c/d42/f41 [2443804,120449] 0 2026-03-10T06:22:24.867 INFO:tasks.workunit.client.1.vm06.stdout:3/638: creat d6/d21/fda x:0 0 0 2026-03-10T06:22:24.867 INFO:tasks.workunit.client.1.vm06.stdout:0/629: write d0/d3c/d42/d88/f8a [338073,89417] 0 2026-03-10T06:22:24.868 INFO:tasks.workunit.client.1.vm06.stdout:9/624: dwrite d21/d32/d4d/d51/d67/f6a [0,4194304] 0 2026-03-10T06:22:24.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:24 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:24.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:24 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:24.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:24 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:24.869 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:24 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:24.871 INFO:tasks.workunit.client.1.vm06.stdout:1/663: rename d9/d62/db8 to d9/d1b/d20/d44/dbf 0 2026-03-10T06:22:24.881 INFO:tasks.workunit.client.1.vm06.stdout:5/484: fdatasync d8/db/d54/d8a/d74/f4c 0 2026-03-10T06:22:24.881 INFO:tasks.workunit.client.1.vm06.stdout:8/546: link d1/d3b/d5c/f6f d1/df/d11/da1/fb6 0 2026-03-10T06:22:24.882 INFO:tasks.workunit.client.1.vm06.stdout:3/639: write d6/fab [360081,53729] 0 2026-03-10T06:22:24.883 INFO:tasks.workunit.client.1.vm06.stdout:0/630: mkdir d0/dd/d14/d1d/d5d/dca 0 2026-03-10T06:22:24.884 INFO:tasks.workunit.client.1.vm06.stdout:3/640: write d6/d21/fb4 [924076,126890] 0 2026-03-10T06:22:24.889 INFO:tasks.workunit.client.1.vm06.stdout:1/664: mknod d9/d35/d46/cc0 0 2026-03-10T06:22:24.897 INFO:tasks.workunit.client.1.vm06.stdout:2/560: getdents da/da8 0 2026-03-10T06:22:24.903 INFO:tasks.workunit.client.1.vm06.stdout:9/625: creat d21/d32/d4d/d51/db0/fd4 x:0 0 0 2026-03-10T06:22:24.903 INFO:tasks.workunit.client.1.vm06.stdout:5/485: dread d8/db/d54/d8a/d74/f71 [0,4194304] 0 2026-03-10T06:22:24.903 INFO:tasks.workunit.client.1.vm06.stdout:9/626: chown l1a 0 1 2026-03-10T06:22:24.905 INFO:tasks.workunit.client.1.vm06.stdout:8/547: dread d1/df/d11/f48 [0,4194304] 0 2026-03-10T06:22:24.906 INFO:tasks.workunit.client.1.vm06.stdout:8/548: chown d1/f1c 105323 1 2026-03-10T06:22:24.911 INFO:tasks.workunit.client.1.vm06.stdout:0/631: write d0/dd/d14/d18/f7c [3530126,33483] 0 2026-03-10T06:22:24.912 INFO:tasks.workunit.client.1.vm06.stdout:0/632: write d0/f9 [3155251,104467] 0 2026-03-10T06:22:24.918 INFO:tasks.workunit.client.1.vm06.stdout:0/633: stat d0/d3c/d42/d88/d98/fb1 0 2026-03-10T06:22:24.918 INFO:tasks.workunit.client.1.vm06.stdout:3/641: symlink d6/d21/d38/dd0/dd1/ldb 0 
2026-03-10T06:22:24.919 INFO:tasks.workunit.client.1.vm06.stdout:0/634: chown d0/d3c/d42/d88/d98/f9a 3352 1 2026-03-10T06:22:24.919 INFO:tasks.workunit.client.1.vm06.stdout:1/665: dread d9/d1b/d20/f42 [0,4194304] 0 2026-03-10T06:22:24.920 INFO:tasks.workunit.client.1.vm06.stdout:3/642: rename d6/d21/d38 to d6/d21/d38/dd0/dd1/d90/dc7/ddc 22 2026-03-10T06:22:24.921 INFO:tasks.workunit.client.1.vm06.stdout:3/643: truncate d6/d1a/fb9 718512 0 2026-03-10T06:22:24.929 INFO:tasks.workunit.client.1.vm06.stdout:6/666: getdents d6/d7/d37/d43 0 2026-03-10T06:22:24.930 INFO:tasks.workunit.client.1.vm06.stdout:2/561: symlink da/d13/d1a/d39/d4b/lb2 0 2026-03-10T06:22:24.938 INFO:tasks.workunit.client.1.vm06.stdout:0/635: dwrite d0/d3c/d42/dab/fc0 [0,4194304] 0 2026-03-10T06:22:24.952 INFO:tasks.workunit.client.1.vm06.stdout:3/644: symlink d6/d21/d38/dd0/ldd 0 2026-03-10T06:22:24.955 INFO:tasks.workunit.client.1.vm06.stdout:4/658: dwrite dd/d24/d5e/f67 [0,4194304] 0 2026-03-10T06:22:24.958 INFO:tasks.workunit.client.1.vm06.stdout:4/659: write fa [2404386,87557] 0 2026-03-10T06:22:24.958 INFO:tasks.workunit.client.1.vm06.stdout:7/678: write d19/f3f [4033741,62018] 0 2026-03-10T06:22:24.962 INFO:tasks.workunit.client.1.vm06.stdout:4/660: chown dd/d24/f8f 388927295 1 2026-03-10T06:22:24.962 INFO:tasks.workunit.client.1.vm06.stdout:2/562: symlink da/d13/d1a/d39/d4b/daf/d56/lb3 0 2026-03-10T06:22:24.965 INFO:tasks.workunit.client.1.vm06.stdout:5/486: dread d8/db/d54/d8a/d74/f36 [0,4194304] 0 2026-03-10T06:22:24.966 INFO:tasks.workunit.client.1.vm06.stdout:0/636: creat d0/dd/d14/d18/d66/fcb x:0 0 0 2026-03-10T06:22:24.967 INFO:tasks.workunit.client.1.vm06.stdout:2/563: creat da/d13/d1c/d1d/d44/d46/fb4 x:0 0 0 2026-03-10T06:22:24.968 INFO:tasks.workunit.client.1.vm06.stdout:0/637: dread - d0/d3c/d42/d5e/dbb/fc3 zero size 2026-03-10T06:22:24.971 INFO:tasks.workunit.client.1.vm06.stdout:6/667: unlink d6/dd/d25/d33/d5a/d78/dd0/l7a 0 2026-03-10T06:22:24.972 
INFO:tasks.workunit.client.1.vm06.stdout:7/679: mkdir d19/d3b/d41/d72/de0/de2 0 2026-03-10T06:22:24.972 INFO:tasks.workunit.client.1.vm06.stdout:7/680: chown l17 76931721 1 2026-03-10T06:22:24.975 INFO:tasks.workunit.client.1.vm06.stdout:7/681: write fa [2997005,80710] 0 2026-03-10T06:22:24.975 INFO:tasks.workunit.client.1.vm06.stdout:9/627: getdents d21/d32/d4d/d51/dcb 0 2026-03-10T06:22:24.977 INFO:tasks.workunit.client.1.vm06.stdout:9/628: fdatasync d21/d27/d3a/f83 0 2026-03-10T06:22:24.982 INFO:tasks.workunit.client.1.vm06.stdout:7/682: write d19/d3b/d41/d42/d52/d83/f8f [5381220,12127] 0 2026-03-10T06:22:24.990 INFO:tasks.workunit.client.1.vm06.stdout:2/564: link da/d13/d1a/d39/d35/c3f da/d13/d1c/d7d/cb5 0 2026-03-10T06:22:24.993 INFO:tasks.workunit.client.1.vm06.stdout:9/629: creat d21/d32/d4d/dd2/fd5 x:0 0 0 2026-03-10T06:22:24.994 INFO:tasks.workunit.client.1.vm06.stdout:6/668: dwrite d6/f62 [0,4194304] 0 2026-03-10T06:22:24.997 INFO:tasks.workunit.client.1.vm06.stdout:9/630: write d21/d27/d50/d57/db2/faa [273608,130854] 0 2026-03-10T06:22:24.997 INFO:tasks.workunit.client.1.vm06.stdout:6/669: readlink d6/dd/dc7/ld3 0 2026-03-10T06:22:24.998 INFO:tasks.workunit.client.1.vm06.stdout:0/638: dread d0/dd/d14/f31 [0,4194304] 0 2026-03-10T06:22:25.001 INFO:tasks.workunit.client.1.vm06.stdout:6/670: write d6/dd/d25/d2c/f32 [5125912,116983] 0 2026-03-10T06:22:25.001 INFO:tasks.workunit.client.1.vm06.stdout:0/639: write d0/fa [2954875,70031] 0 2026-03-10T06:22:25.005 INFO:tasks.workunit.client.1.vm06.stdout:2/565: creat da/d13/d1c/d43/d6e/fb6 x:0 0 0 2026-03-10T06:22:25.012 INFO:tasks.workunit.client.1.vm06.stdout:9/631: dread d21/d32/d4d/fb4 [0,4194304] 0 2026-03-10T06:22:25.015 INFO:tasks.workunit.client.1.vm06.stdout:6/671: stat d6/dd/d25/d33/f5d 0 2026-03-10T06:22:25.016 INFO:tasks.workunit.client.1.vm06.stdout:6/672: chown d6/dd/d35/cb7 4 1 2026-03-10T06:22:25.023 INFO:tasks.workunit.client.1.vm06.stdout:0/640: rename d0/d3c/d42 to d0/dd/d14/d18/d85/dcc 0 
2026-03-10T06:22:25.024 INFO:tasks.workunit.client.1.vm06.stdout:2/566: mkdir da/d13/d1a/d39/d4b/daf/d56/db7 0 2026-03-10T06:22:25.024 INFO:tasks.workunit.client.1.vm06.stdout:1/666: write d9/d35/d89/f3d [4512771,102206] 0 2026-03-10T06:22:25.025 INFO:tasks.workunit.client.1.vm06.stdout:9/632: rmdir d21/da2 39 2026-03-10T06:22:25.026 INFO:tasks.workunit.client.1.vm06.stdout:9/633: truncate d21/d27/f9a 603613 0 2026-03-10T06:22:25.034 INFO:tasks.workunit.client.1.vm06.stdout:0/641: dread d0/dd/d14/f31 [0,4194304] 0 2026-03-10T06:22:25.034 INFO:tasks.workunit.client.1.vm06.stdout:8/549: truncate d1/d3b/d5c/f7a 4088894 0 2026-03-10T06:22:25.037 INFO:tasks.workunit.client.1.vm06.stdout:0/642: write d0/dd/d14/d18/d85/dcc/fc8 [848912,120783] 0 2026-03-10T06:22:25.040 INFO:tasks.workunit.client.1.vm06.stdout:6/673: dwrite d6/df/f1e [0,4194304] 0 2026-03-10T06:22:25.041 INFO:tasks.workunit.client.1.vm06.stdout:2/567: write da/d13/d5e/f9e [566602,2250] 0 2026-03-10T06:22:25.054 INFO:tasks.workunit.client.1.vm06.stdout:1/667: dread d9/f2f [0,4194304] 0 2026-03-10T06:22:25.054 INFO:tasks.workunit.client.1.vm06.stdout:1/668: readlink d9/d35/l4e 0 2026-03-10T06:22:25.055 INFO:tasks.workunit.client.1.vm06.stdout:0/643: dwrite d0/dd/d14/d18/f7c [0,4194304] 0 2026-03-10T06:22:25.062 INFO:tasks.workunit.client.1.vm06.stdout:8/550: mknod d1/df/d20/d35/cb7 0 2026-03-10T06:22:25.062 INFO:tasks.workunit.client.1.vm06.stdout:0/644: read d0/f46 [5312300,44314] 0 2026-03-10T06:22:25.062 INFO:tasks.workunit.client.1.vm06.stdout:8/551: write d1/f3a [1094943,64920] 0 2026-03-10T06:22:25.077 INFO:tasks.workunit.client.1.vm06.stdout:9/634: mknod d21/d32/d4d/cd6 0 2026-03-10T06:22:25.086 INFO:tasks.workunit.client.1.vm06.stdout:0/645: dwrite d0/dd/d14/d18/d85/dcc/d5e/dbb/fc3 [0,4194304] 0 2026-03-10T06:22:25.091 INFO:tasks.workunit.client.1.vm06.stdout:6/674: rename d6/d79/d95/db4 to d6/d79/d95/db4/de5 22 2026-03-10T06:22:25.091 INFO:tasks.workunit.client.1.vm06.stdout:9/635: write 
d21/d32/d4d/d51/d67/fb6 [1000443,56130] 0 2026-03-10T06:22:25.095 INFO:tasks.workunit.client.1.vm06.stdout:2/568: dread da/d13/d1a/d39/d4b/daf/d56/f58 [0,4194304] 0 2026-03-10T06:22:25.098 INFO:tasks.workunit.client.1.vm06.stdout:4/661: truncate dd/d18/f55 5432453 0 2026-03-10T06:22:25.099 INFO:tasks.workunit.client.1.vm06.stdout:3/645: dwrite d6/d8/f96 [0,4194304] 0 2026-03-10T06:22:25.099 INFO:tasks.workunit.client.1.vm06.stdout:5/487: truncate d8/db/d54/d8a/d39/d72/f89 2908819 0 2026-03-10T06:22:25.103 INFO:tasks.workunit.client.1.vm06.stdout:5/488: chown d8/db/d54/d8a 7553 1 2026-03-10T06:22:25.109 INFO:tasks.workunit.client.1.vm06.stdout:9/636: dwrite d21/d27/d50/d57/db2/d80/d95/d9b/fa5 [0,4194304] 0 2026-03-10T06:22:25.120 INFO:tasks.workunit.client.1.vm06.stdout:2/569: read da/d13/d1c/d1d/f26 [440883,124128] 0 2026-03-10T06:22:25.122 INFO:tasks.workunit.client.1.vm06.stdout:1/669: rename d9/d35/d89/f3d to d9/d35/d46/d38/d8c/fc1 0 2026-03-10T06:22:25.123 INFO:tasks.workunit.client.1.vm06.stdout:0/646: chown d0/dd/d14/d18/d85/dcc/d88/d9e/la8 163997 1 2026-03-10T06:22:25.127 INFO:tasks.workunit.client.1.vm06.stdout:2/570: fdatasync da/d13/d1c/d1d/d44/d53/d61/d68/f6b 0 2026-03-10T06:22:25.139 INFO:tasks.workunit.client.1.vm06.stdout:3/646: getdents d6/d1a 0 2026-03-10T06:22:25.141 INFO:tasks.workunit.client.1.vm06.stdout:1/670: rmdir d9/d35/d46/d38/da0 0 2026-03-10T06:22:25.142 INFO:tasks.workunit.client.1.vm06.stdout:2/571: getdents da/d13/d5e 0 2026-03-10T06:22:25.142 INFO:tasks.workunit.client.1.vm06.stdout:5/489: dread d8/d9/f11 [0,4194304] 0 2026-03-10T06:22:25.143 INFO:tasks.workunit.client.1.vm06.stdout:2/572: chown da/d13/d1a/d39/d4b/f9d 803062 1 2026-03-10T06:22:25.144 INFO:tasks.workunit.client.1.vm06.stdout:5/490: write d8/db/f1f [2683246,63907] 0 2026-03-10T06:22:25.146 INFO:tasks.workunit.client.1.vm06.stdout:0/647: truncate d0/dd/f49 1375995 0 2026-03-10T06:22:25.149 INFO:tasks.workunit.client.1.vm06.stdout:3/647: mkdir d6/d21/d38/d88/dde 0 
2026-03-10T06:22:25.150 INFO:tasks.workunit.client.1.vm06.stdout:3/648: chown d6/dc/f7e 451 1
2026-03-10T06:22:25.150 INFO:tasks.workunit.client.1.vm06.stdout:3/649: readlink d6/d21/d38/d88/dae/ld7 0
2026-03-10T06:22:25.158 INFO:tasks.workunit.client.1.vm06.stdout:9/637: getdents d21/da2 0
2026-03-10T06:22:25.158 INFO:tasks.workunit.client.1.vm06.stdout:5/491: chown d8/l7a 2085250 1
2026-03-10T06:22:25.158 INFO:tasks.workunit.client.1.vm06.stdout:9/638: stat d21/d27/d3a/l6c 0
2026-03-10T06:22:25.161 INFO:tasks.workunit.client.1.vm06.stdout:1/671: rmdir d9/d35/dbc 0
2026-03-10T06:22:25.163 INFO:tasks.workunit.client.1.vm06.stdout:1/672: chown d9/d35/d46/d38/d63/d83/d93/fb5 194 1
2026-03-10T06:22:25.163 INFO:tasks.workunit.client.1.vm06.stdout:1/673: chown d9/c69 108915 1
2026-03-10T06:22:25.165 INFO:tasks.workunit.client.1.vm06.stdout:0/648: rename d0/c2e to d0/dd/ccd 0
2026-03-10T06:22:25.165 INFO:tasks.workunit.client.1.vm06.stdout:2/573: getdents da/d13/d1a/d39/d4b/d86 0
2026-03-10T06:22:25.166 INFO:tasks.workunit.client.1.vm06.stdout:0/649: readlink d0/dd/lb2 0
2026-03-10T06:22:25.166 INFO:tasks.workunit.client.1.vm06.stdout:9/639: dwrite d21/d32/f8b [0,4194304] 0
2026-03-10T06:22:25.170 INFO:tasks.workunit.client.1.vm06.stdout:9/640: chown f20 18 1
2026-03-10T06:22:25.173 INFO:tasks.workunit.client.1.vm06.stdout:0/650: creat d0/dd/d14/d18/d7e/fce x:0 0 0
2026-03-10T06:22:25.175 INFO:tasks.workunit.client.1.vm06.stdout:1/674: dwrite d9/f2f [0,4194304] 0
2026-03-10T06:22:25.178 INFO:tasks.workunit.client.1.vm06.stdout:2/574: dwrite da/d13/d1c/d43/f7a [0,4194304] 0
2026-03-10T06:22:25.178 INFO:tasks.workunit.client.1.vm06.stdout:2/575: fdatasync da/d13/d5e/f9a 0
2026-03-10T06:22:25.179 INFO:tasks.workunit.client.1.vm06.stdout:2/576: chown da/d13/d1a/d39/d35/c4c 0 1
2026-03-10T06:22:25.183 INFO:tasks.workunit.client.1.vm06.stdout:0/651: dwrite d0/f9 [0,4194304] 0
2026-03-10T06:22:25.194 INFO:tasks.workunit.client.1.vm06.stdout:0/652: dwrite d0/dd/f24 [4194304,4194304] 0
2026-03-10T06:22:25.220 INFO:tasks.workunit.client.1.vm06.stdout:5/492: link d8/db/d54/d8a/c2d d8/db/c9d 0
2026-03-10T06:22:25.225 INFO:tasks.workunit.client.1.vm06.stdout:7/683: dwrite d19/d3b/f43 [0,4194304] 0
2026-03-10T06:22:25.230 INFO:tasks.workunit.client.1.vm06.stdout:2/577: dread da/d13/d1c/f42 [0,4194304] 0
2026-03-10T06:22:25.230 INFO:tasks.workunit.client.1.vm06.stdout:2/578: readlink da/d13/d1a/d39/d4b/daf/d56/lb3 0
2026-03-10T06:22:25.241 INFO:tasks.workunit.client.1.vm06.stdout:1/675: creat d9/d35/d46/d38/d63/fc2 x:0 0 0
2026-03-10T06:22:25.252 INFO:tasks.workunit.client.1.vm06.stdout:8/552: dwrite d1/df/d20/d21/d7e/d8d/f95 [0,4194304] 0
2026-03-10T06:22:25.259 INFO:tasks.workunit.client.1.vm06.stdout:5/493: mkdir d8/db/d54/d8a/d39/d9e 0
2026-03-10T06:22:25.270 INFO:tasks.workunit.client.1.vm06.stdout:7/684: mkdir d19/d3b/d41/d42/d62/d80/da1/de3 0
2026-03-10T06:22:25.271 INFO:tasks.workunit.client.1.vm06.stdout:7/685: chown d19/d3b/d41/d42/d62/d80/d82 438 1
2026-03-10T06:22:25.277 INFO:tasks.workunit.client.1.vm06.stdout:4/662: truncate dd/d24/d2d/f5a 892477 0
2026-03-10T06:22:25.282 INFO:tasks.workunit.client.1.vm06.stdout:8/553: mknod d1/d3b/da9/cb8 0
2026-03-10T06:22:25.288 INFO:tasks.workunit.client.1.vm06.stdout:7/686: symlink d19/d3b/d41/d42/d62/d80/d82/le4 0
2026-03-10T06:22:25.289 INFO:tasks.workunit.client.1.vm06.stdout:4/663: unlink dd/d18/d75/f76 0
2026-03-10T06:22:25.294 INFO:tasks.workunit.client.1.vm06.stdout:6/675: dwrite d6/df/f82 [0,4194304] 0
2026-03-10T06:22:25.314 INFO:tasks.workunit.client.1.vm06.stdout:7/687: creat d19/d3b/dde/fe5 x:0 0 0
2026-03-10T06:22:25.315 INFO:tasks.workunit.client.1.vm06.stdout:2/579: getdents da 0
2026-03-10T06:22:25.315 INFO:tasks.workunit.client.1.vm06.stdout:3/650: dwrite d6/dc/d13/d35/f4e [0,4194304] 0
2026-03-10T06:22:25.315 INFO:tasks.workunit.client.1.vm06.stdout:2/580: dread - da/d13/d1c/d43/f91 zero size
2026-03-10T06:22:25.316 INFO:tasks.workunit.client.1.vm06.stdout:3/651: readlink d6/dc/d13/l85 0
2026-03-10T06:22:25.336 INFO:tasks.workunit.client.1.vm06.stdout:7/688: mknod d19/d3b/d41/d42/d52/d9f/ce6 0
2026-03-10T06:22:25.337 INFO:tasks.workunit.client.1.vm06.stdout:7/689: fdatasync d19/d3b/d5b/f69 0
2026-03-10T06:22:25.337 INFO:tasks.workunit.client.1.vm06.stdout:4/664: link dd/d24/d2d/d2f/d39/l86 dd/d24/d2d/d2f/d39/lc1 0
2026-03-10T06:22:25.340 INFO:tasks.workunit.client.1.vm06.stdout:4/665: chown dd/d24/c38 6 1
2026-03-10T06:22:25.356 INFO:tasks.workunit.client.1.vm06.stdout:6/676: symlink d6/df/d9f/le6 0
2026-03-10T06:22:25.360 INFO:tasks.workunit.client.1.vm06.stdout:7/690: link d19/d3b/d41/d42/d52/d83/ccd d19/d3b/d41/d72/de0/ce7 0
2026-03-10T06:22:25.380 INFO:tasks.workunit.client.1.vm06.stdout:4/666: sync
2026-03-10T06:22:25.419 INFO:tasks.workunit.client.1.vm06.stdout:6/677: read d6/df/d70/fa6 [1322068,17302] 0
2026-03-10T06:22:25.422 INFO:tasks.workunit.client.1.vm06.stdout:6/678: readlink d6/dd/d25/d33/d5a/d78/dd0/ldb 0
2026-03-10T06:22:25.426 INFO:tasks.workunit.client.1.vm06.stdout:6/679: mknod d6/dd/d25/d33/d5a/d78/ce7 0
2026-03-10T06:22:25.427 INFO:tasks.workunit.client.1.vm06.stdout:6/680: read - d6/dd/d25/d4e/f83 zero size
2026-03-10T06:22:25.622 INFO:tasks.workunit.client.1.vm06.stdout:1/676: write d9/d1b/d20/f30 [1086207,119930] 0
2026-03-10T06:22:25.628 INFO:tasks.workunit.client.1.vm06.stdout:1/677: truncate d9/d35/d46/d38/fae 452280 0
2026-03-10T06:22:25.628 INFO:tasks.workunit.client.1.vm06.stdout:1/678: readlink d9/d35/l4e 0
2026-03-10T06:22:25.629 INFO:tasks.workunit.client.1.vm06.stdout:1/679: stat d9/d35/d46/d38/d63/d83/l97 0
2026-03-10T06:22:25.630 INFO:tasks.workunit.client.1.vm06.stdout:1/680: write d9/d1b/f7c [179427,43431] 0
2026-03-10T06:22:25.630 INFO:tasks.workunit.client.1.vm06.stdout:1/681: write d9/f1f [8439097,126247] 0
2026-03-10T06:22:25.631 INFO:tasks.workunit.client.1.vm06.stdout:1/682: fsync d9/f58 0
2026-03-10T06:22:25.634 INFO:tasks.workunit.client.1.vm06.stdout:1/683: dwrite d9/d1b/d20/fa7 [0,4194304] 0
2026-03-10T06:22:25.637 INFO:tasks.workunit.client.1.vm06.stdout:1/684: creat d9/d1b/d20/d44/dbf/fc3 x:0 0 0
2026-03-10T06:22:25.648 INFO:tasks.workunit.client.1.vm06.stdout:1/685: getdents d9/d1b/d20/d44 0
2026-03-10T06:22:25.741 INFO:tasks.workunit.client.1.vm06.stdout:7/691: rmdir d19/d3b/d41/d42/d62/d80/da1 39
2026-03-10T06:22:25.837 INFO:tasks.workunit.client.1.vm06.stdout:8/554: creat d1/fb9 x:0 0 0
2026-03-10T06:22:25.839 INFO:tasks.workunit.client.1.vm06.stdout:4/667: creat dd/fc2 x:0 0 0
2026-03-10T06:22:25.848 INFO:tasks.workunit.client.1.vm06.stdout:4/668: mkdir dd/d24/d2d/d2f/d39/d71/dc3 0
2026-03-10T06:22:25.849 INFO:tasks.workunit.client.1.vm06.stdout:1/686: truncate d9/f1f 8397138 0
2026-03-10T06:22:25.849 INFO:tasks.workunit.client.1.vm06.stdout:1/687: write d9/d62/f8a [1037243,3109] 0
2026-03-10T06:22:25.850 INFO:tasks.workunit.client.1.vm06.stdout:0/653: rename d0/dd/lb2 to d0/dd/d14/d1d/lcf 0
2026-03-10T06:22:25.851 INFO:tasks.workunit.client.1.vm06.stdout:4/669: symlink dd/d24/d5e/d66/lc4 0
2026-03-10T06:22:25.851 INFO:tasks.workunit.client.1.vm06.stdout:0/654: truncate d0/dd/d14/d18/d66/fcb 664444 0
2026-03-10T06:22:25.856 INFO:tasks.workunit.client.1.vm06.stdout:1/688: symlink d9/d35/d46/db0/lc4 0
2026-03-10T06:22:25.858 INFO:tasks.workunit.client.1.vm06.stdout:5/494: rmdir d8/db/d57 39
2026-03-10T06:22:25.858 INFO:tasks.workunit.client.1.vm06.stdout:9/641: creat d21/fd7 x:0 0 0
2026-03-10T06:22:25.859 INFO:tasks.workunit.client.1.vm06.stdout:5/495: stat f7 0
2026-03-10T06:22:25.860 INFO:tasks.workunit.client.1.vm06.stdout:0/655: mkdir d0/dd/d14/d18/d7e/dd0 0
2026-03-10T06:22:25.864 INFO:tasks.workunit.client.1.vm06.stdout:9/642: creat d21/d32/fd8 x:0 0 0
2026-03-10T06:22:25.865 INFO:tasks.workunit.client.1.vm06.stdout:0/656: mkdir d0/dd/d14/d18/d85/dcc/d88/d47/dd1 0
2026-03-10T06:22:25.868 INFO:tasks.workunit.client.1.vm06.stdout:1/689: dwrite d9/d35/d46/d38/fab [0,4194304] 0
2026-03-10T06:22:25.869 INFO:tasks.workunit.client.1.vm06.stdout:1/690: write d9/d35/f57 [4912010,126698] 0
2026-03-10T06:22:25.869 INFO:tasks.workunit.client.1.vm06.stdout:1/691: readlink d9/d1b/l23 0
2026-03-10T06:22:25.875 INFO:tasks.workunit.client.1.vm06.stdout:0/657: symlink d0/dd/d1c/da2/ld2 0
2026-03-10T06:22:25.875 INFO:tasks.workunit.client.1.vm06.stdout:1/692: mkdir d9/d35/d46/d38/d63/d83/dc5 0
2026-03-10T06:22:25.880 INFO:tasks.workunit.client.1.vm06.stdout:9/643: dwrite f9 [0,4194304] 0
2026-03-10T06:22:25.881 INFO:tasks.workunit.client.1.vm06.stdout:1/693: read d9/d35/d89/f4d [4094923,77368] 0
2026-03-10T06:22:25.883 INFO:tasks.workunit.client.1.vm06.stdout:2/581: rename da/d13/d1c/c38 to da/cb8 0
2026-03-10T06:22:25.886 INFO:tasks.workunit.client.1.vm06.stdout:9/644: creat d21/d27/d50/d57/db2/d80/d95/d9b/fd9 x:0 0 0
2026-03-10T06:22:25.888 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:25 vm04.local ceph-mon[51058]: pgmap v11: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 33 MiB/s rd, 116 MiB/s wr, 237 op/s
2026-03-10T06:22:25.888 INFO:tasks.workunit.client.1.vm06.stdout:3/652: rename d6/d21/fb0 to d6/dc/d13/d9d/d54/fdf 0
2026-03-10T06:22:25.889 INFO:tasks.workunit.client.1.vm06.stdout:3/653: chown d6/d21/cad 76831 1
2026-03-10T06:22:25.889 INFO:tasks.workunit.client.1.vm06.stdout:0/658: getdents d0/dd/d14/d18/d85/dcc/d5e 0
2026-03-10T06:22:25.890 INFO:tasks.workunit.client.1.vm06.stdout:1/694: mkdir d9/d35/d46/d38/dc6 0
2026-03-10T06:22:25.895 INFO:tasks.workunit.client.1.vm06.stdout:6/681: rename d6/dd/d25/d2c/fd6 to d6/dd/d25/d33/d5a/dae/fe8 0
2026-03-10T06:22:25.901 INFO:tasks.workunit.client.1.vm06.stdout:3/654: truncate d6/dc/d13/f5d 1763464 0
2026-03-10T06:22:25.902 INFO:tasks.workunit.client.1.vm06.stdout:1/695: chown d9/d35/d46/lb4 2730 1
2026-03-10T06:22:25.904 INFO:tasks.workunit.client.1.vm06.stdout:6/682: dread d6/dd/d25/d33/d5a/fa1 [0,4194304] 0
2026-03-10T06:22:25.904 INFO:tasks.workunit.client.1.vm06.stdout:1/696: stat d9/d35/d46 0
2026-03-10T06:22:25.904 INFO:tasks.workunit.client.1.vm06.stdout:9/645: dwrite d21/d27/d50/d57/fa9 [0,4194304] 0
2026-03-10T06:22:25.917 INFO:tasks.workunit.client.1.vm06.stdout:3/655: creat d6/d21/d38/dd0/dd1/d90/fe0 x:0 0 0
2026-03-10T06:22:25.922 INFO:tasks.workunit.client.1.vm06.stdout:0/659: link d0/d3c/dc1/l93 d0/d3c/dc1/d3d/ld3 0
2026-03-10T06:22:25.942 INFO:tasks.workunit.client.1.vm06.stdout:3/656: creat d6/dc/d13/d9d/fe1 x:0 0 0
2026-03-10T06:22:25.942 INFO:tasks.workunit.client.1.vm06.stdout:0/660: mknod d0/dd/d14/d18/cd4 0
2026-03-10T06:22:25.942 INFO:tasks.workunit.client.1.vm06.stdout:6/683: creat d6/df/fe9 x:0 0 0
2026-03-10T06:22:25.942 INFO:tasks.workunit.client.1.vm06.stdout:6/684: readlink d6/dd/d25/d2c/l63 0
2026-03-10T06:22:25.942 INFO:tasks.workunit.client.1.vm06.stdout:0/661: mkdir d0/da3/dd5 0
2026-03-10T06:22:25.949 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:25 vm06.local ceph-mon[58974]: pgmap v11: 65 pgs: 65 active+clean; 1.0 GiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 33 MiB/s rd, 116 MiB/s wr, 237 op/s
2026-03-10T06:22:25.951 INFO:tasks.workunit.client.1.vm06.stdout:1/697: sync
2026-03-10T06:22:25.951 INFO:tasks.workunit.client.1.vm06.stdout:6/685: sync
2026-03-10T06:22:25.953 INFO:tasks.workunit.client.1.vm06.stdout:9/646: dread d21/d32/f52 [0,4194304] 0
2026-03-10T06:22:25.957 INFO:tasks.workunit.client.1.vm06.stdout:1/698: sync
2026-03-10T06:22:25.960 INFO:tasks.workunit.client.1.vm06.stdout:1/699: mkdir d9/d62/dc7 0
2026-03-10T06:22:25.965 INFO:tasks.workunit.client.1.vm06.stdout:9/647: rename d21/d32/d4d/d51/d67 to d21/da2/da7/d93/dda 0
2026-03-10T06:22:25.974 INFO:tasks.workunit.client.1.vm06.stdout:1/700: dwrite d9/d62/f90 [0,4194304] 0
2026-03-10T06:22:25.975 INFO:tasks.workunit.client.1.vm06.stdout:9/648: dwrite d21/f49 [4194304,4194304] 0
2026-03-10T06:22:25.979 INFO:tasks.workunit.client.1.vm06.stdout:1/701: readlink d9/d35/d46/d38/d63/d83/d93/l9e 0
2026-03-10T06:22:25.998 INFO:tasks.workunit.client.1.vm06.stdout:1/702: symlink d9/d35/d46/lc8 0
2026-03-10T06:22:25.999 INFO:tasks.workunit.client.1.vm06.stdout:1/703: write d9/d62/f99 [516825,48116] 0
2026-03-10T06:22:26.057 INFO:tasks.workunit.client.1.vm06.stdout:7/692: dwrite d19/d3b/d41/f65 [0,4194304] 0
2026-03-10T06:22:26.079 INFO:tasks.workunit.client.1.vm06.stdout:8/555: truncate d1/f26 77710 0
2026-03-10T06:22:26.085 INFO:tasks.workunit.client.1.vm06.stdout:8/556: dwrite d1/d7/f4f [0,4194304] 0
2026-03-10T06:22:26.120 INFO:tasks.workunit.client.1.vm06.stdout:8/557: dread d1/df/d20/d21/f69 [0,4194304] 0
2026-03-10T06:22:26.120 INFO:tasks.workunit.client.1.vm06.stdout:8/558: stat d1/df/d20/d21/l2d 0
2026-03-10T06:22:26.125 INFO:tasks.workunit.client.1.vm06.stdout:3/657: dread d6/dc/f94 [0,4194304] 0
2026-03-10T06:22:26.125 INFO:tasks.workunit.client.1.vm06.stdout:6/686: dread d6/df/d70/fa6 [0,4194304] 0
2026-03-10T06:22:26.129 INFO:tasks.workunit.client.1.vm06.stdout:8/559: getdents d1/df 0
2026-03-10T06:22:26.129 INFO:tasks.workunit.client.1.vm06.stdout:6/687: unlink d6/dd/d25/d33/d5a/d78/f89 0
2026-03-10T06:22:26.131 INFO:tasks.workunit.client.1.vm06.stdout:6/688: rmdir d6/d79/d95 39
2026-03-10T06:22:26.134 INFO:tasks.workunit.client.1.vm06.stdout:8/560: symlink d1/df/d20/d21/d5e/d79/lba 0
2026-03-10T06:22:26.136 INFO:tasks.workunit.client.1.vm06.stdout:6/689: mkdir d6/d79/d95/dea 0
2026-03-10T06:22:26.137 INFO:tasks.workunit.client.1.vm06.stdout:6/690: fdatasync d6/d7/fdd 0
2026-03-10T06:22:26.141 INFO:tasks.workunit.client.1.vm06.stdout:4/670: write dd/d24/d2d/f5a [600408,37870] 0
2026-03-10T06:22:26.142 INFO:tasks.workunit.client.1.vm06.stdout:5/496: write d8/db/d54/d8a/d74/f4c [3825589,86738] 0
2026-03-10T06:22:26.142 INFO:tasks.workunit.client.1.vm06.stdout:6/691: chown d6/dd/d25/d2c/f9c 42229237 1
2026-03-10T06:22:26.142 INFO:tasks.workunit.client.1.vm06.stdout:4/671: stat dd/d24/d2d/l7f 0
2026-03-10T06:22:26.147 INFO:tasks.workunit.client.1.vm06.stdout:5/497: mkdir d8/db/d54/d8a/d39/d9f 0
2026-03-10T06:22:26.148 INFO:tasks.workunit.client.1.vm06.stdout:6/692: fdatasync d6/dd/d35/f3c 0
2026-03-10T06:22:26.148 INFO:tasks.workunit.client.1.vm06.stdout:5/498: stat d8/db/d7e 0
2026-03-10T06:22:26.154 INFO:tasks.workunit.client.1.vm06.stdout:4/672: getdents dd/d24/d5e/d66/dbd 0
2026-03-10T06:22:26.158 INFO:tasks.workunit.client.1.vm06.stdout:4/673: write dd/d24/d2d/d2f/f98 [2844803,85176] 0
2026-03-10T06:22:26.158 INFO:tasks.workunit.client.1.vm06.stdout:6/693: truncate d6/dd/d25/d2c/f85 892674 0
2026-03-10T06:22:26.164 INFO:tasks.workunit.client.1.vm06.stdout:5/499: dwrite d8/db/d54/d8a/d74/f42 [0,4194304] 0
2026-03-10T06:22:26.168 INFO:tasks.workunit.client.1.vm06.stdout:4/674: rmdir dd/d24/d5e/d66/dbd 0
2026-03-10T06:22:26.168 INFO:tasks.workunit.client.1.vm06.stdout:4/675: chown dd/d24/f9e 6 1
2026-03-10T06:22:26.168 INFO:tasks.workunit.client.1.vm06.stdout:2/582: fdatasync da/d13/d1c/d7d/f81 0
2026-03-10T06:22:26.169 INFO:tasks.workunit.client.1.vm06.stdout:4/676: truncate dd/d24/d5d/f9d 1192838 0
2026-03-10T06:22:26.173 INFO:tasks.workunit.client.1.vm06.stdout:5/500: truncate d8/ff 560797 0
2026-03-10T06:22:26.216 INFO:tasks.workunit.client.1.vm06.stdout:2/583: dread da/d13/d1a/d39/d4b/f9d [0,4194304] 0
2026-03-10T06:22:26.229 INFO:tasks.workunit.client.1.vm06.stdout:2/584: sync
2026-03-10T06:22:26.231 INFO:tasks.workunit.client.1.vm06.stdout:2/585: fdatasync da/d13/d1c/d1d/d44/d46/fb4 0
2026-03-10T06:22:26.232 INFO:tasks.workunit.client.1.vm06.stdout:2/586: readlink da/d13/d1c/d1d/l8a 0
2026-03-10T06:22:26.236 INFO:tasks.workunit.client.1.vm06.stdout:2/587: rename da/d13/d1c/d43/d6e to da/d13/d1a/d39/d4b/daf/d56/db9 0
2026-03-10T06:22:26.237 INFO:tasks.workunit.client.1.vm06.stdout:0/662: write d0/d3c/dc1/d3d/f82 [1743007,14542] 0
2026-03-10T06:22:26.242 INFO:tasks.workunit.client.1.vm06.stdout:0/663: readlink d0/dd/d14/d1d/lcf 0
2026-03-10T06:22:26.244 INFO:tasks.workunit.client.1.vm06.stdout:0/664: stat d0/dd/l3b 0
2026-03-10T06:22:26.245 INFO:tasks.workunit.client.1.vm06.stdout:2/588: symlink da/d13/d1c/d1d/lba 0
2026-03-10T06:22:26.248 INFO:tasks.workunit.client.1.vm06.stdout:9/649: truncate d21/f3e 4868232 0
2026-03-10T06:22:26.254 INFO:tasks.workunit.client.1.vm06.stdout:1/704: truncate d9/d1b/f81 341906 0
2026-03-10T06:22:26.261 INFO:tasks.workunit.client.1.vm06.stdout:9/650: mkdir d21/ddb 0
2026-03-10T06:22:26.263 INFO:tasks.workunit.client.1.vm06.stdout:1/705: symlink d9/d1b/d20/d44/dbf/lc9 0
2026-03-10T06:22:26.266 INFO:tasks.workunit.client.1.vm06.stdout:2/589: link da/d13/d1a/d39/d35/l37 da/d13/d1a/d39/d4b/daf/d56/db9/d9b/lbb 0
2026-03-10T06:22:26.266 INFO:tasks.workunit.client.1.vm06.stdout:9/651: mknod d21/da2/da7/cdc 0
2026-03-10T06:22:26.266 INFO:tasks.workunit.client.1.vm06.stdout:7/693: write f10 [897434,106388] 0
2026-03-10T06:22:26.267 INFO:tasks.workunit.client.1.vm06.stdout:7/694: readlink d19/l32 0
2026-03-10T06:22:26.268 INFO:tasks.workunit.client.1.vm06.stdout:7/695: write d19/f3f [39860,66381] 0
2026-03-10T06:22:26.268 INFO:tasks.workunit.client.1.vm06.stdout:1/706: rmdir d9/d35/d46/d38/d8c 39
2026-03-10T06:22:26.274 INFO:tasks.workunit.client.1.vm06.stdout:9/652: symlink d21/da2/da7/d93/ldd 0
2026-03-10T06:22:26.276 INFO:tasks.workunit.client.1.vm06.stdout:1/707: dwrite d9/d1b/d20/d44/f85 [0,4194304] 0
2026-03-10T06:22:26.278 INFO:tasks.workunit.client.1.vm06.stdout:2/590: creat da/d13/d5e/fbc x:0 0 0
2026-03-10T06:22:26.290 INFO:tasks.workunit.client.1.vm06.stdout:8/561: write d1/d2c/f67 [158847,73230] 0
2026-03-10T06:22:26.290 INFO:tasks.workunit.client.1.vm06.stdout:9/653: write d21/d32/d4d/fb4 [723280,89570] 0
2026-03-10T06:22:26.292 INFO:tasks.workunit.client.1.vm06.stdout:9/654: chown d21/d32/d4d/dd2 2571 1
2026-03-10T06:22:26.292 INFO:tasks.workunit.client.1.vm06.stdout:7/696: creat d19/d3b/d41/da9/dbd/dd2/fe8 x:0 0 0
2026-03-10T06:22:26.293 INFO:tasks.workunit.client.1.vm06.stdout:1/708: symlink d9/lca 0
2026-03-10T06:22:26.296 INFO:tasks.workunit.client.1.vm06.stdout:3/658: dwrite d6/f1b [0,4194304] 0
2026-03-10T06:22:26.296 INFO:tasks.workunit.client.1.vm06.stdout:7/697: fdatasync d19/d3b/d41/d42/d52/d83/f8f 0
2026-03-10T06:22:26.301 INFO:tasks.workunit.client.1.vm06.stdout:7/698: fdatasync d19/d3b/d41/d42/d52/d9f/dc2/fd3 0
2026-03-10T06:22:26.301 INFO:tasks.workunit.client.1.vm06.stdout:7/699: chown l17 458032 1
2026-03-10T06:22:26.305 INFO:tasks.workunit.client.1.vm06.stdout:6/694: dread d6/dd/d25/d2c/f85 [0,4194304] 0
2026-03-10T06:22:26.305 INFO:tasks.workunit.client.1.vm06.stdout:2/591: creat da/d13/d1c/d1d/fbd x:0 0 0
2026-03-10T06:22:26.308 INFO:tasks.workunit.client.1.vm06.stdout:9/655: dwrite d21/d32/d4d/fb4 [0,4194304] 0
2026-03-10T06:22:26.308 INFO:tasks.workunit.client.1.vm06.stdout:3/659: dwrite d6/d21/dbc/fd9 [0,4194304] 0
2026-03-10T06:22:26.313 INFO:tasks.workunit.client.1.vm06.stdout:3/660: chown d6/d21/d38/dd0/dd1/d90/f9e 13274 1
2026-03-10T06:22:26.317 INFO:tasks.workunit.client.1.vm06.stdout:8/562: creat d1/df/d20/d21/d5e/fbb x:0 0 0
2026-03-10T06:22:26.319 INFO:tasks.workunit.client.1.vm06.stdout:4/677: write dd/d24/d2d/d2f/d34/d83/f87 [512861,123776] 0
2026-03-10T06:22:26.320 INFO:tasks.workunit.client.1.vm06.stdout:4/678: write dd/d24/d2d/d2f/d34/d40/f99 [946869,61852] 0
2026-03-10T06:22:26.320 INFO:tasks.workunit.client.1.vm06.stdout:5/501: write d8/db/d54/d8a/d74/f29 [3426324,114348] 0
2026-03-10T06:22:26.321 INFO:tasks.workunit.client.1.vm06.stdout:6/695: symlink d6/dd/d25/d33/d5a/d78/dd0/dc5/leb 0
2026-03-10T06:22:26.321 INFO:tasks.workunit.client.1.vm06.stdout:7/700: creat d19/d3b/d41/d42/d62/d80/da1/fe9 x:0 0 0
2026-03-10T06:22:26.324 INFO:tasks.workunit.client.1.vm06.stdout:4/679: dread - dd/f9f zero size
2026-03-10T06:22:26.325 INFO:tasks.workunit.client.1.vm06.stdout:6/696: chown d6/dd/d35/fc1 53967506 1
2026-03-10T06:22:26.328 INFO:tasks.workunit.client.1.vm06.stdout:2/592: chown da/d13/d1a/d39/d35/l37 72451350 1
2026-03-10T06:22:26.333 INFO:tasks.workunit.client.1.vm06.stdout:2/593: dwrite da/d13/d5e/fbc [0,4194304] 0
2026-03-10T06:22:26.339 INFO:tasks.workunit.client.1.vm06.stdout:0/665: write d0/dd/d14/d18/d85/dcc/f5c [374000,14388] 0
2026-03-10T06:22:26.344 INFO:tasks.workunit.client.1.vm06.stdout:9/656: getdents d21/d27/d50/d57/db2/d80/d95/d9b/dd0 0
2026-03-10T06:22:26.344 INFO:tasks.workunit.client.1.vm06.stdout:6/697: mkdir d6/dd/d25/d33/d5a/d78/dd0/dec 0
2026-03-10T06:22:26.344 INFO:tasks.workunit.client.1.vm06.stdout:4/680: unlink dd/d33/d36/cb9 0
2026-03-10T06:22:26.344 INFO:tasks.workunit.client.1.vm06.stdout:5/502: dwrite d8/db/f45 [0,4194304] 0
2026-03-10T06:22:26.346 INFO:tasks.workunit.client.1.vm06.stdout:2/594: dwrite da/d13/d5e/f9a [0,4194304] 0
2026-03-10T06:22:26.348 INFO:tasks.workunit.client.1.vm06.stdout:4/681: dwrite dd/d24/d5e/f67 [0,4194304] 0
2026-03-10T06:22:26.368 INFO:tasks.workunit.client.1.vm06.stdout:0/666: rmdir d0/dd/d14/d18 39
2026-03-10T06:22:26.368 INFO:tasks.workunit.client.1.vm06.stdout:1/709: getdents d9/d35/d46 0
2026-03-10T06:22:26.368 INFO:tasks.workunit.client.1.vm06.stdout:9/657: unlink d21/d32/d6e/l3b 0
2026-03-10T06:22:26.369 INFO:tasks.workunit.client.1.vm06.stdout:1/710: chown d9/d1b/d20 15 1
2026-03-10T06:22:26.369 INFO:tasks.workunit.client.1.vm06.stdout:0/667: chown d0/dd/c69 0 1
2026-03-10T06:22:26.373 INFO:tasks.workunit.client.1.vm06.stdout:6/698: creat d6/df/d70/fed x:0 0 0
2026-03-10T06:22:26.378 INFO:tasks.workunit.client.1.vm06.stdout:4/682: creat dd/d33/d36/fc5 x:0 0 0
2026-03-10T06:22:26.379 INFO:tasks.workunit.client.1.vm06.stdout:9/658: creat d21/d32/d4d/d51/dcb/fde x:0 0 0
2026-03-10T06:22:26.382 INFO:tasks.workunit.client.1.vm06.stdout:1/711: truncate d9/d1b/d20/d44/fa2 4388160 0
2026-03-10T06:22:26.383 INFO:tasks.workunit.client.1.vm06.stdout:2/595: mknod da/d13/d1c/d1d/cbe 0
2026-03-10T06:22:26.383 INFO:tasks.workunit.client.1.vm06.stdout:9/659: dread - d21/d27/d50/d57/db2/d80/d95/d9b/fd9 zero size
2026-03-10T06:22:26.386 INFO:tasks.workunit.client.1.vm06.stdout:5/503: creat d8/db/fa0 x:0 0 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:6/699: mkdir d6/df/d70/daa/dee 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:6/700: fdatasync d6/d79/d95/db4/fbd 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:0/668: symlink d0/dd/d14/d18/d85/dcc/d88/d35/d74/ld6 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:2/596: rmdir da/d13/d1a/d39/d4b/d86 39
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:9/660: mknod d21/d27/d3a/cdf 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:0/669: truncate d0/dd/d14/d18/f2c 1277573 0
2026-03-10T06:22:26.397 INFO:tasks.workunit.client.1.vm06.stdout:2/597: dread da/d13/d1c/f42 [0,4194304] 0
2026-03-10T06:22:26.398 INFO:tasks.workunit.client.1.vm06.stdout:2/598: chown da/d13/d1a/d39/d35/da1 0 1
2026-03-10T06:22:26.398 INFO:tasks.workunit.client.1.vm06.stdout:1/712: creat d9/d1b/fcb x:0 0 0
2026-03-10T06:22:26.400 INFO:tasks.workunit.client.1.vm06.stdout:9/661: creat d21/d46/fe0 x:0 0 0
2026-03-10T06:22:26.401 INFO:tasks.workunit.client.1.vm06.stdout:9/662: write d21/da2/da7/d93/dda/fb6 [459610,31670] 0
2026-03-10T06:22:26.401 INFO:tasks.workunit.client.1.vm06.stdout:9/663: chown d21/d32/d4d/cd6 31 1
2026-03-10T06:22:26.411 INFO:tasks.workunit.client.1.vm06.stdout:0/670: creat d0/da3/dd5/fd7 x:0 0 0
2026-03-10T06:22:26.412 INFO:tasks.workunit.client.1.vm06.stdout:6/701: getdents d6/dd/d35 0
2026-03-10T06:22:26.414 INFO:tasks.workunit.client.1.vm06.stdout:5/504: link d8/db/d54/d8a/d74/f17 d8/db/d54/d8a/d39/fa1 0
2026-03-10T06:22:26.416 INFO:tasks.workunit.client.1.vm06.stdout:6/702: chown d6/dd/d25/d4e/c75 482865304 1
2026-03-10T06:22:26.428 INFO:tasks.workunit.client.1.vm06.stdout:0/671: mkdir d0/dd/d14/d1d/d5d/dca/dd8 0
2026-03-10T06:22:26.430 INFO:tasks.workunit.client.1.vm06.stdout:3/661: dread d6/dc/d13/d35/f95 [0,4194304] 0
2026-03-10T06:22:26.430 INFO:tasks.workunit.client.1.vm06.stdout:7/701: dread d19/d3b/d41/da9/da5/fa6 [0,4194304] 0
2026-03-10T06:22:26.431 INFO:tasks.workunit.client.1.vm06.stdout:0/672: truncate d0/d3c/dc1/f72 1933197 0
2026-03-10T06:22:26.432 INFO:tasks.workunit.client.1.vm06.stdout:7/702: write d19/d3b/d5b/fa4 [2910714,77795] 0
2026-03-10T06:22:26.435 INFO:tasks.workunit.client.1.vm06.stdout:3/662: symlink d6/d21/d38/dd0/dd1/d90/le2 0
2026-03-10T06:22:26.437 INFO:tasks.workunit.client.1.vm06.stdout:3/663: write d6/d8/f52 [52146,91560] 0
2026-03-10T06:22:26.438 INFO:tasks.workunit.client.1.vm06.stdout:0/673: read d0/dd/f48 [3185249,80810] 0
2026-03-10T06:22:26.438 INFO:tasks.workunit.client.1.vm06.stdout:1/713: sync
2026-03-10T06:22:26.440 INFO:tasks.workunit.client.1.vm06.stdout:6/703: sync
2026-03-10T06:22:26.445 INFO:tasks.workunit.client.1.vm06.stdout:6/704: mknod d6/dd/cef 0
2026-03-10T06:22:26.446 INFO:tasks.workunit.client.1.vm06.stdout:0/674: mknod d0/d3c/dc1/d7d/cd9 0
2026-03-10T06:22:26.446 INFO:tasks.workunit.client.1.vm06.stdout:9/664: dread d21/f2a [0,4194304] 0
2026-03-10T06:22:26.447 INFO:tasks.workunit.client.1.vm06.stdout:3/664: rename d6/d21/d38/l3e to d6/d1a/d5b/le3 0
2026-03-10T06:22:26.448 INFO:tasks.workunit.client.1.vm06.stdout:9/665: write d21/d32/d4d/f9d [2421944,31058] 0
2026-03-10T06:22:26.456 INFO:tasks.workunit.client.1.vm06.stdout:9/666: dread d21/da2/da7/d93/dda/f6a [0,4194304] 0
2026-03-10T06:22:26.457 INFO:tasks.workunit.client.1.vm06.stdout:7/703: dread d19/d3b/d41/d42/d52/d83/f94 [0,4194304] 0
2026-03-10T06:22:26.463 INFO:tasks.workunit.client.1.vm06.stdout:8/563: dread d1/d2c/f67 [0,4194304] 0
2026-03-10T06:22:26.463 INFO:tasks.workunit.client.1.vm06.stdout:7/704: write d19/d3b/dde/fdc [1082850,109000] 0
2026-03-10T06:22:26.464 INFO:tasks.workunit.client.1.vm06.stdout:7/705: write d19/f3f [3953562,113092] 0
2026-03-10T06:22:26.476 INFO:tasks.workunit.client.1.vm06.stdout:6/705: rename d6/df/d70/daa/cb3 to d6/dd/dc7/cf0 0
2026-03-10T06:22:26.477 INFO:tasks.workunit.client.1.vm06.stdout:1/714: dread d9/d35/d46/f7a [0,4194304] 0
2026-03-10T06:22:26.477 INFO:tasks.workunit.client.1.vm06.stdout:4/683: write dd/d24/d2d/d2f/d39/f61 [1386895,15673] 0
2026-03-10T06:22:26.478 INFO:tasks.workunit.client.1.vm06.stdout:6/706: read d6/d79/fc6 [2799543,101418] 0
2026-03-10T06:22:26.493 INFO:tasks.workunit.client.1.vm06.stdout:7/706: chown d19/l71 3 1
2026-03-10T06:22:26.493 INFO:tasks.workunit.client.1.vm06.stdout:2/599: write da/d13/d1c/d1d/d44/d53/f65 [423307,88327] 0
2026-03-10T06:22:26.494 INFO:tasks.workunit.client.1.vm06.stdout:0/675: rename d0/dd/d14/d18/d85/dcc/d88/d35/d74/ld6 to d0/dd/d14/d18/d85/dcc/d88/d47/lda 0
2026-03-10T06:22:26.495 INFO:tasks.workunit.client.1.vm06.stdout:9/667: symlink d21/ddb/le1 0
2026-03-10T06:22:26.496 INFO:tasks.workunit.client.1.vm06.stdout:1/715: mknod d9/d1b/d20/d44/dbf/ccc 0
2026-03-10T06:22:26.500 INFO:tasks.workunit.client.1.vm06.stdout:4/684: mkdir dd/d24/d5e/d66/dc6 0
2026-03-10T06:22:26.500 INFO:tasks.workunit.client.1.vm06.stdout:5/505: write d8/f49 [641421,72917] 0
2026-03-10T06:22:26.501 INFO:tasks.workunit.client.1.vm06.stdout:1/716: stat d9/d62/f76 0
2026-03-10T06:22:26.502 INFO:tasks.workunit.client.1.vm06.stdout:5/506: stat d8/db/d54/d8a/d39/f3d 0
2026-03-10T06:22:26.502 INFO:tasks.workunit.client.1.vm06.stdout:0/676: creat d0/d3c/dc1/d3d/d50/d91/da7/fdb x:0 0 0
2026-03-10T06:22:26.507 INFO:tasks.workunit.client.1.vm06.stdout:4/685: creat dd/d33/d47/fc7 x:0 0 0
2026-03-10T06:22:26.508 INFO:tasks.workunit.client.1.vm06.stdout:1/717: rmdir d9/d1b/d20 39
2026-03-10T06:22:26.509 INFO:tasks.workunit.client.1.vm06.stdout:9/668: dwrite d21/d27/d50/d57/db2/d80/d95/d9b/fcc [0,4194304] 0
2026-03-10T06:22:26.509 INFO:tasks.workunit.client.1.vm06.stdout:4/686: unlink dd/d24/d5d/c6c 0
2026-03-10T06:22:26.519 INFO:tasks.workunit.client.1.vm06.stdout:8/564: dread d1/f1c [0,4194304] 0
2026-03-10T06:22:26.520 INFO:tasks.workunit.client.1.vm06.stdout:5/507: unlink d8/db/d54/d8a/d74/f36 0
2026-03-10T06:22:26.525 INFO:tasks.workunit.client.1.vm06.stdout:0/677: truncate d0/dd/d1c/f89 114425 0
2026-03-10T06:22:26.527 INFO:tasks.workunit.client.1.vm06.stdout:1/718: truncate d9/d35/f7e 676735 0
2026-03-10T06:22:26.527 INFO:tasks.workunit.client.1.vm06.stdout:9/669: creat d21/da2/da7/fe2 x:0 0 0
2026-03-10T06:22:26.527 INFO:tasks.workunit.client.1.vm06.stdout:3/665: dwrite d6/f53 [0,4194304] 0
2026-03-10T06:22:26.534 INFO:tasks.workunit.client.1.vm06.stdout:8/565: rename d1/df/d20/d21/f38 to d1/d3b/db3/fbc 0
2026-03-10T06:22:26.536 INFO:tasks.workunit.client.1.vm06.stdout:5/508: mkdir d8/db/d54/d67/d46/d6e/da2 0
2026-03-10T06:22:26.542 INFO:tasks.workunit.client.1.vm06.stdout:7/707: dread d19/f25 [0,4194304] 0
2026-03-10T06:22:26.543 INFO:tasks.workunit.client.1.vm06.stdout:4/687: mknod dd/d24/d2d/cc8 0
2026-03-10T06:22:26.547 INFO:tasks.workunit.client.1.vm06.stdout:6/707: truncate d6/df/d70/fa6 1752678 0
2026-03-10T06:22:26.548 INFO:tasks.workunit.client.1.vm06.stdout:0/678: mkdir d0/da3/dd5/ddc 0
2026-03-10T06:22:26.549 INFO:tasks.workunit.client.1.vm06.stdout:3/666: mknod d6/dc/d13/d51/ce4 0
2026-03-10T06:22:26.549 INFO:tasks.workunit.client.1.vm06.stdout:5/509: creat d8/db/d54/d55/fa3 x:0 0 0
2026-03-10T06:22:26.550 INFO:tasks.workunit.client.1.vm06.stdout:3/667: read - d6/d21/d38/dd0/dd1/d90/dc7/fcb zero size
2026-03-10T06:22:26.566 INFO:tasks.workunit.client.1.vm06.stdout:5/510: fsync d8/db/d54/d8a/d74/f5a 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:7/708: symlink d19/d3b/d41/d42/d52/d83/d9d/lea 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:6/708: mkdir d6/dd/d25/d33/d5a/df1 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:5/511: write d8/db/d54/d55/f87 [1536432,13188] 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:8/566: mknod d1/d2c/d99/cbd 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:0/679: chown d0/dd/d14/d18/d85/dcc/d88/lb8 49010118 1
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:6/709: dwrite d6/d7/d37/d43/f77 [0,4194304] 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:0/680: chown d0/dd/d14/d18/d66/fcb 288815 1
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:4/688: mkdir dd/d41/da9/dc9 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:0/681: write d0/f9 [2039186,90033] 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:3/668: mkdir d6/dc/de5 0
2026-03-10T06:22:26.567 INFO:tasks.workunit.client.1.vm06.stdout:6/710: mknod d6/dd/d25/d33/d4d/cf2 0
2026-03-10T06:22:26.568 INFO:tasks.workunit.client.1.vm06.stdout:7/709: mknod d19/d3b/d41/d42/d52/d83/d9d/da8/dd6/ceb 0
2026-03-10T06:22:26.568 INFO:tasks.workunit.client.1.vm06.stdout:0/682: chown d0/dd/d14/d18/d85/dcc/d5e/l6a 2 1
2026-03-10T06:22:26.568 INFO:tasks.workunit.client.1.vm06.stdout:4/689: fdatasync dd/d24/d2d/d2f/d34/d40/f6b 0
2026-03-10T06:22:26.571 INFO:tasks.workunit.client.1.vm06.stdout:3/669: dread d6/d21/d38/dd0/dd1/f4c [0,4194304] 0
2026-03-10T06:22:26.571 INFO:tasks.workunit.client.1.vm06.stdout:5/512: link d8/db/d54/d8a/d39/d72/f9a d8/db/d54/d67/d46/fa4 0
2026-03-10T06:22:26.573 INFO:tasks.workunit.client.1.vm06.stdout:6/711: mkdir d6/dd/d25/d33/d5a/dcc/df3 0
2026-03-10T06:22:26.576 INFO:tasks.workunit.client.1.vm06.stdout:0/683: unlink d0/dd/ccd 0
2026-03-10T06:22:26.577 INFO:tasks.workunit.client.1.vm06.stdout:3/670: dwrite d6/dc/d13/f8b [0,4194304] 0
2026-03-10T06:22:26.581 INFO:tasks.workunit.client.1.vm06.stdout:7/710: read f10 [1297417,85280] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:3/671: write d6/dc/d41/fb7 [437873,112288] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:7/711: write d19/d3b/dde/fe5 [441250,62185] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:6/712: rename d6/dd/d25/d2c/c61 to d6/dd/d25/d33/d5a/d78/cf4 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/670: dread d21/d32/fcf [0,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:0/684: creat d0/dd/d14/d18/d85/dcc/d99/fdd x:0 0 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/671: creat d21/d32/d4d/d51/db0/fe3 x:0 0 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:5/513: link d8/db/d54/d55/f61 d8/fa5 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/672: mkdir d21/d27/d50/d57/dcd/de4 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/690: dread dd/d33/f37 [0,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/691: mkdir dd/d24/d5e/d66/dc6/dca 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/692: mknod dd/d24/d5e/d66/dc6/ccb 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/673: dread d21/d32/d6e/f2e [4194304,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/693: creat dd/d24/d2d/d2f/d34/d83/fcc x:0 0 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/694: dwrite dd/d24/d5e/f67 [0,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/674: dread d21/d32/d4d/fb4 [0,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:9/675: write d21/da2/da7/fbc [885059,109508] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/695: mkdir dd/d72/dcd 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/696: creat dd/d33/d47/d97/db6/dbb/fce x:0 0 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:2/600: dread da/d13/d1c/f2d [0,4194304] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/697: chown dd/d33/d47/f88 109 1
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:2/601: write da/d13/d1a/d39/d4b/daf/d56/db9/fb6 [800926,120522] 0
2026-03-10T06:22:26.620 INFO:tasks.workunit.client.1.vm06.stdout:4/698: chown dd/d33/d47/d97/f9a 2628 1
2026-03-10T06:22:26.621 INFO:tasks.workunit.client.1.vm06.stdout:4/699: write dd/f5c [1098537,50236] 0
2026-03-10T06:22:26.625 INFO:tasks.workunit.client.1.vm06.stdout:4/700: unlink dd/d24/d2d/cc8 0
2026-03-10T06:22:26.626 INFO:tasks.workunit.client.1.vm06.stdout:4/701: stat dd/c10 0
2026-03-10T06:22:26.627 INFO:tasks.workunit.client.1.vm06.stdout:2/602: link da/d13/d1c/d1d/d44/d53/d61/c66 da/d13/d1c/d1d/d44/d46/cbf 0
2026-03-10T06:22:26.629 INFO:tasks.workunit.client.1.vm06.stdout:2/603: chown da/d13/d1c/d1d/d44/d53/c93 140434 1
2026-03-10T06:22:26.630 INFO:tasks.workunit.client.1.vm06.stdout:4/702: truncate dd/d24/d5e/f6a 1221563 0
2026-03-10T06:22:26.631 INFO:tasks.workunit.client.1.vm06.stdout:2/604: write da/d13/d1c/d43/f79 [1003367,11496] 0
2026-03-10T06:22:26.633 INFO:tasks.workunit.client.1.vm06.stdout:4/703: symlink dd/d24/d2d/d2f/d34/d40/lcf 0
2026-03-10T06:22:26.634 INFO:tasks.workunit.client.1.vm06.stdout:4/704: chown dd/d24/c4c 0 1
2026-03-10T06:22:26.639 INFO:tasks.workunit.client.1.vm06.stdout:4/705: dwrite dd/f5c [0,4194304] 0
2026-03-10T06:22:26.640 INFO:tasks.workunit.client.1.vm06.stdout:4/706: chown dd/d24/d2d/d2f/d34/d83/cab 12652 1
2026-03-10T06:22:26.641 INFO:tasks.workunit.client.1.vm06.stdout:4/707: dread - dd/d18/d75/f91 zero size
2026-03-10T06:22:26.641 INFO:tasks.workunit.client.1.vm06.stdout:4/708: chown dd/d18/d8e/cb7 7294 1
2026-03-10T06:22:26.644 INFO:tasks.workunit.client.1.vm06.stdout:4/709: mkdir dd/d24/d2d/d2f/d39/d71/dc3/dd0 0
2026-03-10T06:22:26.667 INFO:tasks.workunit.client.1.vm06.stdout:4/710: chown c4 888585914 1
2026-03-10T06:22:26.667 INFO:tasks.workunit.client.1.vm06.stdout:2/605: dread da/d13/d1c/f7e [0,4194304] 0
2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:4/711: write dd/d24/fa8 [23927,91573] 0
2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:4/712: rename dd/d24/fa8 to dd/d24/d5d/fd1 0
2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/606: getdents da/d13/d1a/d39/d4b/daf/d56/db7 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/607: unlink da/d13/d1c/d1d/d44/d53/l7b 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/608: stat da/d13/d5e/c8c 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/609: fdatasync da/d13/d1a/d39/d35/f74 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/610: symlink da/d13/d5e/lc0 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/611: symlink da/d13/d1c/d1d/d44/d53/lc1 0 2026-03-10T06:22:26.668 INFO:tasks.workunit.client.1.vm06.stdout:2/612: write da/d13/d1a/d39/f2f [1548675,53402] 0 2026-03-10T06:22:26.690 INFO:tasks.workunit.client.1.vm06.stdout:1/719: sync 2026-03-10T06:22:26.691 INFO:tasks.workunit.client.1.vm06.stdout:8/567: sync 2026-03-10T06:22:26.694 INFO:tasks.workunit.client.1.vm06.stdout:8/568: dwrite d1/f4 [4194304,4194304] 0 2026-03-10T06:22:26.697 INFO:tasks.workunit.client.1.vm06.stdout:8/569: chown d1/df/d11/f47 661822 1 2026-03-10T06:22:26.701 INFO:tasks.workunit.client.1.vm06.stdout:4/713: dread dd/d33/f3f [0,4194304] 0 2026-03-10T06:22:26.722 INFO:tasks.workunit.client.1.vm06.stdout:8/570: dread d1/df/d11/f12 [0,4194304] 0 2026-03-10T06:22:26.723 INFO:tasks.workunit.client.1.vm06.stdout:1/720: chown d9/d1b/d20/cb7 5176 1 2026-03-10T06:22:26.732 INFO:tasks.workunit.client.1.vm06.stdout:4/714: mkdir dd/d24/d2d/d2f/d34/d40/dd2 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:8/571: rename d1/d3b/c5a to d1/df/d20/d21/cbe 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:4/715: getdents dd/d24/d2d/d2f 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:4/716: chown dd/d18/fb4 2 1 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:4/717: read dd/f12 [272197,110092] 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:4/718: 
stat dd/d18/l1c 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:8/572: mkdir d1/df/d20/d35/dac/dbf 0 2026-03-10T06:22:26.738 INFO:tasks.workunit.client.1.vm06.stdout:4/719: chown dd/d24/d5e/db0/cbf 829 1 2026-03-10T06:22:26.739 INFO:tasks.workunit.client.1.vm06.stdout:4/720: chown dd/d18/d75/f91 1 1 2026-03-10T06:22:26.741 INFO:tasks.workunit.client.1.vm06.stdout:8/573: mkdir d1/d2c/d99/dc0 0 2026-03-10T06:22:26.745 INFO:tasks.workunit.client.1.vm06.stdout:8/574: rename d1/df/d20/d21/d5e/d79/lba to d1/df/d58/db5/lc1 0 2026-03-10T06:22:26.746 INFO:tasks.workunit.client.1.vm06.stdout:1/721: dread d9/d35/d46/d38/d8c/fc1 [0,4194304] 0 2026-03-10T06:22:26.746 INFO:tasks.workunit.client.1.vm06.stdout:4/721: truncate f0 2372184 0 2026-03-10T06:22:26.746 INFO:tasks.workunit.client.1.vm06.stdout:4/722: chown dd/d41/da9 88 1 2026-03-10T06:22:26.748 INFO:tasks.workunit.client.1.vm06.stdout:1/722: chown d9/d35/d46/d38/d63/d83/fa1 284561 1 2026-03-10T06:22:26.748 INFO:tasks.workunit.client.1.vm06.stdout:1/723: stat d9/d1b/c77 0 2026-03-10T06:22:26.748 INFO:tasks.workunit.client.1.vm06.stdout:1/724: chown d9/d1b/l23 21891 1 2026-03-10T06:22:26.750 INFO:tasks.workunit.client.1.vm06.stdout:4/723: creat dd/d24/d5e/d66/fd3 x:0 0 0 2026-03-10T06:22:26.750 INFO:tasks.workunit.client.1.vm06.stdout:1/725: dread d9/d35/d46/d38/d63/d83/fb2 [0,4194304] 0 2026-03-10T06:22:26.751 INFO:tasks.workunit.client.1.vm06.stdout:4/724: mknod dd/d24/d5d/cd4 0 2026-03-10T06:22:26.752 INFO:tasks.workunit.client.1.vm06.stdout:4/725: write f2 [5861543,121103] 0 2026-03-10T06:22:26.755 INFO:tasks.workunit.client.1.vm06.stdout:4/726: rename dd/d24/d2d/d2f/d39/d71/f90 to dd/d24/d5e/d66/dc6/dca/fd5 0 2026-03-10T06:22:26.766 INFO:tasks.workunit.client.1.vm06.stdout:1/726: symlink d9/d35/d46/d38/dc6/lcd 0 2026-03-10T06:22:26.766 INFO:tasks.workunit.client.1.vm06.stdout:1/727: stat d9/d62/dc7 0 2026-03-10T06:22:26.766 INFO:tasks.workunit.client.1.vm06.stdout:1/728: dwrite d9/d1b/d20/f24 
[0,4194304] 0 2026-03-10T06:22:26.766 INFO:tasks.workunit.client.1.vm06.stdout:1/729: chown d9/f2f 0 1 2026-03-10T06:22:26.774 INFO:tasks.workunit.client.1.vm06.stdout:2/613: fsync da/d13/d1c/d43/f79 0 2026-03-10T06:22:26.832 INFO:tasks.workunit.client.1.vm06.stdout:7/712: write d19/d3b/d41/d4c/f4e [2727333,72720] 0 2026-03-10T06:22:26.836 INFO:tasks.workunit.client.1.vm06.stdout:9/676: rmdir d21/d32/d4d 39 2026-03-10T06:22:26.837 INFO:tasks.workunit.client.1.vm06.stdout:3/672: truncate d6/d21/d38/dd0/dd1/f89 1106979 0 2026-03-10T06:22:26.838 INFO:tasks.workunit.client.1.vm06.stdout:6/713: truncate d6/df/d70/f90 2078102 0 2026-03-10T06:22:26.843 INFO:tasks.workunit.client.1.vm06.stdout:7/713: read d19/d3b/d41/f54 [1028548,82427] 0 2026-03-10T06:22:26.844 INFO:tasks.workunit.client.1.vm06.stdout:9/677: dwrite d21/f33 [0,4194304] 0 2026-03-10T06:22:26.846 INFO:tasks.workunit.client.1.vm06.stdout:0/685: dwrite d0/d3c/dc1/f4e [0,4194304] 0 2026-03-10T06:22:26.846 INFO:tasks.workunit.client.1.vm06.stdout:7/714: truncate d19/d3b/d41/d42/d62/f7c 254289 0 2026-03-10T06:22:26.851 INFO:tasks.workunit.client.1.vm06.stdout:6/714: rename d6/dd/d25/d33/d5a/dcc to d6/d79/d95/db4/dd4/df5 0 2026-03-10T06:22:26.853 INFO:tasks.workunit.client.1.vm06.stdout:5/514: dwrite d8/db/d54/d8a/d74/f37 [0,4194304] 0 2026-03-10T06:22:26.858 INFO:tasks.workunit.client.1.vm06.stdout:3/673: link d6/d8/f97 d6/d8/d7f/fe6 0 2026-03-10T06:22:26.861 INFO:tasks.workunit.client.1.vm06.stdout:3/674: write d6/d1a/fb9 [1274915,79049] 0 2026-03-10T06:22:26.863 INFO:tasks.workunit.client.1.vm06.stdout:7/715: chown d19/db0/ld0 565148 1 2026-03-10T06:22:26.864 INFO:tasks.workunit.client.1.vm06.stdout:5/515: dwrite d8/db/d54/d8a/d74/f29 [0,4194304] 0 2026-03-10T06:22:26.864 INFO:tasks.workunit.client.1.vm06.stdout:7/716: fsync f13 0 2026-03-10T06:22:26.868 INFO:tasks.workunit.client.1.vm06.stdout:0/686: mkdir d0/dd/d14/d18/d85/dcc/d99/dde 0 2026-03-10T06:22:26.869 INFO:tasks.workunit.client.1.vm06.stdout:6/715: 
creat d6/dd/d25/d33/d5a/dae/ff6 x:0 0 0 2026-03-10T06:22:26.869 INFO:tasks.workunit.client.1.vm06.stdout:0/687: fdatasync d0/dd/d14/d18/d85/dcc/d88/fa0 0 2026-03-10T06:22:26.876 INFO:tasks.workunit.client.1.vm06.stdout:9/678: truncate d21/d32/d4d/f64 914238 0 2026-03-10T06:22:26.876 INFO:tasks.workunit.client.1.vm06.stdout:6/716: truncate d6/dd/d35/f97 363809 0 2026-03-10T06:22:26.888 INFO:tasks.workunit.client.1.vm06.stdout:6/717: mknod d6/dd/d25/d33/d5a/dd8/cf7 0 2026-03-10T06:22:26.888 INFO:tasks.workunit.client.1.vm06.stdout:9/679: creat d21/d32/d6e/fe5 x:0 0 0 2026-03-10T06:22:26.888 INFO:tasks.workunit.client.1.vm06.stdout:0/688: mknod d0/da3/dd5/ddc/cdf 0 2026-03-10T06:22:26.888 INFO:tasks.workunit.client.1.vm06.stdout:3/675: link d6/dc/d13/d9d/fbb d6/d1a/d5b/fe7 0 2026-03-10T06:22:26.890 INFO:tasks.workunit.client.1.vm06.stdout:7/717: rename d19/fc0 to d19/fec 0 2026-03-10T06:22:26.891 INFO:tasks.workunit.client.1.vm06.stdout:6/718: fdatasync d6/dd/d35/f3c 0 2026-03-10T06:22:26.899 INFO:tasks.workunit.client.1.vm06.stdout:7/718: write d19/d3b/d41/d42/d52/d9f/fc6 [352218,16244] 0 2026-03-10T06:22:26.899 INFO:tasks.workunit.client.1.vm06.stdout:7/719: write d19/d3b/d41/d42/d52/d9f/dc2/fd3 [476852,32999] 0 2026-03-10T06:22:26.899 INFO:tasks.workunit.client.1.vm06.stdout:6/719: unlink d6/dd/d35/fc1 0 2026-03-10T06:22:26.900 INFO:tasks.workunit.client.1.vm06.stdout:6/720: chown d6/dd/d25/f69 845268 1 2026-03-10T06:22:26.903 INFO:tasks.workunit.client.1.vm06.stdout:4/727: getdents dd/d24/d5e/d66 0 2026-03-10T06:22:26.906 INFO:tasks.workunit.client.1.vm06.stdout:7/720: link d19/d3b/d41/d42/d52/d83/d9d/cca d19/d3b/d41/d4c/ced 0 2026-03-10T06:22:26.906 INFO:tasks.workunit.client.1.vm06.stdout:2/614: write da/d13/d1a/d39/d35/f4a [441921,25209] 0 2026-03-10T06:22:26.908 INFO:tasks.workunit.client.1.vm06.stdout:5/516: sync 2026-03-10T06:22:26.909 INFO:tasks.workunit.client.1.vm06.stdout:5/517: fdatasync d8/d9/f11 0 2026-03-10T06:22:26.912 
INFO:tasks.workunit.client.1.vm06.stdout:4/728: dread dd/d24/d5d/fd1 [0,4194304] 0 2026-03-10T06:22:26.912 INFO:tasks.workunit.client.1.vm06.stdout:8/575: dwrite d1/f75 [0,4194304] 0 2026-03-10T06:22:26.913 INFO:tasks.workunit.client.1.vm06.stdout:2/615: dread - da/d13/d1c/d43/f91 zero size 2026-03-10T06:22:26.916 INFO:tasks.workunit.client.1.vm06.stdout:1/730: dwrite d9/d35/d46/d38/d8c/f9a [0,4194304] 0 2026-03-10T06:22:26.919 INFO:tasks.workunit.client.1.vm06.stdout:7/721: creat d19/d3b/fee x:0 0 0 2026-03-10T06:22:26.922 INFO:tasks.workunit.client.1.vm06.stdout:1/731: rmdir d9/d1b/d20 39 2026-03-10T06:22:26.924 INFO:tasks.workunit.client.1.vm06.stdout:2/616: truncate da/d13/d1c/d1d/f55 657527 0 2026-03-10T06:22:26.924 INFO:tasks.workunit.client.1.vm06.stdout:0/689: sync 2026-03-10T06:22:26.927 INFO:tasks.workunit.client.1.vm06.stdout:4/729: dwrite dd/d18/fac [0,4194304] 0 2026-03-10T06:22:26.927 INFO:tasks.workunit.client.1.vm06.stdout:0/690: rmdir d0/dd/d14/d18/d85/dcc/dab 39 2026-03-10T06:22:26.927 INFO:tasks.workunit.client.1.vm06.stdout:1/732: mknod d9/cce 0 2026-03-10T06:22:26.934 INFO:tasks.workunit.client.1.vm06.stdout:0/691: write d0/dd/d14/d18/d85/dcc/d88/d98/faf [567272,53957] 0 2026-03-10T06:22:26.938 INFO:tasks.workunit.client.1.vm06.stdout:2/617: creat da/d13/d1a/d39/d4b/daf/d56/db9/fc2 x:0 0 0 2026-03-10T06:22:26.943 INFO:tasks.workunit.client.1.vm06.stdout:2/618: unlink da/d13/d1a/d39/d4b/daf/d56/db9/f71 0 2026-03-10T06:22:26.944 INFO:tasks.workunit.client.1.vm06.stdout:6/721: dread d6/d7/f87 [0,4194304] 0 2026-03-10T06:22:26.946 INFO:tasks.workunit.client.1.vm06.stdout:4/730: rmdir dd/d41/da9/dc9 0 2026-03-10T06:22:26.948 INFO:tasks.workunit.client.1.vm06.stdout:4/731: readlink dd/l85 0 2026-03-10T06:22:26.948 INFO:tasks.workunit.client.1.vm06.stdout:0/692: dread d0/dd/d14/d18/d85/dcc/fc8 [0,4194304] 0 2026-03-10T06:22:26.948 INFO:tasks.workunit.client.1.vm06.stdout:6/722: read d6/dd/d25/d33/d4d/f8c [1728047,112256] 0 2026-03-10T06:22:26.953 
INFO:tasks.workunit.client.1.vm06.stdout:4/732: chown dd/d18/l54 111102482 1 2026-03-10T06:22:26.955 INFO:tasks.workunit.client.1.vm06.stdout:0/693: dread d0/dd/d14/d1d/d5d/f5f [0,4194304] 0 2026-03-10T06:22:26.963 INFO:tasks.workunit.client.1.vm06.stdout:6/723: dwrite d6/dd/d25/d33/d5a/dae/fcb [0,4194304] 0 2026-03-10T06:22:26.965 INFO:tasks.workunit.client.1.vm06.stdout:7/722: chown d19/fec 52028665 1 2026-03-10T06:22:26.966 INFO:tasks.workunit.client.1.vm06.stdout:9/680: write d21/d27/d50/d57/db2/d80/f86 [632282,72618] 0 2026-03-10T06:22:26.969 INFO:tasks.workunit.client.1.vm06.stdout:9/681: readlink d21/d32/la1 0 2026-03-10T06:22:26.974 INFO:tasks.workunit.client.1.vm06.stdout:6/724: sync 2026-03-10T06:22:26.975 INFO:tasks.workunit.client.1.vm06.stdout:0/694: readlink d0/dd/d14/d1d/d5d/lba 0 2026-03-10T06:22:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/695: write d0/dd/f24 [8272264,14301] 0 2026-03-10T06:22:26.977 INFO:tasks.workunit.client.1.vm06.stdout:7/723: creat d19/d3b/d41/d42/d62/fef x:0 0 0 2026-03-10T06:22:26.978 INFO:tasks.workunit.client.1.vm06.stdout:9/682: rename d21/d32/d6e to d21/da2/de6 0 2026-03-10T06:22:26.978 INFO:tasks.workunit.client.1.vm06.stdout:1/733: dread d9/d1b/d20/d44/f54 [0,4194304] 0 2026-03-10T06:22:26.978 INFO:tasks.workunit.client.1.vm06.stdout:3/676: dwrite d6/d8/d7f/fe6 [0,4194304] 0 2026-03-10T06:22:26.983 INFO:tasks.workunit.client.1.vm06.stdout:9/683: chown d21/d32/fd8 17 1 2026-03-10T06:22:26.983 INFO:tasks.workunit.client.1.vm06.stdout:9/684: write d21/d27/d3a/f83 [1370763,130581] 0 2026-03-10T06:22:26.983 INFO:tasks.workunit.client.1.vm06.stdout:4/733: rename dd/d24/f8c to dd/d24/fd6 0 2026-03-10T06:22:26.986 INFO:tasks.workunit.client.1.vm06.stdout:8/576: dwrite d1/d2c/f32 [0,4194304] 0 2026-03-10T06:22:26.994 INFO:tasks.workunit.client.1.vm06.stdout:2/619: write da/f28 [4574377,53393] 0 2026-03-10T06:22:26.997 INFO:tasks.workunit.client.1.vm06.stdout:2/620: dwrite da/d13/d1a/d39/d35/f4a [4194304,4194304] 0 
2026-03-10T06:22:27.000 INFO:tasks.workunit.client.1.vm06.stdout:1/734: symlink d9/lcf 0 2026-03-10T06:22:27.004 INFO:tasks.workunit.client.1.vm06.stdout:8/577: creat d1/df/d20/fc2 x:0 0 0 2026-03-10T06:22:27.005 INFO:tasks.workunit.client.1.vm06.stdout:4/734: dwrite dd/d24/d2d/d2f/f42 [4194304,4194304] 0 2026-03-10T06:22:27.008 INFO:tasks.workunit.client.1.vm06.stdout:4/735: readlink dd/d18/l1c 0 2026-03-10T06:22:27.009 INFO:tasks.workunit.client.1.vm06.stdout:7/724: dwrite d19/d3b/d41/d42/d52/d83/d9d/fbf [0,4194304] 0 2026-03-10T06:22:27.009 INFO:tasks.workunit.client.1.vm06.stdout:0/696: dwrite d0/dd/d14/d18/d85/dcc/d5e/f86 [0,4194304] 0 2026-03-10T06:22:27.010 INFO:tasks.workunit.client.1.vm06.stdout:1/735: dread - d9/d35/d46/d38/d8c/fa9 zero size 2026-03-10T06:22:27.012 INFO:tasks.workunit.client.1.vm06.stdout:5/518: dread d8/d9/f14 [0,4194304] 0 2026-03-10T06:22:27.013 INFO:tasks.workunit.client.1.vm06.stdout:1/736: chown d9/d1b/d20/d44/dbf/fc3 112010810 1 2026-03-10T06:22:27.017 INFO:tasks.workunit.client.1.vm06.stdout:0/697: dread d0/dd/d14/d18/d85/dcc/d5e/f86 [0,4194304] 0 2026-03-10T06:22:27.018 INFO:tasks.workunit.client.1.vm06.stdout:6/725: creat d6/dd/d25/ff8 x:0 0 0 2026-03-10T06:22:27.019 INFO:tasks.workunit.client.1.vm06.stdout:3/677: creat d6/d21/d38/d88/dde/fe8 x:0 0 0 2026-03-10T06:22:27.023 INFO:tasks.workunit.client.1.vm06.stdout:3/678: stat d6/dc/d13/d9d/d54/fcc 0 2026-03-10T06:22:27.023 INFO:tasks.workunit.client.1.vm06.stdout:3/679: stat d6/d21/d38/f6c 0 2026-03-10T06:22:27.031 INFO:tasks.workunit.client.1.vm06.stdout:5/519: dwrite d8/db/d54/d8a/d74/f5a [0,4194304] 0 2026-03-10T06:22:27.038 INFO:tasks.workunit.client.1.vm06.stdout:4/736: read dd/f12 [4175384,31330] 0 2026-03-10T06:22:27.039 INFO:tasks.workunit.client.1.vm06.stdout:3/680: creat d6/d21/d38/dd0/dd1/d90/fe9 x:0 0 0 2026-03-10T06:22:27.048 INFO:tasks.workunit.client.1.vm06.stdout:9/685: dread d21/d32/f3f [0,4194304] 0 2026-03-10T06:22:27.048 
INFO:tasks.workunit.client.1.vm06.stdout:9/686: dread - d21/d32/d4d/d51/db0/fe3 zero size 2026-03-10T06:22:27.048 INFO:tasks.workunit.client.1.vm06.stdout:5/520: creat d8/db/d57/d83/fa6 x:0 0 0 2026-03-10T06:22:27.050 INFO:tasks.workunit.client.1.vm06.stdout:0/698: link d0/dd/d14/d18/d85/dcc/l56 d0/da3/le0 0 2026-03-10T06:22:27.058 INFO:tasks.workunit.client.1.vm06.stdout:6/726: rename d6/dd/d25/d33/d5a/d78/dd0/ldb to d6/d79/d95/db4/lf9 0 2026-03-10T06:22:27.061 INFO:tasks.workunit.client.1.vm06.stdout:9/687: symlink d21/da2/da7/le7 0 2026-03-10T06:22:27.061 INFO:tasks.workunit.client.1.vm06.stdout:9/688: readlink l18 0 2026-03-10T06:22:27.062 INFO:tasks.workunit.client.1.vm06.stdout:3/681: creat d6/dc/d13/d9d/d54/fea x:0 0 0 2026-03-10T06:22:27.062 INFO:tasks.workunit.client.1.vm06.stdout:1/737: getdents d9/d35/d46 0 2026-03-10T06:22:27.064 INFO:tasks.workunit.client.1.vm06.stdout:6/727: write d6/df/f1e [3285603,76752] 0 2026-03-10T06:22:27.065 INFO:tasks.workunit.client.1.vm06.stdout:7/725: link d19/l2c d19/d3b/lf0 0 2026-03-10T06:22:27.067 INFO:tasks.workunit.client.1.vm06.stdout:6/728: fdatasync d6/d7/d37/d43/f59 0 2026-03-10T06:22:27.067 INFO:tasks.workunit.client.1.vm06.stdout:7/726: dread - d19/d3b/d41/da9/dbd/dd2/fe8 zero size 2026-03-10T06:22:27.069 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:27 vm06.local ceph-mon[58974]: pgmap v12: 65 pgs: 65 active+clean; 1.4 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 54 MiB/s rd, 182 MiB/s wr, 370 op/s 2026-03-10T06:22:27.069 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:27 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.069 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:27 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.069 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:27 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.069 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:27 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.072 INFO:tasks.workunit.client.1.vm06.stdout:9/689: dwrite d21/d32/d4d/d51/db0/fd4 [0,4194304] 0 2026-03-10T06:22:27.075 INFO:tasks.workunit.client.1.vm06.stdout:4/737: getdents dd/d24/d2d/d2f/d34/d40 0 2026-03-10T06:22:27.082 INFO:tasks.workunit.client.1.vm06.stdout:7/727: mknod d19/d3b/d41/d42/d52/d9f/cf1 0 2026-03-10T06:22:27.091 INFO:tasks.workunit.client.1.vm06.stdout:1/738: dread d9/d35/d46/f7a [0,4194304] 0 2026-03-10T06:22:27.091 INFO:tasks.workunit.client.1.vm06.stdout:3/682: dwrite d6/dc/d13/f8d [0,4194304] 0 2026-03-10T06:22:27.091 INFO:tasks.workunit.client.1.vm06.stdout:0/699: link d0/da3/dd5/ddc/cdf d0/dd/d14/d18/d85/dcc/d88/d47/ce1 0 2026-03-10T06:22:27.095 INFO:tasks.workunit.client.1.vm06.stdout:4/738: unlink dd/d24/d5e/d66/dc6/ccb 0 2026-03-10T06:22:27.095 INFO:tasks.workunit.client.1.vm06.stdout:3/683: stat d6/d8/f22 0 2026-03-10T06:22:27.100 INFO:tasks.workunit.client.1.vm06.stdout:8/578: write d1/df/fa0 [367720,54608] 0 2026-03-10T06:22:27.102 INFO:tasks.workunit.client.1.vm06.stdout:7/728: creat d19/d3b/d41/d72/ff2 x:0 0 0 2026-03-10T06:22:27.102 INFO:tasks.workunit.client.1.vm06.stdout:0/700: fdatasync d0/dd/d14/d18/d85/dcc/d88/d98/f9a 0 2026-03-10T06:22:27.103 INFO:tasks.workunit.client.1.vm06.stdout:4/739: write dd/d33/d36/fc5 [1010317,5758] 0 2026-03-10T06:22:27.107 INFO:tasks.workunit.client.1.vm06.stdout:7/729: write d19/f20 [1037121,19003] 0 2026-03-10T06:22:27.115 INFO:tasks.workunit.client.1.vm06.stdout:2/621: dwrite da/d13/d1c/f76 [0,4194304] 0 2026-03-10T06:22:27.117 INFO:tasks.workunit.client.1.vm06.stdout:8/579: read d1/df/d20/d21/d7e/d8d/f95 [1880890,62802] 0 2026-03-10T06:22:27.121 INFO:tasks.workunit.client.1.vm06.stdout:0/701: unlink d0/dd/d1c/f5a 0 2026-03-10T06:22:27.121 INFO:tasks.workunit.client.1.vm06.stdout:8/580: dread d1/df/d20/d21/f69 [0,4194304] 0 2026-03-10T06:22:27.121 
INFO:tasks.workunit.client.1.vm06.stdout:0/702: write d0/dd/f32 [3726587,18633] 0 2026-03-10T06:22:27.121 INFO:tasks.workunit.client.1.vm06.stdout:8/581: fdatasync d1/fb9 0 2026-03-10T06:22:27.130 INFO:tasks.workunit.client.1.vm06.stdout:6/729: link d6/fc d6/dd/d25/d33/d5a/d78/ffa 0 2026-03-10T06:22:27.133 INFO:tasks.workunit.client.1.vm06.stdout:4/740: mkdir dd/d33/d47/d97/db6/dd7 0 2026-03-10T06:22:27.134 INFO:tasks.workunit.client.1.vm06.stdout:0/703: symlink d0/d3c/dc1/d3d/d50/d91/da7/le2 0 2026-03-10T06:22:27.137 INFO:tasks.workunit.client.1.vm06.stdout:2/622: read da/d13/d1c/f41 [4716744,78602] 0 2026-03-10T06:22:27.137 INFO:tasks.workunit.client.1.vm06.stdout:6/730: readlink d6/d79/d95/le4 0 2026-03-10T06:22:27.140 INFO:tasks.workunit.client.1.vm06.stdout:6/731: dread - d6/df/d70/fd9 zero size 2026-03-10T06:22:27.143 INFO:tasks.workunit.client.1.vm06.stdout:3/684: getdents d6/dc 0 2026-03-10T06:22:27.143 INFO:tasks.workunit.client.1.vm06.stdout:3/685: write d6/dc/d41/d6d/fce [5182490,22195] 0 2026-03-10T06:22:27.146 INFO:tasks.workunit.client.1.vm06.stdout:0/704: dwrite d0/dd/d14/d18/d85/dcc/f54 [0,4194304] 0 2026-03-10T06:22:27.147 INFO:tasks.workunit.client.1.vm06.stdout:8/582: creat d1/d2c/d99/dc0/fc3 x:0 0 0 2026-03-10T06:22:27.150 INFO:tasks.workunit.client.1.vm06.stdout:2/623: creat da/d13/d1c/d7d/fc3 x:0 0 0 2026-03-10T06:22:27.154 INFO:tasks.workunit.client.1.vm06.stdout:6/732: write d6/d7/f2a [847194,31527] 0 2026-03-10T06:22:27.163 INFO:tasks.workunit.client.1.vm06.stdout:6/733: dwrite d6/d79/d95/db4/fbd [0,4194304] 0 2026-03-10T06:22:27.174 INFO:tasks.workunit.client.1.vm06.stdout:5/521: write d8/db/d54/d8a/d39/f3d [1028801,102985] 0 2026-03-10T06:22:27.178 INFO:tasks.workunit.client.1.vm06.stdout:9/690: dwrite d21/da2/da7/d93/dda/f81 [0,4194304] 0 2026-03-10T06:22:27.179 INFO:tasks.workunit.client.1.vm06.stdout:6/734: creat d6/dd/d25/d33/d5a/ffb x:0 0 0 2026-03-10T06:22:27.184 INFO:tasks.workunit.client.1.vm06.stdout:4/741: sync 
2026-03-10T06:22:27.184 INFO:tasks.workunit.client.1.vm06.stdout:0/705: sync 2026-03-10T06:22:27.185 INFO:tasks.workunit.client.1.vm06.stdout:6/735: rmdir d6/d79 39 2026-03-10T06:22:27.186 INFO:tasks.workunit.client.1.vm06.stdout:5/522: rename d8/db/d54/d67/d46/f76 to d8/db/d54/d67/d46/d6e/fa7 0 2026-03-10T06:22:27.188 INFO:tasks.workunit.client.1.vm06.stdout:9/691: unlink d21/da2/da7/d93/fa3 0 2026-03-10T06:22:27.192 INFO:tasks.workunit.client.1.vm06.stdout:1/739: write d9/d1b/f81 [256049,111382] 0 2026-03-10T06:22:27.196 INFO:tasks.workunit.client.1.vm06.stdout:4/742: rename dd/d24/f3d to dd/d33/d36/fd8 0 2026-03-10T06:22:27.196 INFO:tasks.workunit.client.1.vm06.stdout:4/743: chown c4 391 1 2026-03-10T06:22:27.202 INFO:tasks.workunit.client.1.vm06.stdout:0/706: link d0/dd/d14/d18/d85/dcc/d88/fae d0/da3/dd5/ddc/fe3 0 2026-03-10T06:22:27.206 INFO:tasks.workunit.client.1.vm06.stdout:5/523: mknod d8/db/d54/d8a/ca8 0 2026-03-10T06:22:27.209 INFO:tasks.workunit.client.1.vm06.stdout:7/730: truncate d19/d3b/d41/d42/d52/d83/d9d/fbf 2504315 0 2026-03-10T06:22:27.209 INFO:tasks.workunit.client.1.vm06.stdout:6/736: creat d6/d7/ffc x:0 0 0 2026-03-10T06:22:27.210 INFO:tasks.workunit.client.1.vm06.stdout:9/692: symlink d21/d27/d50/d57/db2/d80/d95/d9b/dd0/le8 0 2026-03-10T06:22:27.214 INFO:tasks.workunit.client.1.vm06.stdout:4/744: write dd/d24/d5d/fd1 [590109,79598] 0 2026-03-10T06:22:27.214 INFO:tasks.workunit.client.1.vm06.stdout:5/524: symlink d8/db/d57/la9 0 2026-03-10T06:22:27.216 INFO:tasks.workunit.client.1.vm06.stdout:0/707: creat d0/dd/d14/d18/d85/dcc/d99/fe4 x:0 0 0 2026-03-10T06:22:27.218 INFO:tasks.workunit.client.1.vm06.stdout:1/740: creat d9/fd0 x:0 0 0 2026-03-10T06:22:27.228 INFO:tasks.workunit.client.1.vm06.stdout:9/693: dwrite d21/d27/f9a [0,4194304] 0 2026-03-10T06:22:27.229 INFO:tasks.workunit.client.1.vm06.stdout:6/737: creat d6/dd/ffd x:0 0 0 2026-03-10T06:22:27.230 INFO:tasks.workunit.client.1.vm06.stdout:1/741: rmdir d9/d1b 39 2026-03-10T06:22:27.231 
INFO:tasks.workunit.client.1.vm06.stdout:3/686: dwrite d6/d21/d38/f56 [0,4194304] 0 2026-03-10T06:22:27.233 INFO:tasks.workunit.client.1.vm06.stdout:8/583: dwrite d1/df/d20/d21/d5e/f70 [0,4194304] 0 2026-03-10T06:22:27.233 INFO:tasks.workunit.client.1.vm06.stdout:4/745: unlink dd/d33/d36/fd8 0 2026-03-10T06:22:27.242 INFO:tasks.workunit.client.1.vm06.stdout:5/525: write d8/db/d54/d8a/d39/f52 [4855107,47102] 0 2026-03-10T06:22:27.255 INFO:tasks.workunit.client.1.vm06.stdout:1/742: dwrite d9/d62/f94 [0,4194304] 0 2026-03-10T06:22:27.256 INFO:tasks.workunit.client.1.vm06.stdout:8/584: dread d1/f4 [4194304,4194304] 0 2026-03-10T06:22:27.256 INFO:tasks.workunit.client.1.vm06.stdout:1/743: chown d9/d35/d46/c5e 33546 1 2026-03-10T06:22:27.264 INFO:tasks.workunit.client.1.vm06.stdout:7/731: rename d19/d3b/d41/d4c/cc8 to d19/d3b/d41/d42/d62/d80/cf3 0 2026-03-10T06:22:27.268 INFO:tasks.workunit.client.1.vm06.stdout:3/687: dwrite d6/d21/fda [0,4194304] 0 2026-03-10T06:22:27.271 INFO:tasks.workunit.client.1.vm06.stdout:5/526: creat d8/db/d54/d67/d46/d6e/faa x:0 0 0 2026-03-10T06:22:27.279 INFO:tasks.workunit.client.1.vm06.stdout:6/738: dwrite d6/d7/ffc [0,4194304] 0 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:9/694: dread - d21/d27/d50/d57/db2/d7f/f91 zero size 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:7/732: dread d19/d3b/d41/d42/d62/f7c [0,4194304] 0 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:0/708: rename d0/dd/d14/d18/d85/dcc/d88/d47/d4d/f57 to d0/da3/fe5 0 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:7/733: stat d19/f20 0 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:0/709: stat d0/d3c/dc1/d3d/d50/cb0 0 2026-03-10T06:22:27.280 INFO:tasks.workunit.client.1.vm06.stdout:7/734: fdatasync d19/d3b/d41/d72/ff2 0 2026-03-10T06:22:27.286 INFO:tasks.workunit.client.1.vm06.stdout:8/585: truncate d1/df/d11/f12 40336 0 2026-03-10T06:22:27.289 
INFO:tasks.workunit.client.1.vm06.stdout:6/739: fsync d6/d7/f87 0 2026-03-10T06:22:27.289 INFO:tasks.workunit.client.1.vm06.stdout:1/744: fdatasync d9/d35/d46/d38/fae 0 2026-03-10T06:22:27.293 INFO:tasks.workunit.client.1.vm06.stdout:7/735: mkdir d19/d3b/d41/d42/d52/d83/d9d/da8/df4 0 2026-03-10T06:22:27.294 INFO:tasks.workunit.client.1.vm06.stdout:7/736: fdatasync d19/d3b/d41/d4c/fcf 0 2026-03-10T06:22:27.299 INFO:tasks.workunit.client.1.vm06.stdout:5/527: sync 2026-03-10T06:22:27.300 INFO:tasks.workunit.client.1.vm06.stdout:6/740: creat d6/d7/ffe x:0 0 0 2026-03-10T06:22:27.310 INFO:tasks.workunit.client.1.vm06.stdout:9/695: dwrite d21/f49 [4194304,4194304] 0 2026-03-10T06:22:27.311 INFO:tasks.workunit.client.1.vm06.stdout:8/586: rename d1/d2c/d5b/c85 to d1/d2c/d90/cc4 0 2026-03-10T06:22:27.319 INFO:tasks.workunit.client.1.vm06.stdout:7/737: link d19/d3b/d41/d42/d62/d80/d82/fae d19/db0/ddd/ff5 0 2026-03-10T06:22:27.322 INFO:tasks.workunit.client.1.vm06.stdout:1/745: dwrite d9/d35/d46/f7a [0,4194304] 0 2026-03-10T06:22:27.324 INFO:tasks.workunit.client.1.vm06.stdout:6/741: dwrite d6/d7/f2a [0,4194304] 0 2026-03-10T06:22:27.331 INFO:tasks.workunit.client.1.vm06.stdout:9/696: creat d21/ddb/fe9 x:0 0 0 2026-03-10T06:22:27.337 INFO:tasks.workunit.client.1.vm06.stdout:7/738: mknod d19/d3b/d41/d42/d52/d9f/cf6 0 2026-03-10T06:22:27.337 INFO:tasks.workunit.client.1.vm06.stdout:4/746: write dd/d18/f1f [5458944,41236] 0 2026-03-10T06:22:27.337 INFO:tasks.workunit.client.1.vm06.stdout:6/742: mkdir d6/dd/d35/dff 0 2026-03-10T06:22:27.338 INFO:tasks.workunit.client.1.vm06.stdout:1/746: fdatasync d9/d35/f7e 0 2026-03-10T06:22:27.345 INFO:tasks.workunit.client.1.vm06.stdout:9/697: link d21/d27/c48 d21/d32/d4d/cea 0 2026-03-10T06:22:27.346 INFO:tasks.workunit.client.1.vm06.stdout:4/747: dwrite dd/d33/d36/fba [0,4194304] 0 2026-03-10T06:22:27.348 INFO:tasks.workunit.client.1.vm06.stdout:6/743: fdatasync d6/df/d40/d99/fb5 0 2026-03-10T06:22:27.357 
INFO:tasks.workunit.client.1.vm06.stdout:4/748: rename dd/d33/f84 to dd/d33/d47/d97/db6/dd7/fd9 0 2026-03-10T06:22:27.357 INFO:tasks.workunit.client.1.vm06.stdout:7/739: dread d19/db0/ddd/ff5 [0,4194304] 0 2026-03-10T06:22:27.360 INFO:tasks.workunit.client.1.vm06.stdout:4/749: unlink dd/d24/d2d/d2f/f98 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:4/750: fsync dd/d18/d8e/fa4 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:6/744: mknod d6/d79/d95/db4/dd4/c100 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:7/740: rmdir d19/d3b/d41/d42/d52/d83/d9d 39 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:9/698: getdents d21/d27/d50/d57/db2/d80/d95 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:4/751: fdatasync dd/d72/f78 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:3/688: dread d6/d21/f31 [0,4194304] 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:7/741: rename d19/d3b/d41/d72/de0/ce7 to d19/d3b/d41/d72/de0/cf7 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:9/699: chown d21/d32/d4d/d51/f87 458 1 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:6/745: mknod d6/dd/d25/d33/d5a/df1/c101 0 2026-03-10T06:22:27.369 INFO:tasks.workunit.client.1.vm06.stdout:7/742: chown d19/d3b/d41/da9/da5 22909 1 2026-03-10T06:22:27.371 INFO:tasks.workunit.client.1.vm06.stdout:7/743: mkdir d19/d3b/d41/d42/d62/dc4/df8 0 2026-03-10T06:22:27.372 INFO:tasks.workunit.client.1.vm06.stdout:7/744: chown d19/d3b/d41/d42/d52/d83/d9d/da8/dd6/ceb 86934110 1 2026-03-10T06:22:27.380 INFO:tasks.workunit.client.1.vm06.stdout:7/745: dread d19/f99 [0,4194304] 0 2026-03-10T06:22:27.381 INFO:tasks.workunit.client.1.vm06.stdout:9/700: link d21/d32/d4d/dd2/fd5 d21/da2/da7/d93/dda/feb 0 2026-03-10T06:22:27.383 INFO:tasks.workunit.client.1.vm06.stdout:3/689: dwrite d6/d21/d38/f56 [4194304,4194304] 0 2026-03-10T06:22:27.391 
INFO:tasks.workunit.client.1.vm06.stdout:6/746: rename d6/d7/c74 to d6/dd/d25/d33/d5a/c102 0 2026-03-10T06:22:27.395 INFO:tasks.workunit.client.1.vm06.stdout:7/746: dwrite d19/d3b/d41/d4c/f55 [4194304,4194304] 0 2026-03-10T06:22:27.398 INFO:tasks.workunit.client.1.vm06.stdout:8/587: write d1/df/d58/f6a [1584461,100286] 0 2026-03-10T06:22:27.401 INFO:tasks.workunit.client.1.vm06.stdout:9/701: dwrite d21/f49 [4194304,4194304] 0 2026-03-10T06:22:27.401 INFO:tasks.workunit.client.1.vm06.stdout:9/702: truncate d21/d27/d3a/f83 2294008 0 2026-03-10T06:22:27.405 INFO:tasks.workunit.client.1.vm06.stdout:5/528: dwrite d8/db/d54/d8a/d74/f3b [0,4194304] 0 2026-03-10T06:22:27.408 INFO:tasks.workunit.client.1.vm06.stdout:1/747: write d9/d35/f5c [404244,60993] 0 2026-03-10T06:22:27.418 INFO:tasks.workunit.client.1.vm06.stdout:0/710: dwrite d0/dd/f4c [0,4194304] 0 2026-03-10T06:22:27.418 INFO:tasks.workunit.client.1.vm06.stdout:7/747: sync 2026-03-10T06:22:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:27 vm04.local ceph-mon[51058]: pgmap v12: 65 pgs: 65 active+clean; 1.4 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 54 MiB/s rd, 182 MiB/s wr, 370 op/s 2026-03-10T06:22:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:27 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:27 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:27 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:27 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:27.428 INFO:tasks.workunit.client.1.vm06.stdout:0/711: mknod d0/da3/dd5/ddc/ce6 0 2026-03-10T06:22:27.429 INFO:tasks.workunit.client.1.vm06.stdout:7/748: symlink d19/d3b/d41/d4c/lf9 0 
2026-03-10T06:22:27.431 INFO:tasks.workunit.client.1.vm06.stdout:9/703: dwrite d21/d27/d50/d57/db2/d80/d95/d9b/fd9 [0,4194304] 0 2026-03-10T06:22:27.432 INFO:tasks.workunit.client.1.vm06.stdout:6/747: dwrite d6/d7/f2a [0,4194304] 0 2026-03-10T06:22:27.434 INFO:tasks.workunit.client.1.vm06.stdout:1/748: dwrite d9/d35/faf [0,4194304] 0 2026-03-10T06:22:27.438 INFO:tasks.workunit.client.1.vm06.stdout:0/712: creat d0/dd/d14/d18/fe7 x:0 0 0 2026-03-10T06:22:27.438 INFO:tasks.workunit.client.1.vm06.stdout:9/704: chown d21/d27/d50/cac 13744986 1 2026-03-10T06:22:27.438 INFO:tasks.workunit.client.1.vm06.stdout:0/713: chown d0/d3c/dc1/dc4/dc5 2791 1 2026-03-10T06:22:27.438 INFO:tasks.workunit.client.1.vm06.stdout:7/749: truncate d19/d3b/d41/d72/ff2 104241 0 2026-03-10T06:22:27.439 INFO:tasks.workunit.client.1.vm06.stdout:7/750: chown d19/db0/ddd 3 1 2026-03-10T06:22:27.439 INFO:tasks.workunit.client.1.vm06.stdout:7/751: readlink l17 0 2026-03-10T06:22:27.448 INFO:tasks.workunit.client.1.vm06.stdout:9/705: rmdir d21/da2/da7/d93/dda 39 2026-03-10T06:22:27.451 INFO:tasks.workunit.client.1.vm06.stdout:9/706: stat d21/d27/d50/d57/db2/d80/f86 0 2026-03-10T06:22:27.451 INFO:tasks.workunit.client.1.vm06.stdout:0/714: unlink d0/dd/d14/d18/d85/dcc/dab/fad 0 2026-03-10T06:22:27.452 INFO:tasks.workunit.client.1.vm06.stdout:6/748: link d6/dd/dc7/ccd d6/dd/d25/d33/d4d/c103 0 2026-03-10T06:22:27.452 INFO:tasks.workunit.client.1.vm06.stdout:7/752: sync 2026-03-10T06:22:27.452 INFO:tasks.workunit.client.1.vm06.stdout:9/707: sync 2026-03-10T06:22:27.456 INFO:tasks.workunit.client.1.vm06.stdout:1/749: getdents d9/d35/d46/d38/dc6 0 2026-03-10T06:22:27.457 INFO:tasks.workunit.client.1.vm06.stdout:0/715: fsync d0/dd/d14/d1d/f53 0 2026-03-10T06:22:27.457 INFO:tasks.workunit.client.1.vm06.stdout:6/749: creat d6/dd/d25/d33/d5a/dae/f104 x:0 0 0 2026-03-10T06:22:27.458 INFO:tasks.workunit.client.1.vm06.stdout:7/753: write d19/d3b/d41/d42/d62/f86 [4068830,26233] 0 2026-03-10T06:22:27.459 
INFO:tasks.workunit.client.1.vm06.stdout:7/754: dread - d19/d3b/d41/da9/dbd/dd2/fe8 zero size 2026-03-10T06:22:27.459 INFO:tasks.workunit.client.1.vm06.stdout:1/750: chown d9/d35/l79 820 1 2026-03-10T06:22:27.465 INFO:tasks.workunit.client.1.vm06.stdout:6/750: readlink d6/d7/d37/d43/l88 0 2026-03-10T06:22:27.468 INFO:tasks.workunit.client.1.vm06.stdout:6/751: truncate d6/df/d70/fed 1008993 0 2026-03-10T06:22:27.468 INFO:tasks.workunit.client.1.vm06.stdout:6/752: sync 2026-03-10T06:22:27.472 INFO:tasks.workunit.client.1.vm06.stdout:1/751: creat d9/d35/d89/fd1 x:0 0 0 2026-03-10T06:22:27.477 INFO:tasks.workunit.client.1.vm06.stdout:7/755: creat d19/d3b/d41/d42/d62/d80/ffa x:0 0 0 2026-03-10T06:22:27.486 INFO:tasks.workunit.client.1.vm06.stdout:7/756: truncate d19/d3b/fe1 78091 0 2026-03-10T06:22:27.486 INFO:tasks.workunit.client.1.vm06.stdout:6/753: dwrite d6/dd/d25/d4e/f5f [4194304,4194304] 0 2026-03-10T06:22:27.486 INFO:tasks.workunit.client.1.vm06.stdout:9/708: dread d21/d27/d56/f74 [0,4194304] 0 2026-03-10T06:22:27.490 INFO:tasks.workunit.client.1.vm06.stdout:7/757: dwrite d19/d3b/dde/fdc [0,4194304] 0 2026-03-10T06:22:27.490 INFO:tasks.workunit.client.1.vm06.stdout:1/752: unlink d9/d1b/d20/d44/f6f 0 2026-03-10T06:22:27.494 INFO:tasks.workunit.client.1.vm06.stdout:3/690: write d6/dc/f3f [614285,11766] 0 2026-03-10T06:22:27.503 INFO:tasks.workunit.client.1.vm06.stdout:8/588: dwrite d1/df/f56 [0,4194304] 0 2026-03-10T06:22:27.503 INFO:tasks.workunit.client.1.vm06.stdout:5/529: dwrite d8/db/d54/d55/d80/f96 [0,4194304] 0 2026-03-10T06:22:27.507 INFO:tasks.workunit.client.1.vm06.stdout:0/716: dwrite d0/dd/d14/d1d/f53 [0,4194304] 0 2026-03-10T06:22:27.514 INFO:tasks.workunit.client.1.vm06.stdout:3/691: unlink d6/fa8 0 2026-03-10T06:22:27.514 INFO:tasks.workunit.client.1.vm06.stdout:0/717: chown d0/dd/f4c 1169 1 2026-03-10T06:22:27.518 INFO:tasks.workunit.client.1.vm06.stdout:3/692: rename d6/d8/d7f/f8c to d6/d21/d38/dd0/feb 0 2026-03-10T06:22:27.523 
INFO:tasks.workunit.client.1.vm06.stdout:0/718: unlink d0/dd/d14/d1d/lcf 0 2026-03-10T06:22:27.523 INFO:tasks.workunit.client.1.vm06.stdout:3/693: fsync d6/dc/d13/d51/fd2 0 2026-03-10T06:22:27.523 INFO:tasks.workunit.client.1.vm06.stdout:3/694: truncate d6/d21/d38/dd0/dd1/d90/fe0 286103 0 2026-03-10T06:22:27.523 INFO:tasks.workunit.client.1.vm06.stdout:3/695: write d6/d21/d38/dd0/dd1/d90/dc7/fcb [984719,73466] 0 2026-03-10T06:22:27.526 INFO:tasks.workunit.client.1.vm06.stdout:6/754: creat d6/dd/d25/f105 x:0 0 0 2026-03-10T06:22:27.526 INFO:tasks.workunit.client.1.vm06.stdout:8/589: creat d1/df/d20/d21/fc5 x:0 0 0 2026-03-10T06:22:27.527 INFO:tasks.workunit.client.1.vm06.stdout:6/755: write d6/d7/d37/d43/f77 [3346171,126037] 0 2026-03-10T06:22:27.527 INFO:tasks.workunit.client.1.vm06.stdout:3/696: unlink d6/d21/f30 0 2026-03-10T06:22:27.529 INFO:tasks.workunit.client.1.vm06.stdout:3/697: write d6/dc/f69 [5302303,67602] 0 2026-03-10T06:22:27.529 INFO:tasks.workunit.client.1.vm06.stdout:3/698: chown d6/dc/d13/fc1 3254 1 2026-03-10T06:22:27.533 INFO:tasks.workunit.client.1.vm06.stdout:7/758: link d19/d3b/c40 d19/d3b/d41/d42/d52/d83/cfb 0 2026-03-10T06:22:27.538 INFO:tasks.workunit.client.1.vm06.stdout:7/759: dread - d19/d3b/d41/d42/d62/d80/fd7 zero size 2026-03-10T06:22:27.541 INFO:tasks.workunit.client.1.vm06.stdout:6/756: mkdir d6/dd/d25/d33/d5a/d78/dd0/d106 0 2026-03-10T06:22:27.545 INFO:tasks.workunit.client.1.vm06.stdout:6/757: write d6/df/d40/fc9 [239002,32492] 0 2026-03-10T06:22:27.546 INFO:tasks.workunit.client.1.vm06.stdout:9/709: truncate d21/d27/fb3 281951 0 2026-03-10T06:22:27.549 INFO:tasks.workunit.client.1.vm06.stdout:7/760: write d19/d3b/d41/d4c/f4e [2816087,12133] 0 2026-03-10T06:22:27.553 INFO:tasks.workunit.client.1.vm06.stdout:5/530: dwrite d8/db/d54/d8a/d74/f85 [0,4194304] 0 2026-03-10T06:22:27.561 INFO:tasks.workunit.client.1.vm06.stdout:3/699: dwrite d6/dc/d13/d35/f5a [4194304,4194304] 0 2026-03-10T06:22:27.577 
INFO:tasks.workunit.client.1.vm06.stdout:8/590: dread d1/f5 [0,4194304] 0 2026-03-10T06:22:27.581 INFO:tasks.workunit.client.1.vm06.stdout:9/710: mknod d21/d32/cec 0 2026-03-10T06:22:27.581 INFO:tasks.workunit.client.1.vm06.stdout:7/761: link d19/d3b/d41/d42/d52/d83/f9b d19/d3b/d41/d4c/ffc 0 2026-03-10T06:22:27.586 INFO:tasks.workunit.client.1.vm06.stdout:6/758: getdents d6/dd/d25/d4e 0 2026-03-10T06:22:27.588 INFO:tasks.workunit.client.1.vm06.stdout:0/719: dwrite d0/da3/fe5 [0,4194304] 0 2026-03-10T06:22:27.588 INFO:tasks.workunit.client.1.vm06.stdout:8/591: fdatasync d1/d2c/f67 0 2026-03-10T06:22:27.589 INFO:tasks.workunit.client.1.vm06.stdout:5/531: dwrite d8/db/d54/d67/d46/d6e/faa [0,4194304] 0 2026-03-10T06:22:27.591 INFO:tasks.workunit.client.1.vm06.stdout:7/762: fsync d19/d3b/d41/d42/d52/d83/f8f 0 2026-03-10T06:22:27.599 INFO:tasks.workunit.client.1.vm06.stdout:7/763: unlink d19/d3b/c7a 0 2026-03-10T06:22:27.600 INFO:tasks.workunit.client.1.vm06.stdout:7/764: write d19/d3b/d41/d42/d52/d83/fd8 [596231,45833] 0 2026-03-10T06:22:27.609 INFO:tasks.workunit.client.1.vm06.stdout:3/700: write d6/d21/d38/dd0/dd1/d90/f9e [74737,73065] 0 2026-03-10T06:22:27.613 INFO:tasks.workunit.client.1.vm06.stdout:5/532: rename d8/db/d54/d67/c4f to d8/db/d54/d8a/d39/d72/cab 0 2026-03-10T06:22:27.614 INFO:tasks.workunit.client.1.vm06.stdout:9/711: dwrite d21/da2/da7/f96 [0,4194304] 0 2026-03-10T06:22:27.615 INFO:tasks.workunit.client.1.vm06.stdout:6/759: write d6/f62 [4797528,45253] 0 2026-03-10T06:22:27.629 INFO:tasks.workunit.client.1.vm06.stdout:9/712: mkdir d21/d46/ded 0 2026-03-10T06:22:27.636 INFO:tasks.workunit.client.1.vm06.stdout:7/765: fdatasync d19/d3b/d41/d4c/f55 0 2026-03-10T06:22:27.644 INFO:tasks.workunit.client.1.vm06.stdout:0/720: dread d0/f5 [0,4194304] 0 2026-03-10T06:22:27.645 INFO:tasks.workunit.client.1.vm06.stdout:9/713: mkdir d21/d27/d50/d57/dcd/de4/dee 0 2026-03-10T06:22:27.645 INFO:tasks.workunit.client.1.vm06.stdout:3/701: dwrite d6/dc/d13/d9d/d54/fea 
[0,4194304] 0 2026-03-10T06:22:27.651 INFO:tasks.workunit.client.1.vm06.stdout:9/714: getdents d21/da2/da7/d93/dda 0 2026-03-10T06:22:27.664 INFO:tasks.workunit.client.1.vm06.stdout:7/766: sync 2026-03-10T06:22:27.668 INFO:tasks.workunit.client.1.vm06.stdout:9/715: truncate d21/f34 655131 0 2026-03-10T06:22:27.669 INFO:tasks.workunit.client.1.vm06.stdout:3/702: read d6/d8/f45 [307729,84798] 0 2026-03-10T06:22:27.671 INFO:tasks.workunit.client.1.vm06.stdout:3/703: write d6/d21/d38/f56 [662620,121172] 0 2026-03-10T06:22:27.676 INFO:tasks.workunit.client.1.vm06.stdout:0/721: dread d0/dd/d14/d18/d85/dcc/fb6 [0,4194304] 0 2026-03-10T06:22:27.680 INFO:tasks.workunit.client.1.vm06.stdout:7/767: creat d19/d3b/d41/d72/ffd x:0 0 0 2026-03-10T06:22:27.681 INFO:tasks.workunit.client.1.vm06.stdout:0/722: mknod d0/d3c/dc1/dc4/ce8 0 2026-03-10T06:22:27.681 INFO:tasks.workunit.client.1.vm06.stdout:3/704: rmdir d6/d21/d38/d88/dae 39 2026-03-10T06:22:27.683 INFO:tasks.workunit.client.1.vm06.stdout:7/768: mknod d19/d3b/d41/da9/da5/cfe 0 2026-03-10T06:22:27.688 INFO:tasks.workunit.client.1.vm06.stdout:2/624: dread da/f28 [0,4194304] 0 2026-03-10T06:22:27.696 INFO:tasks.workunit.client.1.vm06.stdout:3/705: mkdir d6/d21/d38/d88/dae/dec 0 2026-03-10T06:22:27.699 INFO:tasks.workunit.client.1.vm06.stdout:0/723: dwrite d0/dd/d14/d18/d85/dcc/fbe [0,4194304] 0 2026-03-10T06:22:27.701 INFO:tasks.workunit.client.1.vm06.stdout:1/753: dread d9/d1b/f1d [0,4194304] 0 2026-03-10T06:22:27.705 INFO:tasks.workunit.client.1.vm06.stdout:7/769: mknod d19/d3b/d41/d42/d62/dc4/df8/cff 0 2026-03-10T06:22:27.708 INFO:tasks.workunit.client.1.vm06.stdout:3/706: symlink d6/d8/d7f/da1/led 0 2026-03-10T06:22:27.708 INFO:tasks.workunit.client.1.vm06.stdout:0/724: dwrite d0/d3c/dc1/d3d/f82 [0,4194304] 0 2026-03-10T06:22:27.709 INFO:tasks.workunit.client.1.vm06.stdout:6/760: dwrite d6/d79/fe2 [0,4194304] 0 2026-03-10T06:22:27.711 INFO:tasks.workunit.client.1.vm06.stdout:2/625: mkdir da/d13/d1c/d1d/d44/dc4 0 
2026-03-10T06:22:27.714 INFO:tasks.workunit.client.1.vm06.stdout:1/754: symlink d9/d35/d89/ld2 0 2026-03-10T06:22:27.719 INFO:tasks.workunit.client.1.vm06.stdout:6/761: mkdir d6/df/d40/d107 0 2026-03-10T06:22:27.722 INFO:tasks.workunit.client.1.vm06.stdout:5/533: write d8/ff [1008654,80707] 0 2026-03-10T06:22:27.723 INFO:tasks.workunit.client.1.vm06.stdout:5/534: write d8/f49 [68196,110992] 0 2026-03-10T06:22:27.727 INFO:tasks.workunit.client.1.vm06.stdout:3/707: dwrite d6/d21/d38/d88/dae/fb2 [0,4194304] 0 2026-03-10T06:22:27.739 INFO:tasks.workunit.client.1.vm06.stdout:0/725: dwrite d0/dd/d14/d18/d85/dcc/d88/d35/f51 [0,4194304] 0 2026-03-10T06:22:27.739 INFO:tasks.workunit.client.1.vm06.stdout:9/716: write d21/d27/d50/d57/db2/f8d [837658,121334] 0 2026-03-10T06:22:27.739 INFO:tasks.workunit.client.1.vm06.stdout:1/755: chown d9/d1b/d20/c2d 542885611 1 2026-03-10T06:22:27.739 INFO:tasks.workunit.client.1.vm06.stdout:9/717: write d21/da2/dc2/fc9 [3792115,49077] 0 2026-03-10T06:22:27.742 INFO:tasks.workunit.client.1.vm06.stdout:2/626: mkdir da/d13/d1a/d39/d4b/dc5 0 2026-03-10T06:22:27.746 INFO:tasks.workunit.client.1.vm06.stdout:9/718: mkdir d21/d27/d50/d57/db2/def 0 2026-03-10T06:22:27.749 INFO:tasks.workunit.client.1.vm06.stdout:9/719: write d21/d32/d4d/fbd [999125,20823] 0 2026-03-10T06:22:27.749 INFO:tasks.workunit.client.1.vm06.stdout:0/726: unlink d0/dd/d14/d1d/laa 0 2026-03-10T06:22:27.751 INFO:tasks.workunit.client.1.vm06.stdout:5/535: symlink d8/db/d54/d8a/d74/d90/lac 0 2026-03-10T06:22:27.753 INFO:tasks.workunit.client.1.vm06.stdout:6/762: rmdir d6/df/d40/d107 0 2026-03-10T06:22:27.755 INFO:tasks.workunit.client.1.vm06.stdout:0/727: creat d0/dd/d14/d18/d85/fe9 x:0 0 0 2026-03-10T06:22:27.755 INFO:tasks.workunit.client.1.vm06.stdout:2/627: creat da/da8/fc6 x:0 0 0 2026-03-10T06:22:27.756 INFO:tasks.workunit.client.1.vm06.stdout:2/628: readlink da/d13/d1c/d1d/lba 0 2026-03-10T06:22:27.757 INFO:tasks.workunit.client.1.vm06.stdout:1/756: rename d9/d1b/d20/d44 to 
d9/dd3 0 2026-03-10T06:22:27.757 INFO:tasks.workunit.client.1.vm06.stdout:6/763: read d6/d7/d37/d43/f59 [2707891,16931] 0 2026-03-10T06:22:27.758 INFO:tasks.workunit.client.1.vm06.stdout:9/720: mknod d21/d27/d50/d57/db2/def/cf0 0 2026-03-10T06:22:27.758 INFO:tasks.workunit.client.1.vm06.stdout:5/536: unlink d8/db/d54/d55/f87 0 2026-03-10T06:22:27.762 INFO:tasks.workunit.client.1.vm06.stdout:3/708: getdents d6/d8 0 2026-03-10T06:22:27.762 INFO:tasks.workunit.client.1.vm06.stdout:3/709: chown d6/d8/d7f/da1/caa 2431426 1 2026-03-10T06:22:27.763 INFO:tasks.workunit.client.1.vm06.stdout:0/728: mknod d0/d3c/dc1/cea 0 2026-03-10T06:22:27.763 INFO:tasks.workunit.client.1.vm06.stdout:1/757: fsync d9/d35/f7e 0 2026-03-10T06:22:27.764 INFO:tasks.workunit.client.1.vm06.stdout:5/537: symlink d8/db/d54/d8a/d74/d90/lad 0 2026-03-10T06:22:27.770 INFO:tasks.workunit.client.1.vm06.stdout:0/729: mkdir d0/dd/d14/d18/deb 0 2026-03-10T06:22:27.776 INFO:tasks.workunit.client.1.vm06.stdout:2/629: unlink da/d13/d1c/d1d/f55 0 2026-03-10T06:22:27.777 INFO:tasks.workunit.client.1.vm06.stdout:7/770: dread d19/f30 [0,4194304] 0 2026-03-10T06:22:27.778 INFO:tasks.workunit.client.1.vm06.stdout:7/771: read - d19/d3b/d41/d72/ffd zero size 2026-03-10T06:22:27.781 INFO:tasks.workunit.client.1.vm06.stdout:2/630: stat da/d13/d1a/d39/d4b/daf/d56/db9 0 2026-03-10T06:22:27.786 INFO:tasks.workunit.client.1.vm06.stdout:7/772: dwrite d19/f35 [0,4194304] 0 2026-03-10T06:22:27.787 INFO:tasks.workunit.client.1.vm06.stdout:6/764: creat d6/d79/d95/dea/f108 x:0 0 0 2026-03-10T06:22:27.788 INFO:tasks.workunit.client.1.vm06.stdout:1/758: write d9/d62/f76 [1332694,78621] 0 2026-03-10T06:22:27.789 INFO:tasks.workunit.client.1.vm06.stdout:0/730: chown d0/dd/d14/d18/f2c 405523 1 2026-03-10T06:22:27.789 INFO:tasks.workunit.client.1.vm06.stdout:9/721: dwrite d21/d27/d3a/fbb [0,4194304] 0 2026-03-10T06:22:27.795 INFO:tasks.workunit.client.1.vm06.stdout:1/759: rmdir d9/dd3 39 2026-03-10T06:22:27.795 
INFO:tasks.workunit.client.1.vm06.stdout:6/765: creat d6/dd/d25/d33/d5a/f109 x:0 0 0 2026-03-10T06:22:27.796 INFO:tasks.workunit.client.1.vm06.stdout:9/722: write d21/d46/fe0 [1000376,22069] 0 2026-03-10T06:22:27.796 INFO:tasks.workunit.client.1.vm06.stdout:2/631: rename da/d13/d1a/d39/d4b to da/d13/d1a/dc7 0 2026-03-10T06:22:27.796 INFO:tasks.workunit.client.1.vm06.stdout:1/760: readlink d9/d35/d46/d38/l8b 0 2026-03-10T06:22:27.796 INFO:tasks.workunit.client.1.vm06.stdout:6/766: fsync d6/d7/ffc 0 2026-03-10T06:22:27.797 INFO:tasks.workunit.client.1.vm06.stdout:0/731: symlink d0/dd/d14/d18/d85/dcc/d88/d98/lec 0 2026-03-10T06:22:27.799 INFO:tasks.workunit.client.1.vm06.stdout:5/538: truncate d8/db/d54/d55/d80/f96 3468007 0 2026-03-10T06:22:27.802 INFO:tasks.workunit.client.1.vm06.stdout:1/761: mkdir d9/d35/d46/d38/dc6/dd4 0 2026-03-10T06:22:27.804 INFO:tasks.workunit.client.1.vm06.stdout:9/723: creat d21/d32/d4d/dd2/ff1 x:0 0 0 2026-03-10T06:22:27.807 INFO:tasks.workunit.client.1.vm06.stdout:0/732: write d0/dd/d14/d18/d85/dcc/fbe [1760171,124598] 0 2026-03-10T06:22:27.810 INFO:tasks.workunit.client.1.vm06.stdout:6/767: symlink d6/dd/d25/l10a 0 2026-03-10T06:22:27.811 INFO:tasks.workunit.client.1.vm06.stdout:2/632: creat da/d13/d1a/dc7/d86/fc8 x:0 0 0 2026-03-10T06:22:27.811 INFO:tasks.workunit.client.1.vm06.stdout:9/724: write f9 [4080490,39463] 0 2026-03-10T06:22:27.812 INFO:tasks.workunit.client.1.vm06.stdout:0/733: readlink d0/dd/l3b 0 2026-03-10T06:22:27.818 INFO:tasks.workunit.client.1.vm06.stdout:0/734: write d0/dd/d14/d18/d85/dcc/f60 [1327550,82728] 0 2026-03-10T06:22:27.818 INFO:tasks.workunit.client.1.vm06.stdout:5/539: link d8/db/d54/d8a/f53 d8/db/d54/d8a/d39/fae 0 2026-03-10T06:22:27.823 INFO:tasks.workunit.client.1.vm06.stdout:6/768: unlink d6/dd/d25/d33/d5a/d78/dd0/f93 0 2026-03-10T06:22:27.831 INFO:tasks.workunit.client.1.vm06.stdout:9/725: creat d21/d27/d50/d57/db2/d80/d95/ff2 x:0 0 0 2026-03-10T06:22:27.831 
INFO:tasks.workunit.client.1.vm06.stdout:1/762: dwrite d9/dd3/dbf/fc3 [0,4194304] 0 2026-03-10T06:22:27.831 INFO:tasks.workunit.client.1.vm06.stdout:0/735: unlink d0/fa 0 2026-03-10T06:22:27.831 INFO:tasks.workunit.client.1.vm06.stdout:5/540: chown d8/db/d54/d55/f61 3940730 1 2026-03-10T06:22:27.833 INFO:tasks.workunit.client.1.vm06.stdout:9/726: write d21/da2/de6/fe5 [1015771,11552] 0 2026-03-10T06:22:27.838 INFO:tasks.workunit.client.1.vm06.stdout:6/769: unlink d6/dd/d25/l10a 0 2026-03-10T06:22:27.839 INFO:tasks.workunit.client.1.vm06.stdout:2/633: creat da/d13/d1a/fc9 x:0 0 0 2026-03-10T06:22:27.842 INFO:tasks.workunit.client.1.vm06.stdout:0/736: creat d0/dd/d14/d1d/d73/fed x:0 0 0 2026-03-10T06:22:27.844 INFO:tasks.workunit.client.1.vm06.stdout:1/763: getdents d9/d35/d46/d38/dc6/dd4 0 2026-03-10T06:22:27.847 INFO:tasks.workunit.client.1.vm06.stdout:9/727: truncate d21/d32/fcf 401557 0 2026-03-10T06:22:27.847 INFO:tasks.workunit.client.1.vm06.stdout:6/770: truncate d6/df/d70/fa6 1289416 0 2026-03-10T06:22:27.847 INFO:tasks.workunit.client.1.vm06.stdout:2/634: creat da/d13/d1c/d1d/fca x:0 0 0 2026-03-10T06:22:27.848 INFO:tasks.workunit.client.1.vm06.stdout:6/771: chown d6/df/d40/d99 12403 1 2026-03-10T06:22:27.849 INFO:tasks.workunit.client.1.vm06.stdout:1/764: mkdir d9/d35/d46/d38/d63/d83/dc5/dd5 0 2026-03-10T06:22:27.852 INFO:tasks.workunit.client.1.vm06.stdout:9/728: chown d21/da2/da7/d93/dda/fb6 226 1 2026-03-10T06:22:27.853 INFO:tasks.workunit.client.1.vm06.stdout:6/772: write d6/d7/ffc [461730,128033] 0 2026-03-10T06:22:27.857 INFO:tasks.workunit.client.1.vm06.stdout:9/729: stat d21/d27/d50/d57/fb7 0 2026-03-10T06:22:27.864 INFO:tasks.workunit.client.1.vm06.stdout:2/635: link da/d13/d1c/d1d/d44/d53/f78 da/d13/d1a/dc7/fcb 0 2026-03-10T06:22:27.864 INFO:tasks.workunit.client.1.vm06.stdout:1/765: getdents d9/d35/d46/d38/d63/d83/d93 0 2026-03-10T06:22:27.866 INFO:tasks.workunit.client.1.vm06.stdout:6/773: mknod d6/dd/d25/d33/d5a/dd8/c10b 0 
2026-03-10T06:22:27.872 INFO:tasks.workunit.client.1.vm06.stdout:1/766: write d9/d1b/d20/f24 [1220316,55991] 0 2026-03-10T06:22:27.874 INFO:tasks.workunit.client.1.vm06.stdout:0/737: sync 2026-03-10T06:22:27.878 INFO:tasks.workunit.client.1.vm06.stdout:1/767: mkdir d9/d35/d46/d38/d63/dd6 0 2026-03-10T06:22:27.880 INFO:tasks.workunit.client.1.vm06.stdout:6/774: getdents d6/dd/d25/d33/d4d 0 2026-03-10T06:22:27.881 INFO:tasks.workunit.client.1.vm06.stdout:0/738: rename d0/dd/d14/d18/f22 to d0/d3c/dc1/fee 0 2026-03-10T06:22:27.883 INFO:tasks.workunit.client.1.vm06.stdout:6/775: mkdir d6/df/d40/d10c 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:1/768: symlink d9/d35/d46/d38/d63/ld7 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:0/739: mknod d0/da3/dd5/cef 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:0/740: truncate d0/ff 9185874 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:0/741: mknod d0/dd/d14/d18/d85/dcc/d88/d98/cf0 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:1/769: creat d9/d35/d46/d38/d63/d83/dc5/dd5/fd8 x:0 0 0 2026-03-10T06:22:27.895 INFO:tasks.workunit.client.1.vm06.stdout:0/742: creat d0/dd/d14/d1d/d5d/dca/ff1 x:0 0 0 2026-03-10T06:22:27.901 INFO:tasks.workunit.client.1.vm06.stdout:0/743: write d0/d3c/dc1/fee [4396312,42442] 0 2026-03-10T06:22:27.916 INFO:tasks.workunit.client.1.vm06.stdout:0/744: creat d0/dd/d14/ff2 x:0 0 0 2026-03-10T06:22:27.920 INFO:tasks.workunit.client.1.vm06.stdout:6/776: dread d6/dd/d25/d33/d4d/f8c [0,4194304] 0 2026-03-10T06:22:27.922 INFO:tasks.workunit.client.1.vm06.stdout:6/777: dread - d6/dd/d25/d33/d5a/d78/dd0/dc5/fe1 zero size 2026-03-10T06:22:27.929 INFO:tasks.workunit.client.1.vm06.stdout:6/778: rename d6/dd/d25/d33/d5a/d78 to d6/dd/dc2/d10d 0 2026-03-10T06:22:27.929 INFO:tasks.workunit.client.1.vm06.stdout:5/541: write d8/db/d54/d55/f60 [971754,120928] 0 2026-03-10T06:22:27.938 
INFO:tasks.workunit.client.1.vm06.stdout:9/730: dwrite f11 [0,4194304] 0 2026-03-10T06:22:27.938 INFO:tasks.workunit.client.1.vm06.stdout:9/731: write d21/d27/d50/d57/fae [481960,119437] 0 2026-03-10T06:22:27.943 INFO:tasks.workunit.client.1.vm06.stdout:5/542: dwrite d8/f49 [0,4194304] 0 2026-03-10T06:22:27.950 INFO:tasks.workunit.client.1.vm06.stdout:5/543: symlink d8/db/d54/d67/d46/d6e/da2/laf 0 2026-03-10T06:22:27.956 INFO:tasks.workunit.client.1.vm06.stdout:1/770: truncate d9/d1b/d20/f30 572876 0 2026-03-10T06:22:27.958 INFO:tasks.workunit.client.1.vm06.stdout:1/771: mknod d9/d35/d46/db0/cd9 0 2026-03-10T06:22:27.960 INFO:tasks.workunit.client.1.vm06.stdout:1/772: rmdir d9/d35/d46 39 2026-03-10T06:22:27.962 INFO:tasks.workunit.client.1.vm06.stdout:5/544: dwrite d8/db/d54/d8a/d39/f69 [0,4194304] 0 2026-03-10T06:22:27.963 INFO:tasks.workunit.client.1.vm06.stdout:5/545: readlink d8/d9/l13 0 2026-03-10T06:22:27.977 INFO:tasks.workunit.client.1.vm06.stdout:5/546: dwrite d8/db/d54/d67/d46/d6e/faa [0,4194304] 0 2026-03-10T06:22:27.978 INFO:tasks.workunit.client.1.vm06.stdout:1/773: truncate d9/d35/d46/d38/d63/d83/d93/fb5 762728 0 2026-03-10T06:22:27.986 INFO:tasks.workunit.client.1.vm06.stdout:8/592: read d1/df/d11/f47 [2665105,108785] 0 2026-03-10T06:22:27.988 INFO:tasks.workunit.client.1.vm06.stdout:5/547: dwrite d8/db/d54/d8a/d39/f52 [4194304,4194304] 0 2026-03-10T06:22:27.991 INFO:tasks.workunit.client.1.vm06.stdout:5/548: fsync d8/d9/f4b 0 2026-03-10T06:22:27.994 INFO:tasks.workunit.client.1.vm06.stdout:1/774: symlink d9/d62/dc7/lda 0 2026-03-10T06:22:27.994 INFO:tasks.workunit.client.1.vm06.stdout:5/549: dread - d8/f3f zero size 2026-03-10T06:22:27.995 INFO:tasks.workunit.client.1.vm06.stdout:8/593: read d1/df/d20/f51 [25577,34588] 0 2026-03-10T06:22:27.998 INFO:tasks.workunit.client.1.vm06.stdout:5/550: write d8/db/d57/f97 [815347,17799] 0 2026-03-10T06:22:28.002 INFO:tasks.workunit.client.1.vm06.stdout:8/594: getdents d1/df/d20/d21/d5e 0 
2026-03-10T06:22:28.005 INFO:tasks.workunit.client.1.vm06.stdout:5/551: symlink d8/db/d54/d8a/d74/d90/lb0 0 2026-03-10T06:22:28.007 INFO:tasks.workunit.client.1.vm06.stdout:6/779: dread d6/d7/f87 [0,4194304] 0 2026-03-10T06:22:28.013 INFO:tasks.workunit.client.1.vm06.stdout:8/595: read d1/df/d20/f63 [562488,49615] 0 2026-03-10T06:22:28.019 INFO:tasks.workunit.client.1.vm06.stdout:6/780: link d6/dd/d25/f5c d6/dd/dc2/d10d/f10e 0 2026-03-10T06:22:28.020 INFO:tasks.workunit.client.1.vm06.stdout:6/781: fsync d6/d79/fc6 0 2026-03-10T06:22:28.024 INFO:tasks.workunit.client.1.vm06.stdout:1/775: dwrite d9/d35/d46/d38/d8c/f92 [0,4194304] 0 2026-03-10T06:22:28.025 INFO:tasks.workunit.client.1.vm06.stdout:8/596: dwrite d1/d2c/f32 [0,4194304] 0 2026-03-10T06:22:28.032 INFO:tasks.workunit.client.1.vm06.stdout:6/782: write d6/d7/f16 [2528482,4032] 0 2026-03-10T06:22:28.033 INFO:tasks.workunit.client.1.vm06.stdout:8/597: write d1/df/d20/f88 [1502022,79548] 0 2026-03-10T06:22:28.040 INFO:tasks.workunit.client.1.vm06.stdout:8/598: symlink d1/d2c/d5b/lc6 0 2026-03-10T06:22:28.040 INFO:tasks.workunit.client.1.vm06.stdout:1/776: dwrite d9/dd3/f54 [0,4194304] 0 2026-03-10T06:22:28.054 INFO:tasks.workunit.client.1.vm06.stdout:1/777: creat d9/d35/d46/d38/fdb x:0 0 0 2026-03-10T06:22:28.055 INFO:tasks.workunit.client.1.vm06.stdout:1/778: chown d9/d35/d89/fd1 9175912 1 2026-03-10T06:22:28.062 INFO:tasks.workunit.client.1.vm06.stdout:8/599: sync 2026-03-10T06:22:28.064 INFO:tasks.workunit.client.1.vm06.stdout:1/779: unlink d9/d35/d46/d38/d63/l8d 0 2026-03-10T06:22:28.065 INFO:tasks.workunit.client.1.vm06.stdout:8/600: chown d1/df/d20/d21/d7e/d8d/c94 116361358 1 2026-03-10T06:22:28.070 INFO:tasks.workunit.client.1.vm06.stdout:1/780: mknod d9/d35/d46/d38/cdc 0 2026-03-10T06:22:28.070 INFO:tasks.workunit.client.1.vm06.stdout:1/781: fsync d9/d1b/d20/f25 0 2026-03-10T06:22:28.071 INFO:tasks.workunit.client.1.vm06.stdout:8/601: creat d1/df/fc7 x:0 0 0 2026-03-10T06:22:28.071 
INFO:tasks.workunit.client.1.vm06.stdout:6/783: write d6/dd/d25/d33/d4d/f8c [3390166,45101] 0 2026-03-10T06:22:28.087 INFO:tasks.workunit.client.1.vm06.stdout:1/782: mkdir d9/d35/d46/d38/ddd 0 2026-03-10T06:22:28.088 INFO:tasks.workunit.client.1.vm06.stdout:1/783: write d9/d35/d46/d38/d8c/f92 [1267169,24679] 0 2026-03-10T06:22:28.097 INFO:tasks.workunit.client.1.vm06.stdout:1/784: mknod d9/d62/cde 0 2026-03-10T06:22:28.099 INFO:tasks.workunit.client.1.vm06.stdout:6/784: sync 2026-03-10T06:22:28.099 INFO:tasks.workunit.client.1.vm06.stdout:1/785: sync 2026-03-10T06:22:28.101 INFO:tasks.workunit.client.1.vm06.stdout:6/785: readlink d6/dd/d35/l27 0 2026-03-10T06:22:28.119 INFO:tasks.workunit.client.1.vm06.stdout:4/752: dread f2 [8388608,4194304] 0 2026-03-10T06:22:28.121 INFO:tasks.workunit.client.1.vm06.stdout:4/753: chown dd/d41/f95 1982 1 2026-03-10T06:22:28.123 INFO:tasks.workunit.client.1.vm06.stdout:4/754: write dd/d24/d2d/d2f/f42 [3382767,27797] 0 2026-03-10T06:22:28.132 INFO:tasks.workunit.client.1.vm06.stdout:4/755: symlink dd/d24/d2d/d2f/d39/lda 0 2026-03-10T06:22:28.136 INFO:tasks.workunit.client.1.vm06.stdout:4/756: creat dd/d24/d5e/fdb x:0 0 0 2026-03-10T06:22:28.150 INFO:tasks.workunit.client.1.vm06.stdout:8/602: truncate d1/df/d20/fc2 629806 0 2026-03-10T06:22:28.150 INFO:tasks.workunit.client.1.vm06.stdout:8/603: chown d1/df/d20/d35/f42 105829950 1 2026-03-10T06:22:28.155 INFO:tasks.workunit.client.1.vm06.stdout:1/786: write d9/d35/d89/f14 [4176173,131055] 0 2026-03-10T06:22:28.156 INFO:tasks.workunit.client.1.vm06.stdout:6/786: truncate d6/dd/d25/d33/d4d/f8c 300496 0 2026-03-10T06:22:28.160 INFO:tasks.workunit.client.1.vm06.stdout:8/604: truncate d1/df/d11/f1d 1310735 0 2026-03-10T06:22:28.161 INFO:tasks.workunit.client.1.vm06.stdout:6/787: dwrite d6/dd/f5b [0,4194304] 0 2026-03-10T06:22:28.162 INFO:tasks.workunit.client.1.vm06.stdout:4/757: truncate dd/d41/f52 450710 0 2026-03-10T06:22:28.163 INFO:tasks.workunit.client.1.vm06.stdout:8/605: dread - 
d1/df/d20/d21/d5e/f65 zero size 2026-03-10T06:22:28.176 INFO:tasks.workunit.client.1.vm06.stdout:1/787: truncate d9/d1b/d20/fa7 2816863 0 2026-03-10T06:22:28.177 INFO:tasks.workunit.client.1.vm06.stdout:4/758: symlink dd/d41/ldc 0 2026-03-10T06:22:28.181 INFO:tasks.workunit.client.1.vm06.stdout:6/788: unlink d6/fb1 0 2026-03-10T06:22:28.181 INFO:tasks.workunit.client.1.vm06.stdout:1/788: mknod d9/dd3/cdf 0 2026-03-10T06:22:28.183 INFO:tasks.workunit.client.1.vm06.stdout:1/789: chown d9/d35/d46/d38/fbe 6801493 1 2026-03-10T06:22:28.185 INFO:tasks.workunit.client.1.vm06.stdout:1/790: dread - d9/d35/d46/d38/d63/fc2 zero size 2026-03-10T06:22:28.185 INFO:tasks.workunit.client.1.vm06.stdout:6/789: stat d6/d7/c23 0 2026-03-10T06:22:28.187 INFO:tasks.workunit.client.1.vm06.stdout:5/552: dread d8/db/d54/d8a/d74/f78 [0,4194304] 0 2026-03-10T06:22:28.189 INFO:tasks.workunit.client.1.vm06.stdout:6/790: chown d6/dd/dc7/cf0 1982508 1 2026-03-10T06:22:28.191 INFO:tasks.workunit.client.1.vm06.stdout:5/553: read d8/db/d54/d67/d46/fa4 [2230842,100996] 0 2026-03-10T06:22:28.193 INFO:tasks.workunit.client.1.vm06.stdout:4/759: dread dd/d24/f45 [0,4194304] 0 2026-03-10T06:22:28.207 INFO:tasks.workunit.client.1.vm06.stdout:0/745: dread d0/dd/d14/d18/d85/dcc/d88/d98/faf [0,4194304] 0 2026-03-10T06:22:28.211 INFO:tasks.workunit.client.1.vm06.stdout:6/791: rename d6/df/c1c to d6/dd/dc2/d10d/dd0/dec/c10f 0 2026-03-10T06:22:28.213 INFO:tasks.workunit.client.1.vm06.stdout:5/554: dread d8/d9/f47 [0,4194304] 0 2026-03-10T06:22:28.213 INFO:tasks.workunit.client.1.vm06.stdout:5/555: chown d8/db/d54/d8a/d39/f69 1 1 2026-03-10T06:22:28.213 INFO:tasks.workunit.client.1.vm06.stdout:6/792: write d6/dd/d25/d33/d4d/fc0 [864639,58093] 0 2026-03-10T06:22:28.215 INFO:tasks.workunit.client.1.vm06.stdout:4/760: link f8 dd/d24/d9c/fdd 0 2026-03-10T06:22:28.216 INFO:tasks.workunit.client.1.vm06.stdout:1/791: link d9/d35/d89/f9b d9/d35/d89/fe0 0 2026-03-10T06:22:28.217 
INFO:tasks.workunit.client.1.vm06.stdout:6/793: dread - d6/dd/d35/fca zero size 2026-03-10T06:22:28.217 INFO:tasks.workunit.client.1.vm06.stdout:1/792: write d9/d62/f99 [1099218,69799] 0 2026-03-10T06:22:28.221 INFO:tasks.workunit.client.1.vm06.stdout:6/794: fdatasync d6/dd/d25/d2c/f85 0 2026-03-10T06:22:28.224 INFO:tasks.workunit.client.1.vm06.stdout:5/556: rmdir d8/db/d54/d8a/d39/d9e 0 2026-03-10T06:22:28.224 INFO:tasks.workunit.client.1.vm06.stdout:1/793: creat d9/d35/d46/d38/d63/d83/dc5/dd5/fe1 x:0 0 0 2026-03-10T06:22:28.224 INFO:tasks.workunit.client.1.vm06.stdout:4/761: dwrite dd/d33/d47/f93 [0,4194304] 0 2026-03-10T06:22:28.224 INFO:tasks.workunit.client.1.vm06.stdout:5/557: symlink d8/d9/lb1 0 2026-03-10T06:22:28.225 INFO:tasks.workunit.client.1.vm06.stdout:1/794: write d9/d1b/d20/f25 [278461,99181] 0 2026-03-10T06:22:28.225 INFO:tasks.workunit.client.1.vm06.stdout:0/746: sync 2026-03-10T06:22:28.229 INFO:tasks.workunit.client.1.vm06.stdout:4/762: stat dd/d33/d47/d97/db6/dd7/fd9 0 2026-03-10T06:22:28.229 INFO:tasks.workunit.client.1.vm06.stdout:0/747: rename d0/dd/d14/d1d/l23 to d0/dd/d14/d18/d66/lf3 0 2026-03-10T06:22:28.230 INFO:tasks.workunit.client.1.vm06.stdout:6/795: creat d6/d79/d95/db4/f110 x:0 0 0 2026-03-10T06:22:28.233 INFO:tasks.workunit.client.1.vm06.stdout:5/558: getdents d8 0 2026-03-10T06:22:28.233 INFO:tasks.workunit.client.1.vm06.stdout:0/748: creat d0/dd/d14/d1d/d73/ff4 x:0 0 0 2026-03-10T06:22:28.234 INFO:tasks.workunit.client.1.vm06.stdout:4/763: creat dd/d33/da6/fde x:0 0 0 2026-03-10T06:22:28.235 INFO:tasks.workunit.client.1.vm06.stdout:6/796: read - d6/d79/d95/db4/f110 zero size 2026-03-10T06:22:28.235 INFO:tasks.workunit.client.1.vm06.stdout:4/764: rmdir dd/d24/d5e/db0 39 2026-03-10T06:22:28.239 INFO:tasks.workunit.client.1.vm06.stdout:3/710: dread d6/d21/d38/dd0/dd1/f89 [0,4194304] 0 2026-03-10T06:22:28.240 INFO:tasks.workunit.client.1.vm06.stdout:5/559: dread d8/db/d54/d8a/d74/f78 [0,4194304] 0 2026-03-10T06:22:28.240 
INFO:tasks.workunit.client.1.vm06.stdout:4/765: readlink dd/d24/d2d/d2f/d34/d40/l46 0 2026-03-10T06:22:28.243 INFO:tasks.workunit.client.1.vm06.stdout:4/766: mknod dd/d24/d2d/d2f/d34/cdf 0 2026-03-10T06:22:28.244 INFO:tasks.workunit.client.1.vm06.stdout:6/797: symlink d6/dd/dc2/d10d/db9/l111 0 2026-03-10T06:22:28.246 INFO:tasks.workunit.client.1.vm06.stdout:4/767: chown dd/c13 26516 1 2026-03-10T06:22:28.247 INFO:tasks.workunit.client.1.vm06.stdout:1/795: dwrite d9/d62/f76 [0,4194304] 0 2026-03-10T06:22:28.252 INFO:tasks.workunit.client.1.vm06.stdout:3/711: mknod d6/dc/d13/d35/cee 0 2026-03-10T06:22:28.253 INFO:tasks.workunit.client.1.vm06.stdout:5/560: truncate d8/db/d54/d8a/d74/f17 1399219 0 2026-03-10T06:22:28.256 INFO:tasks.workunit.client.1.vm06.stdout:0/749: mknod d0/dd/d14/d18/d85/dcc/d88/cf5 0 2026-03-10T06:22:28.258 INFO:tasks.workunit.client.1.vm06.stdout:3/712: mkdir d6/def 0 2026-03-10T06:22:28.258 INFO:tasks.workunit.client.1.vm06.stdout:6/798: rename d6/d7/f16 to d6/d79/d95/db4/f112 0 2026-03-10T06:22:28.260 INFO:tasks.workunit.client.1.vm06.stdout:3/713: write d6/dc/f69 [4843260,128513] 0 2026-03-10T06:22:28.261 INFO:tasks.workunit.client.1.vm06.stdout:5/561: rmdir d8/db/d54/d8a/d39/d72 39 2026-03-10T06:22:28.262 INFO:tasks.workunit.client.1.vm06.stdout:3/714: readlink d6/d21/d38/d88/dae/ld7 0 2026-03-10T06:22:28.264 INFO:tasks.workunit.client.1.vm06.stdout:0/750: write d0/dd/f48 [1027374,9466] 0 2026-03-10T06:22:28.266 INFO:tasks.workunit.client.1.vm06.stdout:1/796: dwrite d9/d35/d46/d38/d63/d83/d93/f9c [0,4194304] 0 2026-03-10T06:22:28.268 INFO:tasks.workunit.client.1.vm06.stdout:4/768: rename dd/d24/d5e to dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0 0 2026-03-10T06:22:28.268 INFO:tasks.workunit.client.1.vm06.stdout:0/751: fdatasync d0/dd/d14/d18/d85/dcc/d88/d35/d74/fb4 0 2026-03-10T06:22:28.270 INFO:tasks.workunit.client.1.vm06.stdout:3/715: symlink d6/lf0 0 2026-03-10T06:22:28.270 INFO:tasks.workunit.client.1.vm06.stdout:6/799: write d6/d79/d95/db4/f112 
[1030885,68396] 0 2026-03-10T06:22:28.274 INFO:tasks.workunit.client.1.vm06.stdout:6/800: write d6/dd/dc2/d10d/dd0/dc5/fe1 [964724,68328] 0 2026-03-10T06:22:28.274 INFO:tasks.workunit.client.1.vm06.stdout:4/769: symlink dd/d24/d2d/d2f/d34/le1 0 2026-03-10T06:22:28.275 INFO:tasks.workunit.client.1.vm06.stdout:4/770: chown dd/d24/d2d/d2f/d34/c77 2 1 2026-03-10T06:22:28.275 INFO:tasks.workunit.client.1.vm06.stdout:5/562: mknod d8/db/d54/d8a/d39/cb2 0 2026-03-10T06:22:28.276 INFO:tasks.workunit.client.1.vm06.stdout:0/752: dwrite d0/dd/f67 [0,4194304] 0 2026-03-10T06:22:28.279 INFO:tasks.workunit.client.1.vm06.stdout:0/753: readlink d0/dd/d14/l4f 0 2026-03-10T06:22:28.281 INFO:tasks.workunit.client.1.vm06.stdout:4/771: mkdir dd/d33/d47/d97/db6/dbb/de2 0 2026-03-10T06:22:28.285 INFO:tasks.workunit.client.1.vm06.stdout:6/801: mkdir d6/d79/d95/db4/dd4/df5/d113 0 2026-03-10T06:22:28.286 INFO:tasks.workunit.client.1.vm06.stdout:3/716: link d6/dc/d72/c79 d6/d21/d38/dd0/cf1 0 2026-03-10T06:22:28.286 INFO:tasks.workunit.client.1.vm06.stdout:0/754: chown d0/dd/d14/c62 38 1 2026-03-10T06:22:28.286 INFO:tasks.workunit.client.1.vm06.stdout:4/772: mknod dd/d24/d2d/d2f/d39/d71/dc3/dd0/ce3 0 2026-03-10T06:22:28.286 INFO:tasks.workunit.client.1.vm06.stdout:4/773: readlink dd/l85 0 2026-03-10T06:22:28.293 INFO:tasks.workunit.client.1.vm06.stdout:6/802: rename d6/dd/f5b to d6/dd/f114 0 2026-03-10T06:22:28.294 INFO:tasks.workunit.client.1.vm06.stdout:5/563: creat d8/db/d54/d8a/d74/fb3 x:0 0 0 2026-03-10T06:22:28.295 INFO:tasks.workunit.client.1.vm06.stdout:4/774: symlink dd/d24/d2d/d2f/d39/d71/dc3/dd0/le4 0 2026-03-10T06:22:28.298 INFO:tasks.workunit.client.1.vm06.stdout:4/775: truncate dd/d24/d2d/d2f/d34/d40/fb8 290116 0 2026-03-10T06:22:28.305 INFO:tasks.workunit.client.1.vm06.stdout:0/755: symlink d0/dd/d14/d18/d7e/lf6 0 2026-03-10T06:22:28.307 INFO:tasks.workunit.client.1.vm06.stdout:5/564: dwrite d8/db/d54/d8a/d74/f85 [0,4194304] 0 2026-03-10T06:22:28.308 
INFO:tasks.workunit.client.1.vm06.stdout:0/756: write d0/dd/f48 [1330146,38337] 0 2026-03-10T06:22:28.313 INFO:tasks.workunit.client.1.vm06.stdout:0/757: rename d0/dd/d14/d18/d85/dcc/d88/d98/f9a to d0/dd/d1c/ff7 0 2026-03-10T06:22:28.323 INFO:tasks.workunit.client.1.vm06.stdout:5/565: symlink d8/db/lb4 0 2026-03-10T06:22:28.323 INFO:tasks.workunit.client.1.vm06.stdout:0/758: dread d0/dd/f4c [0,4194304] 0 2026-03-10T06:22:28.328 INFO:tasks.workunit.client.1.vm06.stdout:5/566: dwrite d8/db/d54/d67/d46/f98 [0,4194304] 0 2026-03-10T06:22:28.331 INFO:tasks.workunit.client.1.vm06.stdout:4/776: dread dd/f5c [0,4194304] 0 2026-03-10T06:22:28.333 INFO:tasks.workunit.client.1.vm06.stdout:1/797: truncate d9/f2f 3980690 0 2026-03-10T06:22:28.336 INFO:tasks.workunit.client.1.vm06.stdout:5/567: creat d8/db/d54/d8a/d74/d90/fb5 x:0 0 0 2026-03-10T06:22:28.339 INFO:tasks.workunit.client.1.vm06.stdout:5/568: write d8/db/d57/d83/f99 [982398,111468] 0 2026-03-10T06:22:28.339 INFO:tasks.workunit.client.1.vm06.stdout:0/759: symlink d0/dd/d14/d18/d85/dcc/d99/dde/lf8 0 2026-03-10T06:22:28.340 INFO:tasks.workunit.client.1.vm06.stdout:0/760: chown d0/d3c/dc1/dc4/ce8 4 1 2026-03-10T06:22:28.342 INFO:tasks.workunit.client.1.vm06.stdout:0/761: read d0/dd/d14/d18/d66/fcb [389629,58804] 0 2026-03-10T06:22:28.345 INFO:tasks.workunit.client.1.vm06.stdout:3/717: write d6/f84 [402965,15998] 0 2026-03-10T06:22:28.345 INFO:tasks.workunit.client.1.vm06.stdout:3/718: dread - d6/d8/d7f/fd4 zero size 2026-03-10T06:22:28.348 INFO:tasks.workunit.client.1.vm06.stdout:5/569: symlink d8/db/d54/d8a/d39/lb6 0 2026-03-10T06:22:28.350 INFO:tasks.workunit.client.1.vm06.stdout:3/719: mknod d6/d8/cf2 0 2026-03-10T06:22:28.358 INFO:tasks.workunit.client.1.vm06.stdout:0/762: dwrite d0/dd/d14/d1d/d73/ff4 [0,4194304] 0 2026-03-10T06:22:28.361 INFO:tasks.workunit.client.1.vm06.stdout:4/777: write dd/d18/f55 [5431054,21666] 0 2026-03-10T06:22:28.362 INFO:tasks.workunit.client.1.vm06.stdout:8/606: dread d1/f18 [0,4194304] 0 
2026-03-10T06:22:28.372 INFO:tasks.workunit.client.1.vm06.stdout:3/720: unlink d6/dc/d13/d35/cee 0
2026-03-10T06:22:28.372 INFO:tasks.workunit.client.1.vm06.stdout:5/570: rename d8/db/d54/d8a/d74/c4e to d8/db/d57/cb7 0
2026-03-10T06:22:28.372 INFO:tasks.workunit.client.1.vm06.stdout:9/732: dread d21/da2/de6/f2f [0,4194304] 0
2026-03-10T06:22:28.397 INFO:tasks.workunit.client.1.vm06.stdout:4/778: truncate dd/d18/fac 3658967 0
2026-03-10T06:22:28.398 INFO:tasks.workunit.client.1.vm06.stdout:8/607: write d1/fa [5030035,12665] 0
2026-03-10T06:22:28.398 INFO:tasks.workunit.client.1.vm06.stdout:0/763: write d0/dd/d14/d18/d85/dcc/fb6 [4119383,87440] 0
2026-03-10T06:22:28.398 INFO:tasks.workunit.client.1.vm06.stdout:9/733: read d21/da2/da7/d93/f94 [336981,108067] 0
2026-03-10T06:22:28.400 INFO:tasks.workunit.client.1.vm06.stdout:4/779: creat dd/d24/d2d/fe5 x:0 0 0
2026-03-10T06:22:28.403 INFO:tasks.workunit.client.1.vm06.stdout:1/798: dread d9/f86 [0,4194304] 0
2026-03-10T06:22:28.403 INFO:tasks.workunit.client.1.vm06.stdout:3/721: dwrite d6/d8/fb [0,4194304] 0
2026-03-10T06:22:28.408 INFO:tasks.workunit.client.1.vm06.stdout:4/780: creat dd/d24/d5d/fe6 x:0 0 0
2026-03-10T06:22:28.412 INFO:tasks.workunit.client.1.vm06.stdout:4/781: creat dd/d33/d47/d97/fe7 x:0 0 0
2026-03-10T06:22:28.415 INFO:tasks.workunit.client.1.vm06.stdout:3/722: fsync d6/dc/d13/d9d/fe1 0
2026-03-10T06:22:28.416 INFO:tasks.workunit.client.1.vm06.stdout:9/734: rename d21/d27/c48 to d21/d46/ded/cf3 0
2026-03-10T06:22:28.416 INFO:tasks.workunit.client.1.vm06.stdout:8/608: write d1/df/d11/f12 [1002912,119322] 0
2026-03-10T06:22:28.416 INFO:tasks.workunit.client.1.vm06.stdout:0/764: dwrite d0/dd/d14/d18/d85/dcc/f41 [0,4194304] 0
2026-03-10T06:22:28.417 INFO:tasks.workunit.client.1.vm06.stdout:4/782: write dd/d24/d2d/fbe [774352,100763] 0
2026-03-10T06:22:28.417 INFO:tasks.workunit.client.1.vm06.stdout:4/783: dread - dd/d33/d47/d97/fe7 zero size
2026-03-10T06:22:28.418 INFO:tasks.workunit.client.1.vm06.stdout:8/609: mknod d1/d2c/d99/cc8 0
2026-03-10T06:22:28.430 INFO:tasks.workunit.client.1.vm06.stdout:9/735: mkdir d21/da2/da7/d93/dda/df4 0
2026-03-10T06:22:28.431 INFO:tasks.workunit.client.1.vm06.stdout:9/736: readlink d21/da2/da7/le7 0
2026-03-10T06:22:28.431 INFO:tasks.workunit.client.1.vm06.stdout:8/610: creat d1/df/d20/d35/dac/dbf/fc9 x:0 0 0
2026-03-10T06:22:28.434 INFO:tasks.workunit.client.1.vm06.stdout:9/737: unlink d21/d27/d50/d57/fa9 0
2026-03-10T06:22:28.434 INFO:tasks.workunit.client.1.vm06.stdout:8/611: mknod d1/d2c/d90/cca 0
2026-03-10T06:22:28.439 INFO:tasks.workunit.client.1.vm06.stdout:8/612: rename d1/df/d20/d21/d5e/fbb to d1/d2c/d90/fcb 0
2026-03-10T06:22:28.440 INFO:tasks.workunit.client.1.vm06.stdout:8/613: fsync d1/d2c/f8a 0
2026-03-10T06:22:28.444 INFO:tasks.workunit.client.1.vm06.stdout:8/614: unlink d1/df/d58/f6a 0
2026-03-10T06:22:28.445 INFO:tasks.workunit.client.1.vm06.stdout:8/615: write d1/f89 [564218,66744] 0
2026-03-10T06:22:28.447 INFO:tasks.workunit.client.1.vm06.stdout:3/723: sync
2026-03-10T06:22:28.449 INFO:tasks.workunit.client.1.vm06.stdout:9/738: read f14 [5602865,117858] 0
2026-03-10T06:22:28.450 INFO:tasks.workunit.client.1.vm06.stdout:3/724: mknod d6/dc/d41/d6d/cf3 0
2026-03-10T06:22:28.454 INFO:tasks.workunit.client.1.vm06.stdout:3/725: creat d6/dc/d41/d6d/ff4 x:0 0 0
2026-03-10T06:22:28.456 INFO:tasks.workunit.client.1.vm06.stdout:9/739: getdents d21/d32/d4d/d51/dcb 0
2026-03-10T06:22:28.457 INFO:tasks.workunit.client.1.vm06.stdout:3/726: mknod d6/d21/d38/dd0/dd1/d90/dc7/cf5 0
2026-03-10T06:22:28.458 INFO:tasks.workunit.client.1.vm06.stdout:9/740: truncate d21/d27/d50/d57/db2/faa 1181975 0
2026-03-10T06:22:28.458 INFO:tasks.workunit.client.1.vm06.stdout:3/727: dread - d6/d8/d7f/fd4 zero size
2026-03-10T06:22:28.460 INFO:tasks.workunit.client.1.vm06.stdout:3/728: chown d6/f63 7474215 1
2026-03-10T06:22:28.462 INFO:tasks.workunit.client.1.vm06.stdout:3/729: symlink d6/d21/d38/d88/dae/lf6 0
2026-03-10T06:22:28.463 INFO:tasks.workunit.client.1.vm06.stdout:0/765: dread d0/da3/dd5/ddc/fe3 [0,4194304] 0
2026-03-10T06:22:28.464 INFO:tasks.workunit.client.1.vm06.stdout:0/766: write d0/dd/d14/d18/d85/dcc/dab/fc0 [4396866,55010] 0
2026-03-10T06:22:28.465 INFO:tasks.workunit.client.1.vm06.stdout:3/730: stat d6/dc/d13/d9d/d54/fb8 0
2026-03-10T06:22:28.466 INFO:tasks.workunit.client.1.vm06.stdout:9/741: dwrite d21/da2/de6/fe5 [0,4194304] 0
2026-03-10T06:22:28.467 INFO:tasks.workunit.client.1.vm06.stdout:0/767: chown d0/dd/f24 19593504 1
2026-03-10T06:22:28.468 INFO:tasks.workunit.client.1.vm06.stdout:0/768: write d0/f61 [1763906,130222] 0
2026-03-10T06:22:28.472 INFO:tasks.workunit.client.1.vm06.stdout:9/742: fsync d21/da2/da7/fe2 0
2026-03-10T06:22:28.475 INFO:tasks.workunit.client.1.vm06.stdout:4/784: dwrite dd/d24/d2d/d2f/d39/f62 [0,4194304] 0
2026-03-10T06:22:28.478 INFO:tasks.workunit.client.1.vm06.stdout:4/785: dread f2 [8388608,4194304] 0
2026-03-10T06:22:28.480 INFO:tasks.workunit.client.1.vm06.stdout:0/769: creat d0/dd/d14/d18/d85/dcc/dab/ff9 x:0 0 0
2026-03-10T06:22:28.480 INFO:tasks.workunit.client.1.vm06.stdout:4/786: chown dd/d24/d2d/d2f/d34/d40/f89 2 1
2026-03-10T06:22:28.486 INFO:tasks.workunit.client.1.vm06.stdout:7/773: read d19/d3b/d5b/f69 [1325774,88334] 0
2026-03-10T06:22:28.487 INFO:tasks.workunit.client.1.vm06.stdout:9/743: creat d21/d27/d50/d57/db2/ff5 x:0 0 0
2026-03-10T06:22:28.488 INFO:tasks.workunit.client.1.vm06.stdout:0/770: truncate d0/dd/d14/f31 991480 0
2026-03-10T06:22:28.489 INFO:tasks.workunit.client.1.vm06.stdout:0/771: fsync d0/d3c/dc1/d3d/f82 0
2026-03-10T06:22:28.490 INFO:tasks.workunit.client.1.vm06.stdout:0/772: read - d0/dd/d14/d18/d85/dcc/d99/fdd zero size
2026-03-10T06:22:28.493 INFO:tasks.workunit.client.1.vm06.stdout:0/773: dread d0/dd/f95 [0,4194304] 0
2026-03-10T06:22:28.498 INFO:tasks.workunit.client.1.vm06.stdout:8/616: dwrite d1/df/d20/d21/d5e/f73 [0,4194304] 0
2026-03-10T06:22:28.501 INFO:tasks.workunit.client.1.vm06.stdout:4/787: link c7 dd/d18/ce8 0
2026-03-10T06:22:28.507 INFO:tasks.workunit.client.1.vm06.stdout:0/774: dread - d0/dd/d14/d18/d85/dcc/fa1 zero size
2026-03-10T06:22:28.510 INFO:tasks.workunit.client.1.vm06.stdout:4/788: rmdir dd/d33/d47/d97/db6/dbb 39
2026-03-10T06:22:28.511 INFO:tasks.workunit.client.1.vm06.stdout:3/731: write d6/d21/f58 [413764,92243] 0
2026-03-10T06:22:28.523 INFO:tasks.workunit.client.1.vm06.stdout:3/732: symlink d6/dc/d13/d51/lf7 0
2026-03-10T06:22:28.523 INFO:tasks.workunit.client.1.vm06.stdout:7/774: dread d19/d3b/f43 [0,4194304] 0
2026-03-10T06:22:28.524 INFO:tasks.workunit.client.1.vm06.stdout:3/733: fdatasync d6/d21/d38/f6c 0
2026-03-10T06:22:28.528 INFO:tasks.workunit.client.1.vm06.stdout:0/775: link d0/d3c/dc1/dc4/ce8 d0/dd/d1c/da2/cfa 0
2026-03-10T06:22:28.536 INFO:tasks.workunit.client.1.vm06.stdout:0/776: symlink d0/dd/d14/d18/d85/dcc/d88/d98/lfb 0
2026-03-10T06:22:28.538 INFO:tasks.workunit.client.1.vm06.stdout:8/617: write d1/df/d11/f48 [2460300,117233] 0
2026-03-10T06:22:28.541 INFO:tasks.workunit.client.1.vm06.stdout:8/618: truncate d1/df/d20/fe 2359825 0
2026-03-10T06:22:28.542 INFO:tasks.workunit.client.1.vm06.stdout:8/619: creat d1/d3b/db3/fcc x:0 0 0
2026-03-10T06:22:28.543 INFO:tasks.workunit.client.1.vm06.stdout:8/620: stat d1/d3b/db3/fbc 0
2026-03-10T06:22:28.546 INFO:tasks.workunit.client.1.vm06.stdout:0/777: dwrite d0/dd/d14/d18/d85/dcc/d5e/f86 [0,4194304] 0
2026-03-10T06:22:28.552 INFO:tasks.workunit.client.1.vm06.stdout:4/789: dwrite dd/f81 [4194304,4194304] 0
2026-03-10T06:22:28.555 INFO:tasks.workunit.client.1.vm06.stdout:3/734: dwrite d6/dc/d13/d9d/f57 [0,4194304] 0
2026-03-10T06:22:28.562 INFO:tasks.workunit.client.1.vm06.stdout:0/778: mknod d0/dd/d14/d18/d85/dcc/d88/d47/d4d/cfc 0
2026-03-10T06:22:28.570 INFO:tasks.workunit.client.1.vm06.stdout:8/621: dwrite d1/f13 [4194304,4194304] 0
2026-03-10T06:22:28.577 INFO:tasks.workunit.client.1.vm06.stdout:3/735: chown d6/dc/d13/d9d/d54/cd8 337066 1
2026-03-10T06:22:28.579 INFO:tasks.workunit.client.1.vm06.stdout:4/790: fsync dd/f5c 0
2026-03-10T06:22:28.580 INFO:tasks.workunit.client.1.vm06.stdout:0/779: symlink d0/d3c/dc1/d7d/lfd 0
2026-03-10T06:22:28.580 INFO:tasks.workunit.client.1.vm06.stdout:7/775: dwrite d19/d3b/d41/f48 [0,4194304] 0
2026-03-10T06:22:28.580 INFO:tasks.workunit.client.1.vm06.stdout:4/791: write dd/d24/d2d/d2f/d34/d83/fcc [373432,106321] 0
2026-03-10T06:22:28.580 INFO:tasks.workunit.client.1.vm06.stdout:0/780: read d0/dd/d14/d18/d85/dcc/d99/f9c [600825,43817] 0
2026-03-10T06:22:28.581 INFO:tasks.workunit.client.1.vm06.stdout:8/622: creat d1/df/d11/da1/fcd x:0 0 0
2026-03-10T06:22:28.585 INFO:tasks.workunit.client.1.vm06.stdout:7/776: creat d19/db0/f100 x:0 0 0
2026-03-10T06:22:28.587 INFO:tasks.workunit.client.1.vm06.stdout:3/736: creat d6/d21/d38/dd0/dd1/d90/ff8 x:0 0 0
2026-03-10T06:22:28.591 INFO:tasks.workunit.client.1.vm06.stdout:9/744: dread d21/f34 [0,4194304] 0
2026-03-10T06:22:28.591 INFO:tasks.workunit.client.1.vm06.stdout:3/737: fsync d6/f84 0
2026-03-10T06:22:28.598 INFO:tasks.workunit.client.1.vm06.stdout:3/738: dwrite d6/d21/d38/dd0/dd1/d90/fe9 [0,4194304] 0
2026-03-10T06:22:28.598 INFO:tasks.workunit.client.1.vm06.stdout:0/781: sync
2026-03-10T06:22:28.599 INFO:tasks.workunit.client.1.vm06.stdout:7/777: sync
2026-03-10T06:22:28.603 INFO:tasks.workunit.client.1.vm06.stdout:7/778: write d19/d3b/d41/d4c/fcf [49307,39425] 0
2026-03-10T06:22:28.609 INFO:tasks.workunit.client.1.vm06.stdout:9/745: unlink d21/d32/d4d/cb5 0
2026-03-10T06:22:28.616 INFO:tasks.workunit.client.1.vm06.stdout:3/739: unlink d6/d4f/ca7 0
2026-03-10T06:22:28.627 INFO:tasks.workunit.client.1.vm06.stdout:3/740: write d6/d8/d7f/fc9 [784630,96792] 0
2026-03-10T06:22:28.627 INFO:tasks.workunit.client.1.vm06.stdout:9/746: mkdir d21/d27/d56/df6 0
2026-03-10T06:22:28.627 INFO:tasks.workunit.client.1.vm06.stdout:9/747: chown d21/d32/c84 4053760 1
2026-03-10T06:22:28.627 INFO:tasks.workunit.client.1.vm06.stdout:0/782: creat d0/dd/d14/d18/deb/ffe x:0 0 0
2026-03-10T06:22:28.634 INFO:tasks.workunit.client.1.vm06.stdout:9/748: symlink d21/d27/d50/d57/lf7 0
2026-03-10T06:22:28.644 INFO:tasks.workunit.client.1.vm06.stdout:3/741: mknod d6/d21/dbc/cf9 0
2026-03-10T06:22:28.644 INFO:tasks.workunit.client.1.vm06.stdout:3/742: write d6/dc/d13/d9d/d54/fcc [1054542,8922] 0
2026-03-10T06:22:28.644 INFO:tasks.workunit.client.1.vm06.stdout:4/792: dread dd/d18/f1d [0,4194304] 0
2026-03-10T06:22:28.644 INFO:tasks.workunit.client.1.vm06.stdout:4/793: truncate dd/d24/d2d/fbe 1478337 0
2026-03-10T06:22:28.644 INFO:tasks.workunit.client.1.vm06.stdout:7/779: dread d19/d3b/d41/d42/d62/d80/d82/fae [0,4194304] 0
2026-03-10T06:22:28.651 INFO:tasks.workunit.client.1.vm06.stdout:3/743: getdents d6/dc/d13/d51 0
2026-03-10T06:22:28.656 INFO:tasks.workunit.client.1.vm06.stdout:3/744: rmdir d6/dc/d13/d9d 39
2026-03-10T06:22:28.667 INFO:tasks.workunit.client.1.vm06.stdout:3/745: rename d6/lac to d6/d21/d38/d88/lfa 0
2026-03-10T06:22:28.680 INFO:tasks.workunit.client.1.vm06.stdout:1/799: fdatasync d9/d35/d46/d38/d63/d83/d93/f9c 0
2026-03-10T06:22:28.680 INFO:tasks.workunit.client.1.vm06.stdout:7/780: read d19/d3b/d41/d4c/f55 [187183,3208] 0
2026-03-10T06:22:28.680 INFO:tasks.workunit.client.1.vm06.stdout:8/623: write d1/df/d11/f59 [404706,67793] 0
2026-03-10T06:22:28.687 INFO:tasks.workunit.client.1.vm06.stdout:7/781: link d19/d3b/d41/d42/d62/c89 d19/d3b/d41/d42/d52/d9f/c101 0
2026-03-10T06:22:28.687 INFO:tasks.workunit.client.1.vm06.stdout:1/800: dread d9/d62/f76 [0,4194304] 0
2026-03-10T06:22:28.689 INFO:tasks.workunit.client.1.vm06.stdout:7/782: stat d19/d3b/d41/d4c/ffc 0
2026-03-10T06:22:28.690 INFO:tasks.workunit.client.1.vm06.stdout:8/624: dwrite d1/d3b/da9/dab/fb2 [0,4194304] 0
2026-03-10T06:22:28.703 INFO:tasks.workunit.client.1.vm06.stdout:1/801: unlink d9/la 0
2026-03-10T06:22:28.704 INFO:tasks.workunit.client.1.vm06.stdout:7/783: creat d19/d3b/d41/d42/d62/d80/da1/f102 x:0 0 0
2026-03-10T06:22:28.709 INFO:tasks.workunit.client.1.vm06.stdout:0/783: dwrite d0/dd/d14/f31 [0,4194304] 0
2026-03-10T06:22:28.715 INFO:tasks.workunit.client.1.vm06.stdout:3/746: write d6/d8/f62 [895794,129819] 0
2026-03-10T06:22:28.715 INFO:tasks.workunit.client.1.vm06.stdout:1/802: creat d9/d35/d46/d38/d8c/fe2 x:0 0 0
2026-03-10T06:22:28.715 INFO:tasks.workunit.client.1.vm06.stdout:1/803: chown d9/d35/c48 0 1
2026-03-10T06:22:28.715 INFO:tasks.workunit.client.1.vm06.stdout:0/784: dwrite d0/dd/d14/d1d/d5d/dca/ff1 [0,4194304] 0
2026-03-10T06:22:28.720 INFO:tasks.workunit.client.1.vm06.stdout:1/804: write d9/d35/f5c [1320650,130381] 0
2026-03-10T06:22:28.722 INFO:tasks.workunit.client.1.vm06.stdout:8/625: rename d1/d2c/l9d to d1/df/d58/lce 0
2026-03-10T06:22:28.726 INFO:tasks.workunit.client.1.vm06.stdout:3/747: symlink d6/d21/d38/d88/dde/lfb 0
2026-03-10T06:22:28.726 INFO:tasks.workunit.client.1.vm06.stdout:7/784: readlink d19/d3b/lf0 0
2026-03-10T06:22:28.727 INFO:tasks.workunit.client.1.vm06.stdout:0/785: chown d0/dd/d14/l17 722324290 1
2026-03-10T06:22:28.727 INFO:tasks.workunit.client.1.vm06.stdout:3/748: read d6/d21/d38/dd0/dd1/d90/fe0 [258955,124650] 0
2026-03-10T06:22:28.736 INFO:tasks.workunit.client.1.vm06.stdout:3/749: rmdir d6/d21/dbc 39
2026-03-10T06:22:28.740 INFO:tasks.workunit.client.1.vm06.stdout:0/786: mknod d0/dd/d14/d18/d7e/cff 0
2026-03-10T06:22:28.742 INFO:tasks.workunit.client.1.vm06.stdout:3/750: symlink d6/d1a/lfc 0
2026-03-10T06:22:28.742 INFO:tasks.workunit.client.1.vm06.stdout:8/626: sync
2026-03-10T06:22:28.743 INFO:tasks.workunit.client.1.vm06.stdout:1/805: link d9/d35/d46/d38/dc6/lcd d9/d35/le3 0
2026-03-10T06:22:28.747 INFO:tasks.workunit.client.1.vm06.stdout:8/627: truncate d1/df/d58/f86 617007 0
2026-03-10T06:22:28.751 INFO:tasks.workunit.client.1.vm06.stdout:0/787: dread d0/dd/d14/d18/d85/dcc/d88/d35/f7f [0,4194304] 0
2026-03-10T06:22:28.759 INFO:tasks.workunit.client.1.vm06.stdout:1/806: link d9/dd3/dbf/fc3 d9/d35/d46/d38/dc6/dd4/fe4 0
2026-03-10T06:22:28.760 INFO:tasks.workunit.client.1.vm06.stdout:3/751: dread d6/f5c [0,4194304] 0
2026-03-10T06:22:28.761 INFO:tasks.workunit.client.1.vm06.stdout:0/788: symlink d0/dd/d14/d18/d85/dcc/dab/l100 0
2026-03-10T06:22:28.762 INFO:tasks.workunit.client.1.vm06.stdout:3/752: readlink d6/d21/d38/d88/dae/lf6 0
2026-03-10T06:22:28.763 INFO:tasks.workunit.client.1.vm06.stdout:8/628: link d1/d2c/c34 d1/df/d20/d35/dac/dbf/ccf 0
2026-03-10T06:22:28.765 INFO:tasks.workunit.client.1.vm06.stdout:3/753: rename d6/d21/d38/dd0/dd1/d90/dc7/cc8 to d6/dc/d72/cfd 0
2026-03-10T06:22:28.768 INFO:tasks.workunit.client.1.vm06.stdout:1/807: link d9/d35/d46/d38/d63/d83/d93/f9c d9/d35/d46/d38/d63/d83/d93/fe5 0
2026-03-10T06:22:28.776 INFO:tasks.workunit.client.1.vm06.stdout:3/754: dwrite d6/d21/d38/dd0/feb [4194304,4194304] 0
2026-03-10T06:22:28.777 INFO:tasks.workunit.client.1.vm06.stdout:1/808: mknod d9/d35/d46/d38/dc6/ce6 0
2026-03-10T06:22:28.783 INFO:tasks.workunit.client.1.vm06.stdout:3/755: mkdir d6/d8/d7f/da1/dfe 0
2026-03-10T06:22:28.784 INFO:tasks.workunit.client.1.vm06.stdout:3/756: chown d6/d21/d38/dd0/dd1/d90/dc7 6463516 1
2026-03-10T06:22:28.790 INFO:tasks.workunit.client.1.vm06.stdout:3/757: dwrite d6/d8/fb [0,4194304] 0
2026-03-10T06:22:28.792 INFO:tasks.workunit.client.1.vm06.stdout:3/758: chown d6/dc/d41/fb7 108564749 1
2026-03-10T06:22:28.792 INFO:tasks.workunit.client.1.vm06.stdout:3/759: stat d6/d21/d38/d88/dae/lf6 0
2026-03-10T06:22:28.798 INFO:tasks.workunit.client.1.vm06.stdout:1/809: mkdir d9/d1b/de7 0
2026-03-10T06:22:28.798 INFO:tasks.workunit.client.1.vm06.stdout:1/810: readlink d9/d35/d46/d38/l61 0
2026-03-10T06:22:28.799 INFO:tasks.workunit.client.1.vm06.stdout:1/811: write d9/d35/faf [4696570,94286] 0
2026-03-10T06:22:28.800 INFO:tasks.workunit.client.1.vm06.stdout:0/789: dread d0/dd/d14/d18/d85/dcc/d88/fa0 [0,4194304] 0
2026-03-10T06:22:28.800 INFO:tasks.workunit.client.1.vm06.stdout:0/790: stat d0/dd/d14/d18/deb 0
2026-03-10T06:22:28.802 INFO:tasks.workunit.client.1.vm06.stdout:2/636: dread da/d13/f1f [0,4194304] 0
2026-03-10T06:22:28.807 INFO:tasks.workunit.client.1.vm06.stdout:3/760: fsync d6/d21/f99 0
2026-03-10T06:22:28.811 INFO:tasks.workunit.client.1.vm06.stdout:0/791: creat d0/da3/dd5/f101 x:0 0 0
2026-03-10T06:22:28.813 INFO:tasks.workunit.client.1.vm06.stdout:3/761: rmdir d6/d1a/d5b/dbd 39
2026-03-10T06:22:28.814 INFO:tasks.workunit.client.1.vm06.stdout:0/792: symlink d0/dd/d14/d1d/d5d/dca/dd8/l102 0
2026-03-10T06:22:28.820 INFO:tasks.workunit.client.1.vm06.stdout:3/762: creat d6/d21/d38/dd0/dd1/d90/dc7/fff x:0 0 0
2026-03-10T06:22:28.820 INFO:tasks.workunit.client.1.vm06.stdout:0/793: truncate d0/da3/dd5/fd7 560049 0
2026-03-10T06:22:28.820 INFO:tasks.workunit.client.1.vm06.stdout:3/763: chown d6/d8/f48 1 1
2026-03-10T06:22:28.820 INFO:tasks.workunit.client.1.vm06.stdout:7/785: dwrite d19/d3b/d41/d42/d62/fb5 [0,4194304] 0
2026-03-10T06:22:28.822 INFO:tasks.workunit.client.1.vm06.stdout:2/637: sync
2026-03-10T06:22:28.826 INFO:tasks.workunit.client.1.vm06.stdout:5/571: dread d8/db/d54/d8a/d74/f66 [0,4194304] 0
2026-03-10T06:22:28.831 INFO:tasks.workunit.client.1.vm06.stdout:9/749: dread d21/f2a [0,4194304] 0
2026-03-10T06:22:28.836 INFO:tasks.workunit.client.1.vm06.stdout:7/786: mknod d19/d3b/d41/d42/d52/d9f/c103 0
2026-03-10T06:22:28.836 INFO:tasks.workunit.client.1.vm06.stdout:2/638: creat da/d13/d1c/d1d/d44/d53/d61/d68/fcc x:0 0 0
2026-03-10T06:22:28.836 INFO:tasks.workunit.client.1.vm06.stdout:9/750: creat d21/da2/ff8 x:0 0 0
2026-03-10T06:22:28.837 INFO:tasks.workunit.client.1.vm06.stdout:7/787: mkdir d19/d3b/d41/d72/d104 0
2026-03-10T06:22:28.837 INFO:tasks.workunit.client.1.vm06.stdout:8/629: write d1/d3b/d5c/f6f [492355,93139] 0
2026-03-10T06:22:28.838 INFO:tasks.workunit.client.1.vm06.stdout:8/630: truncate d1/d3b/da9/dab/fb2 4284559 0
2026-03-10T06:22:28.842 INFO:tasks.workunit.client.1.vm06.stdout:6/803: dread d6/dd/d25/d33/f5d [0,4194304] 0
2026-03-10T06:22:28.842 INFO:tasks.workunit.client.1.vm06.stdout:0/794: dread d0/d3c/dc1/f4e [0,4194304] 0
2026-03-10T06:22:28.843 INFO:tasks.workunit.client.1.vm06.stdout:3/764: dread d6/d21/fa4 [0,4194304] 0
2026-03-10T06:22:28.843 INFO:tasks.workunit.client.1.vm06.stdout:9/751: rmdir d21/d27/d50/d57/db2/d80/d95/d9b/dd0 39
2026-03-10T06:22:28.843 INFO:tasks.workunit.client.1.vm06.stdout:6/804: fdatasync d6/dd/d25/d4e/f60 0
2026-03-10T06:22:28.845 INFO:tasks.workunit.client.1.vm06.stdout:6/805: dread - d6/dd/d25/ff8 zero size
2026-03-10T06:22:28.845 INFO:tasks.workunit.client.1.vm06.stdout:7/788: rename d19/d3b/d41/d42/d62/d80/d82/f90 to d19/d3b/d41/da9/f105 0
2026-03-10T06:22:28.846 INFO:tasks.workunit.client.1.vm06.stdout:0/795: stat d0/dd/d14/d18/d85/dcc/d88/d47/lda 0
2026-03-10T06:22:28.846 INFO:tasks.workunit.client.1.vm06.stdout:6/806: stat d6/d79/c7c 0
2026-03-10T06:22:28.847 INFO:tasks.workunit.client.1.vm06.stdout:7/789: write fa [3405277,69444] 0
2026-03-10T06:22:28.847 INFO:tasks.workunit.client.1.vm06.stdout:0/796: chown d0/dd/d14/d18/d7e/lf6 6754777 1
2026-03-10T06:22:28.852 INFO:tasks.workunit.client.1.vm06.stdout:7/790: chown d19/d3b/d41/d4c/fcf 102702 1
2026-03-10T06:22:28.854 INFO:tasks.workunit.client.1.vm06.stdout:3/765: mkdir d6/dc/d13/d9d/d100 0
2026-03-10T06:22:28.854 INFO:tasks.workunit.client.1.vm06.stdout:9/752: rename d21/fd7 to d21/d27/d50/d57/db2/d80/d95/d9b/dd0/ff9 0
2026-03-10T06:22:28.856 INFO:tasks.workunit.client.1.vm06.stdout:6/807: mknod d6/df/d40/d10c/c115 0
2026-03-10T06:22:28.856 INFO:tasks.workunit.client.1.vm06.stdout:8/631: creat d1/df/fd0 x:0 0 0
2026-03-10T06:22:28.858 INFO:tasks.workunit.client.1.vm06.stdout:5/572: sync
2026-03-10T06:22:28.858 INFO:tasks.workunit.client.1.vm06.stdout:2/639: sync
2026-03-10T06:22:28.860 INFO:tasks.workunit.client.1.vm06.stdout:0/797: dwrite d0/da3/dd5/f101 [0,4194304] 0
2026-03-10T06:22:28.865 INFO:tasks.workunit.client.1.vm06.stdout:8/632: mknod d1/d2c/cd1 0
2026-03-10T06:22:28.867 INFO:tasks.workunit.client.1.vm06.stdout:8/633: stat d1/d3b/da9/dab/fb2 0
2026-03-10T06:22:28.872 INFO:tasks.workunit.client.1.vm06.stdout:7/791: mkdir d19/d3b/d41/d42/d62/d80/da1/de3/d106 0
2026-03-10T06:22:28.872 INFO:tasks.workunit.client.1.vm06.stdout:6/808: dwrite d6/df/d70/fed [0,4194304] 0
2026-03-10T06:22:28.872 INFO:tasks.workunit.client.1.vm06.stdout:5/573: mknod d8/db/d54/d67/d46/d68/cb8 0
2026-03-10T06:22:28.872 INFO:tasks.workunit.client.1.vm06.stdout:1/812: write d9/d35/d89/f9b [3720354,20932] 0
2026-03-10T06:22:28.872 INFO:tasks.workunit.client.1.vm06.stdout:2/640: sync
2026-03-10T06:22:28.875 INFO:tasks.workunit.client.1.vm06.stdout:2/641: truncate da/d13/d1a/dc7/daf/d56/db9/fc2 986305 0
2026-03-10T06:22:28.876 INFO:tasks.workunit.client.1.vm06.stdout:2/642: chown da/d13/d1c/d43/f7a 57 1
2026-03-10T06:22:28.878 INFO:tasks.workunit.client.1.vm06.stdout:2/643: fdatasync da/d13/d1a/d39/f2f 0
2026-03-10T06:22:28.883 INFO:tasks.workunit.client.1.vm06.stdout:8/634: mkdir d1/df/d11/da1/dd2 0
2026-03-10T06:22:28.885 INFO:tasks.workunit.client.1.vm06.stdout:8/635: fdatasync d1/f75 0
2026-03-10T06:22:28.886 INFO:tasks.workunit.client.1.vm06.stdout:2/644: creat da/d13/d1c/d43/fcd x:0 0 0
2026-03-10T06:22:28.887 INFO:tasks.workunit.client.1.vm06.stdout:1/813: mkdir d9/d35/d46/d38/d63/dd6/de8 0
2026-03-10T06:22:28.888 INFO:tasks.workunit.client.1.vm06.stdout:5/574: creat d8/db/d54/d67/d46/fb9 x:0 0 0
2026-03-10T06:22:28.889 INFO:tasks.workunit.client.1.vm06.stdout:2/645: symlink da/d13/d1c/d1d/d44/d53/d61/d68/lce 0
2026-03-10T06:22:28.890 INFO:tasks.workunit.client.1.vm06.stdout:2/646: write da/d13/d1c/d7d/f81 [1539420,81916] 0
2026-03-10T06:22:28.890 INFO:tasks.workunit.client.1.vm06.stdout:2/647: readlink da/d13/d1c/d1d/d44/d46/l9f 0
2026-03-10T06:22:28.892 INFO:tasks.workunit.client.1.vm06.stdout:1/814: rename d9/d35/d46/d38/d63/d83/fb2 to d9/d1b/d20/db3/fe9 0
2026-03-10T06:22:28.892 INFO:tasks.workunit.client.1.vm06.stdout:0/798: read d0/dd/f10 [364444,25987] 0
2026-03-10T06:22:28.894 INFO:tasks.workunit.client.1.vm06.stdout:5/575: mknod d8/db/d54/d55/cba 0
2026-03-10T06:22:28.894 INFO:tasks.workunit.client.1.vm06.stdout:1/815: mknod d9/d62/dc7/cea 0
2026-03-10T06:22:28.895 INFO:tasks.workunit.client.1.vm06.stdout:1/816: readlink d9/d35/d46/d38/l5d 0
2026-03-10T06:22:28.895 INFO:tasks.workunit.client.1.vm06.stdout:1/817: chown d9/d62/cad 110091 1
2026-03-10T06:22:28.897 INFO:tasks.workunit.client.1.vm06.stdout:2/648: creat da/d13/d1c/d1d/d44/d53/d61/fcf x:0 0 0
2026-03-10T06:22:28.898 INFO:tasks.workunit.client.1.vm06.stdout:1/818: creat d9/d35/d46/d38/d63/feb x:0 0 0
2026-03-10T06:22:28.901 INFO:tasks.workunit.client.1.vm06.stdout:0/799: sync
2026-03-10T06:22:28.910 INFO:tasks.workunit.client.1.vm06.stdout:0/800: rename d0/d3c/dc1/dc4 to d0/dd/d14/d18/d85/dcc/d88/d98/d103 0
2026-03-10T06:22:28.910 INFO:tasks.workunit.client.1.vm06.stdout:1/819: symlink d9/d35/d46/d38/ddd/lec 0
2026-03-10T06:22:28.913 INFO:tasks.workunit.client.1.vm06.stdout:5/576: dwrite d8/db/d54/d8a/d39/d72/f8b [0,4194304] 0
2026-03-10T06:22:28.917 INFO:tasks.workunit.client.1.vm06.stdout:0/801: rename d0/dd/d14/d18/d85/dcc/d5e/dbb/fc3 to d0/dd/d14/d18/d7e/f104 0
2026-03-10T06:22:28.922 INFO:tasks.workunit.client.1.vm06.stdout:3/766: rmdir d6/dc/d13 39
2026-03-10T06:22:28.931 INFO:tasks.workunit.client.1.vm06.stdout:1/820: rename d9/d35/d46/d38/d8c/fc1 to d9/d35/d46/d38/d63/d83/dc5/dd5/fed 0
2026-03-10T06:22:28.932 INFO:tasks.workunit.client.1.vm06.stdout:8/636: getdents d1/d2c 0
2026-03-10T06:22:28.933 INFO:tasks.workunit.client.1.vm06.stdout:5/577: truncate d8/fa5 506206 0
2026-03-10T06:22:28.934 INFO:tasks.workunit.client.1.vm06.stdout:1/821: read d9/d35/d46/d38/d63/d83/d93/fe5 [1365606,126044] 0
2026-03-10T06:22:28.936 INFO:tasks.workunit.client.1.vm06.stdout:0/802: rename d0/dd/f48 to d0/dd/d14/d18/d85/f105 0
2026-03-10T06:22:28.936 INFO:tasks.workunit.client.1.vm06.stdout:1/822: write d9/d1b/f81 [701767,42784] 0
2026-03-10T06:22:28.936 INFO:tasks.workunit.client.1.vm06.stdout:0/803: fsync d0/dd/f24 0
2026-03-10T06:22:28.937 INFO:tasks.workunit.client.1.vm06.stdout:9/753: dwrite d21/d27/d50/d57/db2/d7f/fc6 [0,4194304] 0
2026-03-10T06:22:28.946 INFO:tasks.workunit.client.1.vm06.stdout:6/809: dwrite d6/dd/dc2/d10d/ffa [0,4194304] 0
2026-03-10T06:22:28.947 INFO:tasks.workunit.client.1.vm06.stdout:5/578: creat d8/db/d54/d55/d80/fbb x:0 0 0
2026-03-10T06:22:28.952 INFO:tasks.workunit.client.1.vm06.stdout:9/754: rmdir d21/d27 39
2026-03-10T06:22:28.955 INFO:tasks.workunit.client.1.vm06.stdout:0/804: mknod d0/d3c/dc1/d7d/c106 0
2026-03-10T06:22:28.957 INFO:tasks.workunit.client.1.vm06.stdout:5/579: sync
2026-03-10T06:22:28.958 INFO:tasks.workunit.client.1.vm06.stdout:8/637: truncate d1/df/d11/f1d 1933728 0
2026-03-10T06:22:28.960 INFO:tasks.workunit.client.1.vm06.stdout:2/649: dwrite da/d13/d1a/f21 [0,4194304] 0
2026-03-10T06:22:28.967 INFO:tasks.workunit.client.1.vm06.stdout:3/767: dread d6/dc/d13/d35/f5a [4194304,4194304] 0
2026-03-10T06:22:28.967 INFO:tasks.workunit.client.1.vm06.stdout:7/792: dread d19/d3b/d41/d42/f78 [0,4194304] 0
2026-03-10T06:22:28.970 INFO:tasks.workunit.client.1.vm06.stdout:0/805: mkdir d0/dd/d14/d18/d85/d107 0
2026-03-10T06:22:28.970 INFO:tasks.workunit.client.1.vm06.stdout:5/580: unlink d8/d9/c2c 0
2026-03-10T06:22:28.971 INFO:tasks.workunit.client.1.vm06.stdout:6/810: symlink d6/df/l116 0
2026-03-10T06:22:28.976 INFO:tasks.workunit.client.1.vm06.stdout:8/638: dwrite d1/df/d11/f59 [0,4194304] 0
2026-03-10T06:22:28.978 INFO:tasks.workunit.client.1.vm06.stdout:3/768: rename d6/d21/d38 to d6/dc/d13/d35/d101 0
2026-03-10T06:22:28.978 INFO:tasks.workunit.client.1.vm06.stdout:6/811: write d6/df/d70/fd9 [78700,72435] 0
2026-03-10T06:22:28.978 INFO:tasks.workunit.client.1.vm06.stdout:3/769: chown d6/l28 13 1
2026-03-10T06:22:28.986 INFO:tasks.workunit.client.1.vm06.stdout:1/823: dwrite d9/d35/d89/f4f [0,4194304] 0
2026-03-10T06:22:28.990 INFO:tasks.workunit.client.1.vm06.stdout:7/793: creat d19/d3b/d41/d42/d62/d80/da1/f107 x:0 0 0
2026-03-10T06:22:28.992 INFO:tasks.workunit.client.1.vm06.stdout:2/650: write da/ff [654495,81617] 0
2026-03-10T06:22:28.992 INFO:tasks.workunit.client.1.vm06.stdout:3/770: rename d6/dc/d13/d35/d101/dd0/feb to d6/f102 0
2026-03-10T06:22:28.993 INFO:tasks.workunit.client.1.vm06.stdout:0/806: dread d0/dd/d14/d18/d85/dcc/d5e/f86 [0,4194304] 0
2026-03-10T06:22:28.999 INFO:tasks.workunit.client.1.vm06.stdout:1/824: fsync d9/d35/faf 0
2026-03-10T06:22:29.000 INFO:tasks.workunit.client.1.vm06.stdout:1/825: chown d9/d62 60941 1
2026-03-10T06:22:29.007 INFO:tasks.workunit.client.1.vm06.stdout:1/826: dwrite d9/d35/faf [0,4194304] 0
2026-03-10T06:22:29.010 INFO:tasks.workunit.client.1.vm06.stdout:5/581: dwrite d8/db/d54/d8a/d39/f51 [0,4194304] 0
2026-03-10T06:22:29.012 INFO:tasks.workunit.client.1.vm06.stdout:1/827: write d9/d62/f99 [1636196,80117] 0
2026-03-10T06:22:29.012 INFO:tasks.workunit.client.1.vm06.stdout:7/794: creat d19/d3b/d5b/f108 x:0 0 0
2026-03-10T06:22:29.012 INFO:tasks.workunit.client.1.vm06.stdout:3/771: creat d6/dc/d13/d35/d101/d88/dae/f103 x:0 0 0
2026-03-10T06:22:29.013 INFO:tasks.workunit.client.1.vm06.stdout:7/795: write d19/f20 [4323498,115797] 0
2026-03-10T06:22:29.013 INFO:tasks.workunit.client.1.vm06.stdout:2/651: fdatasync da/d13/d1c/d1d/d44/d53/d61/f89 0
2026-03-10T06:22:29.016 INFO:tasks.workunit.client.1.vm06.stdout:0/807: creat d0/d3c/dc1/d3d/d50/f108 x:0 0 0
2026-03-10T06:22:29.026 INFO:tasks.workunit.client.1.vm06.stdout:2/652: truncate da/d13/d1a/dc7/d86/fc8 320392 0
2026-03-10T06:22:29.027 INFO:tasks.workunit.client.1.vm06.stdout:3/772: chown d6/dc/d13/d35/d101/dd0/cf1 1791931 1
2026-03-10T06:22:29.027 INFO:tasks.workunit.client.1.vm06.stdout:9/755: dread f20 [0,4194304] 0
2026-03-10T06:22:29.027 INFO:tasks.workunit.client.1.vm06.stdout:4/794: dread dd/f43 [0,4194304] 0
2026-03-10T06:22:29.029 INFO:tasks.workunit.client.1.vm06.stdout:3/773: dread d6/d8/f62 [0,4194304] 0
2026-03-10T06:22:29.032 INFO:tasks.workunit.client.1.vm06.stdout:6/812: link d6/df/d70/cc8 d6/d79/d95/db4/dd4/df5/df3/c117 0
2026-03-10T06:22:29.033 INFO:tasks.workunit.client.1.vm06.stdout:3/774: stat d6/dc/d13/d35/f95 0
2026-03-10T06:22:29.033 INFO:tasks.workunit.client.1.vm06.stdout:6/813: dread - d6/d79/d95/dea/f108 zero size
2026-03-10T06:22:29.033 INFO:tasks.workunit.client.1.vm06.stdout:2/653: dwrite da/d13/d1c/d7d/fc3 [0,4194304] 0
2026-03-10T06:22:29.034 INFO:tasks.workunit.client.1.vm06.stdout:7/796: mknod d19/d3b/d41/d42/d52/c109 0
2026-03-10T06:22:29.042 INFO:tasks.workunit.client.1.vm06.stdout:9/756: rmdir d21/d27/d50/d57/db2/d7f 39
2026-03-10T06:22:29.042 INFO:tasks.workunit.client.1.vm06.stdout:5/582: creat d8/db/fbc x:0 0 0
2026-03-10T06:22:29.042 INFO:tasks.workunit.client.1.vm06.stdout:2/654: creat da/d13/d1a/dc7/dc5/fd0 x:0 0 0
2026-03-10T06:22:29.043 INFO:tasks.workunit.client.1.vm06.stdout:0/808: getdents d0/da3 0
2026-03-10T06:22:29.045 INFO:tasks.workunit.client.1.vm06.stdout:9/757: unlink d21/d27/f65 0
2026-03-10T06:22:29.051 INFO:tasks.workunit.client.1.vm06.stdout:2/655: readlink da/l15 0
2026-03-10T06:22:29.051 INFO:tasks.workunit.client.1.vm06.stdout:6/814: rename d6/d79/c7c to d6/d7/c118 0
2026-03-10T06:22:29.052 INFO:tasks.workunit.client.1.vm06.stdout:4/795: dwrite dd/d33/d47/d97/db6/dbb/fce [0,4194304] 0
2026-03-10T06:22:29.056 INFO:tasks.workunit.client.1.vm06.stdout:5/583: dwrite d8/db/d54/d8a/f31 [0,4194304] 0
2026-03-10T06:22:29.057 INFO:tasks.workunit.client.1.vm06.stdout:4/796: chown dd/d33/d36/f8d 255154991 1
2026-03-10T06:22:29.066 INFO:tasks.workunit.client.1.vm06.stdout:6/815: dwrite d6/dd/d25/d4e/f83 [0,4194304] 0
2026-03-10T06:22:29.070 INFO:tasks.workunit.client.1.vm06.stdout:0/809: rename d0/dd/d1c/c2a to d0/dd/d14/d1d/d5d/c109 0
2026-03-10T06:22:29.070 INFO:tasks.workunit.client.1.vm06.stdout:2/656: truncate da/d13/d1a/d39/d35/f74 1523837 0
2026-03-10T06:22:29.070 INFO:tasks.workunit.client.1.vm06.stdout:4/797: mkdir dd/d33/de9 0
2026-03-10T06:22:29.077 INFO:tasks.workunit.client.1.vm06.stdout:6/816: mkdir d6/dd/dc2/d10d/dd0/dc5/d119 0
2026-03-10T06:22:29.087 INFO:tasks.workunit.client.1.vm06.stdout:6/817: mkdir d6/d79/d95/db4/dd4/df5/d11a 0
2026-03-10T06:22:29.087 INFO:tasks.workunit.client.1.vm06.stdout:5/584: creat d8/db/d54/d8a/fbd x:0 0 0
2026-03-10T06:22:29.093 INFO:tasks.workunit.client.1.vm06.stdout:3/775: truncate d6/dc/d13/d35/f4e 1023142 0
2026-03-10T06:22:29.093 INFO:tasks.workunit.client.1.vm06.stdout:9/758: dread d21/f33 [0,4194304] 0
2026-03-10T06:22:29.093 INFO:tasks.workunit.client.1.vm06.stdout:5/585: write d8/db/d54/d55/d80/fbb [170837,75196] 0
2026-03-10T06:22:29.094 INFO:tasks.workunit.client.1.vm06.stdout:7/797: write d19/db0/ddd/ff5 [2130940,77315] 0
2026-03-10T06:22:29.096 INFO:tasks.workunit.client.1.vm06.stdout:8/639: dwrite d1/df/d20/f51 [0,4194304] 0
2026-03-10T06:22:29.097 INFO:tasks.workunit.client.1.vm06.stdout:7/798: write d19/d3b/d41/f65 [591922,62205] 0
2026-03-10T06:22:29.097 INFO:tasks.workunit.client.1.vm06.stdout:2/657: rmdir da/d13/d1a/d39/d35/da1 0
2026-03-10T06:22:29.097 INFO:tasks.workunit.client.1.vm06.stdout:6/818: readlink d6/d7/d37/l7b 0
2026-03-10T06:22:29.098 INFO:tasks.workunit.client.1.vm06.stdout:9/759: readlink d21/d27/d3a/l85 0
2026-03-10T06:22:29.102 INFO:tasks.workunit.client.1.vm06.stdout:1/828: dwrite d9/d35/d46/d38/d63/d83/d93/f9c [0,4194304] 0
2026-03-10T06:22:29.109 INFO:tasks.workunit.client.1.vm06.stdout:5/586: fsync d8/db/d54/d8a/d74/f71 0
2026-03-10T06:22:29.109 INFO:tasks.workunit.client.1.vm06.stdout:0/810: dwrite d0/dd/f4c [0,4194304] 0
2026-03-10T06:22:29.110 INFO:tasks.workunit.client.1.vm06.stdout:7/799: chown d19/d3b/d41/d42/d52/d83/d9d/lea 2 1
2026-03-10T06:22:29.113 INFO:tasks.workunit.client.1.vm06.stdout:4/798: dwrite dd/d33/da6/fde [0,4194304] 0
2026-03-10T06:22:29.114 INFO:tasks.workunit.client.1.vm06.stdout:4/799: chown dd/d18/d75 14952 1
2026-03-10T06:22:29.117 INFO:tasks.workunit.client.1.vm06.stdout:6/819: mknod d6/dd/c11b 0
2026-03-10T06:22:29.117 INFO:tasks.workunit.client.1.vm06.stdout:1/829: mknod d9/d35/d46/d38/d63/d83/dc5/cee 0
2026-03-10T06:22:29.118 INFO:tasks.workunit.client.1.vm06.stdout:2/658: dread - da/d13/f52 zero size
2026-03-10T06:22:29.119 INFO:tasks.workunit.client.1.vm06.stdout:5/587: truncate d8/db/d54/d67/d46/fa4 2886553 0
2026-03-10T06:22:29.122 INFO:tasks.workunit.client.1.vm06.stdout:7/800: creat d19/d3b/d41/d42/d52/d83/d9d/da8/f10a x:0 0 0
2026-03-10T06:22:29.126 INFO:tasks.workunit.client.1.vm06.stdout:0/811: creat d0/dd/d14/d1d/d5d/dca/dd8/f10a x:0 0 0
2026-03-10T06:22:29.127 INFO:tasks.workunit.client.1.vm06.stdout:8/640: creat d1/df/d20/d21/d5e/fd3 x:0 0 0
2026-03-10T06:22:29.129 INFO:tasks.workunit.client.1.vm06.stdout:1/830: creat d9/d35/d46/fef x:0 0 0
2026-03-10T06:22:29.133 INFO:tasks.workunit.client.1.vm06.stdout:2/659: creat da/d13/d1a/dc7/daf/d56/db9/fd1 x:0 0 0
2026-03-10T06:22:29.133 INFO:tasks.workunit.client.1.vm06.stdout:1/831: dread - d9/d35/d46/fef zero size
2026-03-10T06:22:29.133 INFO:tasks.workunit.client.1.vm06.stdout:2/660: stat da/d13/d1a/f3a 0
2026-03-10T06:22:29.133 INFO:tasks.workunit.client.1.vm06.stdout:2/661: chown da/d13/f1f 12 1
2026-03-10T06:22:29.137 INFO:tasks.workunit.client.1.vm06.stdout:0/812: read d0/dd/d14/d18/d85/dcc/d88/d35/f51 [1655658,5711] 0
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:0/813: dread d0/dd/f4c [0,4194304] 0
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:4/800: getdents dd/d41/da9 0
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:2/662: chown da/c3d 1524 1
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:2/663: chown da/d13/d1c/d1d/d44/d53/d61 3474857 1
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:0/814: write d0/dd/d1c/da2/fb9 [916716,65537] 0
2026-03-10T06:22:29.145 INFO:tasks.workunit.client.1.vm06.stdout:6/820: link d6/dd/dc2/d10d/dd0/dc5/fe1 d6/df/f11c 0
2026-03-10T06:22:29.146 INFO:tasks.workunit.client.1.vm06.stdout:4/801: write dd/d24/d5d/fd1 [796030,82339] 0
2026-03-10T06:22:29.147 INFO:tasks.workunit.client.1.vm06.stdout:2/664: symlink da/d13/d1a/dc7/daf/d56/db9/d9b/ld2 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:1/832: rename d9/d35/d89/c66 to d9/d35/d89/cf0 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:4/802: write dd/fc2 [691955,28566] 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:0/815: dread - d0/dd/d14/d18/d85/dcc/dab/ff9 zero size
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:4/803: readlink dd/d41/ldc 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:1/833: stat d9/dd3/cdf 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:6/821: symlink d6/dd/dc2/d10d/dd0/l11d 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:0/816: chown d0/dd/d14/d6b/la6 5 1
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:0/817: write d0/dd/d14/d18/d85/dcc/fb6 [183340,31769] 0
2026-03-10T06:22:29.154 INFO:tasks.workunit.client.1.vm06.stdout:4/804: readlink dd/d24/d2d/d2f/d39/d71/dc3/dd0/le4 0
2026-03-10T06:22:29.157 INFO:tasks.workunit.client.1.vm06.stdout:6/822: rmdir d6/dd/dc7 39
2026-03-10T06:22:29.158 INFO:tasks.workunit.client.1.vm06.stdout:6/823: chown d6/dd/d25/d33/d5a/dd8/c10b 21849806 1
2026-03-10T06:22:29.166 INFO:tasks.workunit.client.1.vm06.stdout:0/818: dread d0/dd/d14/d18/f2c [0,4194304] 0
2026-03-10T06:22:29.170 INFO:tasks.workunit.client.1.vm06.stdout:0/819: symlink d0/dd/d1c/da2/dc6/l10b 0
2026-03-10T06:22:29.176 INFO:tasks.workunit.client.1.vm06.stdout:0/820: symlink d0/dd/d14/d18/d85/dcc/db7/l10c 0
2026-03-10T06:22:29.181 INFO:tasks.workunit.client.1.vm06.stdout:0/821: creat d0/dd/d14/d18/d85/dcc/d99/f10d x:0 0 0
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:0/822: chown d0/dd/fa4 3 1
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:0/823: chown d0/f61 354 1
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:3/776: truncate d6/dc/d13/d9d/f57 3787203 0
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:9/760: dwrite d21/da2/de6/fc1 [0,4194304] 0
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:9/761: chown d21/d32/d4d/f9d 0 1
2026-03-10T06:22:29.189 INFO:tasks.workunit.client.1.vm06.stdout:9/762: truncate d21/da2/da7/fe2 910520 0
2026-03-10T06:22:29.190 INFO:tasks.workunit.client.1.vm06.stdout:3/777: unlink d6/d8/d7f/fd4 0
2026-03-10T06:22:29.199 INFO:tasks.workunit.client.1.vm06.stdout:2/665: write da/d13/d1c/d1d/f2a [5111831,13657] 0
2026-03-10T06:22:29.203 INFO:tasks.workunit.client.1.vm06.stdout:3/778: creat d6/dc/d13/d35/d101/d88/f104 x:0 0 0
2026-03-10T06:22:29.203 INFO:tasks.workunit.client.1.vm06.stdout:7/801: dwrite d19/d3b/d41/d42/f91 [0,4194304] 0
2026-03-10T06:22:29.204 INFO:tasks.workunit.client.1.vm06.stdout:1/834: write d9/f1a [375295,62241] 0
2026-03-10T06:22:29.204 INFO:tasks.workunit.client.1.vm06.stdout:1/835: fsync d9/d35/d46/d38/d63/f80 0
2026-03-10T06:22:29.204 INFO:tasks.workunit.client.1.vm06.stdout:1/836: chown d9/dd3/dbf/lc9 5 1
2026-03-10T06:22:29.212 INFO:tasks.workunit.client.1.vm06.stdout:7/802: dwrite d19/d3b/d41/f65 [0,4194304] 0
2026-03-10T06:22:29.215 INFO:tasks.workunit.client.1.vm06.stdout:2/666: mknod da/d13/d5e/cd3 0
2026-03-10T06:22:29.220 INFO:tasks.workunit.client.1.vm06.stdout:7/803: rmdir d19/d3b/d41/da9 39
2026-03-10T06:22:29.223 INFO:tasks.workunit.client.1.vm06.stdout:5/588: dread d8/db/d54/d67/d46/fa4 [0,4194304] 0
2026-03-10T06:22:29.223 INFO:tasks.workunit.client.1.vm06.stdout:7/804: write f15 [6299507,56224] 0
2026-03-10T06:22:29.225 INFO:tasks.workunit.client.1.vm06.stdout:6/824: dwrite d6/dd/d25/d2c/f85 [0,4194304] 0
2026-03-10T06:22:29.225 INFO:tasks.workunit.client.1.vm06.stdout:6/825: dread - d6/dd/d25/d33/d5a/f109 zero size
2026-03-10T06:22:29.231 INFO:tasks.workunit.client.1.vm06.stdout:9/763: write d21/f2a [1622261,95135] 0
2026-03-10T06:22:29.231 INFO:tasks.workunit.client.1.vm06.stdout:2/667: mkdir da/d13/d1a/dc7/daf/d56/dd4 0
2026-03-10T06:22:29.231 INFO:tasks.workunit.client.1.vm06.stdout:3/779: fsync d6/d21/f7b 0
2026-03-10T06:22:29.237 INFO:tasks.workunit.client.1.vm06.stdout:1/837: sync
2026-03-10T06:22:29.247 INFO:tasks.workunit.client.1.vm06.stdout:5/589: dwrite d8/db/d54/d8a/d74/f78 [0,4194304] 0
2026-03-10T06:22:29.247 INFO:tasks.workunit.client.1.vm06.stdout:5/590: read d8/db/d54/d8a/d74/f78 [1735023,62597] 0
2026-03-10T06:22:29.247 INFO:tasks.workunit.client.1.vm06.stdout:5/591: write d8/db/fbc [500824,112625] 0
2026-03-10T06:22:29.247 INFO:tasks.workunit.client.1.vm06.stdout:5/592: stat d8/db/d54/d8a/d74/d90/lac 0
2026-03-10T06:22:29.247 INFO:tasks.workunit.client.1.vm06.stdout:4/805: dwrite dd/fa7 [0,4194304] 0
2026-03-10T06:22:29.249 INFO:tasks.workunit.client.1.vm06.stdout:9/764: dread d21/d32/d4d/fbd [0,4194304] 0
2026-03-10T06:22:29.252 INFO:tasks.workunit.client.1.vm06.stdout:1/838: creat d9/d1b/d20/db3/ff1 x:0 0 0
2026-03-10T06:22:29.254 INFO:tasks.workunit.client.1.vm06.stdout:2/668: write da/f28 [2576749,113324] 0
2026-03-10T06:22:29.257 INFO:tasks.workunit.client.1.vm06.stdout:9/765: truncate d21/d27/f4b 1823557 0
2026-03-10T06:22:29.258 INFO:tasks.workunit.client.1.vm06.stdout:6/826: getdents d6/dd/d25/d33 0
2026-03-10T06:22:29.259 INFO:tasks.workunit.client.1.vm06.stdout:9/766: chown d21/f33 160206 1
2026-03-10T06:22:29.264 INFO:tasks.workunit.client.1.vm06.stdout:6/827: fdatasync d6/d79/fe2 0
2026-03-10T06:22:29.264 INFO:tasks.workunit.client.1.vm06.stdout:0/824: dread d0/dd/f10 [0,4194304] 0
2026-03-10T06:22:29.264 INFO:tasks.workunit.client.1.vm06.stdout:9/767: mknod d21/d27/d56/cfa 0
2026-03-10T06:22:29.264 INFO:tasks.workunit.client.1.vm06.stdout:2/669: mknod da/d13/d1c/d1d/cd5 0
2026-03-10T06:22:29.266 INFO:tasks.workunit.client.1.vm06.stdout:2/670: readlink da/d13/d1a/dc7/l82 0
2026-03-10T06:22:29.267 INFO:tasks.workunit.client.1.vm06.stdout:0/825: dwrite d0/d3c/dc1/d3d/d50/f108 [0,4194304] 0
2026-03-10T06:22:29.273 INFO:tasks.workunit.client.1.vm06.stdout:4/806: rename dd/d24/fd6 to dd/d33/d47/d97/db6/fea 0
2026-03-10T06:22:29.276 INFO:tasks.workunit.client.1.vm06.stdout:9/768: rmdir d21/d27/d50/d57/dcd 39
2026-03-10T06:22:29.278 INFO:tasks.workunit.client.1.vm06.stdout:1/839: sync
2026-03-10T06:22:29.279 INFO:tasks.workunit.client.1.vm06.stdout:6/828: dwrite d6/d79/d95/db4/fbd [0,4194304] 0
2026-03-10T06:22:29.279 INFO:tasks.workunit.client.1.vm06.stdout:2/671: symlink da/da8/ld6 0
2026-03-10T06:22:29.284 INFO:tasks.workunit.client.1.vm06.stdout:0/826: mkdir d0/d3c/dc1/d7d/d10e 0
2026-03-10T06:22:29.289 INFO:tasks.workunit.client.1.vm06.stdout:2/672: creat da/d13/d1c/d1d/d44/d46/fd7 x:0 0 0
2026-03-10T06:22:29.289 INFO:tasks.workunit.client.1.vm06.stdout:8/641: read d1/f1b [698878,118060] 0
2026-03-10T06:22:29.289 INFO:tasks.workunit.client.1.vm06.stdout:4/807: mkdir dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb 0
2026-03-10T06:22:29.294 INFO:tasks.workunit.client.1.vm06.stdout:9/769: fsync d21/f3e 0
2026-03-10T06:22:29.307 INFO:tasks.workunit.client.1.vm06.stdout:4/808: write dd/d24/d5d/f9d [60800,102067] 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/840: symlink d9/d35/d89/lf2 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/841: chown d9/d62/dc7 4307879 1
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:0/827: symlink d0/dd/d14/d18/d85/dcc/d88/l10f 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:2/673: stat da/d13/d1c/d1d/d44/d53/l96 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/842: chown d9/dd3 200 1
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:8/642: link d1/df/d11/da1/fb6 d1/d3b/da9/dab/fd4 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/843: chown d9/d35/d46/d38/d63/d83/fa1 529239 1
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:8/643: chown d1/df/d20/d21/l5f 2 1
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:2/674: mknod da/d13/d1a/dc7/daf/d56/cd8 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:9/770: mkdir d21/d27/d50/d57/db2/d80/d95/d9b/dd0/dfb 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:0/828: mknod d0/dd/d14/d18/d85/d107/c110 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:2/675: dread - da/d13/d1c/d1d/fbd zero size
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/844: mkdir d9/d35/d46/d38/df3 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:0/829: creat d0/dd/d14/d18/d85/dcc/d88/d47/f111 x:0 0 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/845: chown d9/dd3/f85 57823 1
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:9/771: truncate d21/d27/d50/d57/db2/d80/f86 1331303 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:9/772: read d21/da2/da7/fe2 [260111,23828] 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:2/676: creat da/d13/d1a/dc7/daf/fd9 x:0 0 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:8/644: getdents d1/d3b 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:1/846: creat d9/d35/d46/d38/d8c/ff4 x:0 0 0
2026-03-10T06:22:29.308 INFO:tasks.workunit.client.1.vm06.stdout:9/773: dwrite d21/da2/da7/fca [4194304,4194304] 0
2026-03-10T06:22:29.311 INFO:tasks.workunit.client.1.vm06.stdout:9/774: unlink d21/d46/l72 0
2026-03-10T06:22:29.314 INFO:tasks.workunit.client.1.vm06.stdout:2/677: rename da/d13/d1c/d7d/cb5 to da/d13/d1c/d43/cda 0
2026-03-10T06:22:29.324 INFO:tasks.workunit.client.1.vm06.stdout:3/780: truncate d6/dc/d41/d6d/fce 2800842 0
2026-03-10T06:22:29.325 INFO:tasks.workunit.client.1.vm06.stdout:9/775: creat d21/d32/d4d/d51/ffc x:0 0 0
2026-03-10T06:22:29.325 INFO:tasks.workunit.client.1.vm06.stdout:9/776: dread - d21/d32/d4d/d51/db0/fe3 zero size
2026-03-10T06:22:29.327 INFO:tasks.workunit.client.1.vm06.stdout:0/830: rename d0/d3c/dc1/d3d/d50/c75 to d0/dd/d14/d18/d85/dcc/d88/d9e/c112 0
2026-03-10T06:22:29.327 INFO:tasks.workunit.client.1.vm06.stdout:4/809: dread dd/d24/d2d/d2f/d34/d40/f99 [0,4194304] 0
2026-03-10T06:22:29.328 INFO:tasks.workunit.client.1.vm06.stdout:9/777: symlink d21/d27/d50/d57/db2/d80/lfd 0
2026-03-10T06:22:29.328 INFO:tasks.workunit.client.1.vm06.stdout:3/781: write d6/d21/dbc/fd9 [5175297,45065] 0
2026-03-10T06:22:29.332 INFO:tasks.workunit.client.1.vm06.stdout:1/847: link d9/dd3/c64 d9/d1b/d20/cf5 0
2026-03-10T06:22:29.337 INFO:tasks.workunit.client.1.vm06.stdout:1/848: dread d9/d35/f7e [0,4194304] 0
2026-03-10T06:22:29.339 INFO:tasks.workunit.client.1.vm06.stdout:7/805: dread d19/f35 [4194304,4194304] 0
2026-03-10T06:22:29.342 INFO:tasks.workunit.client.1.vm06.stdout:9/778: creat d21/d27/d50/d57/db2/d80/d95/d9b/dd0/ffe x:0 0 0
2026-03-10T06:22:29.347 INFO:tasks.workunit.client.1.vm06.stdout:8/645: rename d1/df/d20/d21/l2d to d1/d2c/d5b/ld5 0
2026-03-10T06:22:29.347 INFO:tasks.workunit.client.1.vm06.stdout:4/810: chown dd/d24/d2d/d2f/d39/d71/la1 12 1
2026-03-10T06:22:29.347 INFO:tasks.workunit.client.1.vm06.stdout:7/806: mkdir d19/d3b/d41/d42/d10b 0
2026-03-10T06:22:29.348 INFO:tasks.workunit.client.1.vm06.stdout:2/678: link da/d13/d1c/d1d/d44/d46/lab da/d13/d1c/d1d/d44/d53/d61/ldb 0
2026-03-10T06:22:29.348 INFO:tasks.workunit.client.1.vm06.stdout:8/646: write d1/d2c/d99/dc0/fc3 [590399,71834] 0
2026-03-10T06:22:29.348 INFO:tasks.workunit.client.1.vm06.stdout:4/811: write dd/d33/d47/fc7 [889181,10855] 0
2026-03-10T06:22:29.349 INFO:tasks.workunit.client.1.vm06.stdout:9/779: unlink d21/d32/d4d/d51/dcb/fde 0
2026-03-10T06:22:29.352 INFO:tasks.workunit.client.1.vm06.stdout:1/849: dwrite d9/f1f [4194304,4194304] 0
2026-03-10T06:22:29.355 INFO:tasks.workunit.client.1.vm06.stdout:9/780: sync
2026-03-10T06:22:29.356 INFO:tasks.workunit.client.1.vm06.stdout:9/781: readlink d21/d27/d50/d57/db2/d80/d95/d9b/dd0/le8 0
2026-03-10T06:22:29.359 INFO:tasks.workunit.client.1.vm06.stdout:3/782: creat d6/dc/d13/d35/d101/f105 x:0 0 0
2026-03-10T06:22:29.362 INFO:tasks.workunit.client.1.vm06.stdout:4/812: write dd/d24/d2d/d2f/d34/d83/f87 [1354438,2239] 0
2026-03-10T06:22:29.365 INFO:tasks.workunit.client.1.vm06.stdout:7/807: chown d19/d3b/d41/d42/d62/dc4/df8/cff 149133251 1
2026-03-10T06:22:29.369 INFO:tasks.workunit.client.1.vm06.stdout:1/850: write d9/d1b/f1d [2903912,53681] 0
2026-03-10T06:22:29.370 INFO:tasks.workunit.client.1.vm06.stdout:3/783: symlink d6/d21/dbc/l106 0
2026-03-10T06:22:29.371 INFO:tasks.workunit.client.1.vm06.stdout:0/831: rename d0/dd/d14/d1d/d5d/c109 to d0/da3/dd5/c113 0
2026-03-10T06:22:29.371 INFO:tasks.workunit.client.1.vm06.stdout:7/808: chown d19/d3b/d41/d42/d52/d83/d9d/cca 1 1
2026-03-10T06:22:29.373 INFO:tasks.workunit.client.1.vm06.stdout:6/829: dwrite d6/df/d70/f90 [0,4194304] 0
2026-03-10T06:22:29.375 INFO:tasks.workunit.client.1.vm06.stdout:8/647: fsync d1/f4 0
2026-03-10T06:22:29.381 INFO:tasks.workunit.client.1.vm06.stdout:8/648: chown d1/d7/lc 5 1
2026-03-10T06:22:29.386 INFO:tasks.workunit.client.1.vm06.stdout:3/784: unlink d6/d8/cf2 0
2026-03-10T06:22:29.388 INFO:tasks.workunit.client.1.vm06.stdout:1/851: fsync d9/d35/d46/d38/dc6/dd4/fe4 0
2026-03-10T06:22:29.389 INFO:tasks.workunit.client.1.vm06.stdout:6/830: mkdir d6/d79/d95/d11e 0
2026-03-10T06:22:29.393 INFO:tasks.workunit.client.1.vm06.stdout:2/679: getdents da/d13/d1c/d7d 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:3/785: mkdir d6/dc/d13/d35/d101/dd0/d107 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:6/831: symlink d6/dd/dc2/d10d/dd0/dec/l11f 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:2/680: stat da/d13/d1c/d1d/d44/d53/c93 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:4/813: write dd/d72/f78 [82964,127437] 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:7/809: mkdir d19/d3b/d41/da9/da5/d10c 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:2/681: read da/d13/d1c/d1d/f2a [5046719,113217] 0
2026-03-10T06:22:29.406 INFO:tasks.workunit.client.1.vm06.stdout:1/852: creat d9/d35/d46/d38/dc6/dd4/ff6 x:0 0 0
2026-03-10T06:22:29.407 INFO:tasks.workunit.client.1.vm06.stdout:6/832: rename d6/dd/d25/d4e/ld7 to d6/dd/d35/dff/l120 0
2026-03-10T06:22:29.411 INFO:tasks.workunit.client.1.vm06.stdout:2/682: write da/d13/d1a/dc7/daf/d56/db9/fc2 [1051384,112295] 0
2026-03-10T06:22:29.412 INFO:tasks.workunit.client.1.vm06.stdout:7/810: unlink d19/db0/ddd/ff5 0
2026-03-10T06:22:29.412 INFO:tasks.workunit.client.1.vm06.stdout:7/811: chown d19/d3b/d41/d42/d52/d9f/cf1 39735303 1
2026-03-10T06:22:29.413 INFO:tasks.workunit.client.1.vm06.stdout:7/812: readlink d19/d3b/d41/d42/l46 0
2026-03-10T06:22:29.413 INFO:tasks.workunit.client.1.vm06.stdout:7/813: write d19/d3b/d41/d42/d52/d83/fd8 [1182945,46431] 0
2026-03-10T06:22:29.414 INFO:tasks.workunit.client.1.vm06.stdout:7/814: stat d19/d3b/d41/d4c/f85 0
2026-03-10T06:22:29.414 INFO:tasks.workunit.client.1.vm06.stdout:6/833: chown d6/dd/d25/d2c/f32 338 1
2026-03-10T06:22:29.415 INFO:tasks.workunit.client.1.vm06.stdout:6/834: chown d6/df/fe9 6 1
2026-03-10T06:22:29.419 INFO:tasks.workunit.client.1.vm06.stdout:5/593: dread d8/d9/f14 [0,4194304] 0
2026-03-10T06:22:29.423 INFO:tasks.workunit.client.1.vm06.stdout:6/835: sync
2026-03-10T06:22:29.424 INFO:tasks.workunit.client.1.vm06.stdout:9/782: truncate d21/d32/d4d/fb4 1830708 0
2026-03-10T06:22:29.425 INFO:tasks.workunit.client.1.vm06.stdout:1/853: creat d9/dd3/dbf/ff7 x:0 0 0
2026-03-10T06:22:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:29 vm04.local ceph-mon[51058]: pgmap v13: 65 pgs: 65 active+clean; 1.4 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 42 MiB/s rd, 125 MiB/s wr, 260 op/s
2026-03-10T06:22:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:29 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:29 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:29.430 INFO:tasks.workunit.client.1.vm06.stdout:8/649: write d1/d2c/d5b/f7c [1256125,59012] 0
2026-03-10T06:22:29.431 INFO:tasks.workunit.client.1.vm06.stdout:8/650: fdatasync d1/d3b/db3/fcc 0
2026-03-10T06:22:29.431 INFO:tasks.workunit.client.1.vm06.stdout:8/651: readlink d1/d2c/l96 0
2026-03-10T06:22:29.432 INFO:tasks.workunit.client.1.vm06.stdout:8/652: chown d1/df/d20/d21/c28 581241777 1
2026-03-10T06:22:29.434 INFO:tasks.workunit.client.1.vm06.stdout:3/786: getdents d6/dc/de5 0
2026-03-10T06:22:29.434 INFO:tasks.workunit.client.1.vm06.stdout:0/832: dwrite d0/dd/d14/d1d/d5d/f5f [0,4194304] 0
2026-03-10T06:22:29.440 INFO:tasks.workunit.client.1.vm06.stdout:9/783: dwrite f9 [4194304,4194304] 0
2026-03-10T06:22:29.447 INFO:tasks.workunit.client.1.vm06.stdout:1/854: dwrite d9/f1f [4194304,4194304] 0
2026-03-10T06:22:29.448 INFO:tasks.workunit.client.1.vm06.stdout:1/855: truncate d9/d35/d46/fef 97416 0
2026-03-10T06:22:29.449 INFO:tasks.workunit.client.1.vm06.stdout:1/856: chown d9/d35/d46/d38/d63/ca5 43 1
2026-03-10T06:22:29.454 INFO:tasks.workunit.client.1.vm06.stdout:7/815: mkdir d19/d3b/d41/d42/d10b/d10d 0
2026-03-10T06:22:29.458 INFO:tasks.workunit.client.1.vm06.stdout:8/653: creat d1/df/d20/d21/d7e/fd6 x:0 0 0
2026-03-10T06:22:29.460 INFO:tasks.workunit.client.1.vm06.stdout:0/833: symlink d0/d3c/dc1/d3d/l114 0
2026-03-10T06:22:29.470 INFO:tasks.workunit.client.1.vm06.stdout:7/816: symlink d19/d3b/d41/d42/d62/dc4/l10e 0
2026-03-10T06:22:29.470 INFO:tasks.workunit.client.1.vm06.stdout:8/654: creat d1/df/d20/d21/d7e/fd7 x:0 0 0
2026-03-10T06:22:29.471 INFO:tasks.workunit.client.1.vm06.stdout:7/817: readlink d19/d3b/l50 0
2026-03-10T06:22:29.477 INFO:tasks.workunit.client.1.vm06.stdout:0/834: creat d0/d3c/dc1/d3d/d50/d91/f115 x:0 0 0
2026-03-10T06:22:29.478 INFO:tasks.workunit.client.1.vm06.stdout:8/655: creat d1/d3b/da9/fd8 x:0 0 0
2026-03-10T06:22:29.480 INFO:tasks.workunit.client.1.vm06.stdout:7/818: mknod d19/d3b/d41/da9/dbd/dd2/c10f 0
2026-03-10T06:22:29.480 INFO:tasks.workunit.client.1.vm06.stdout:9/784: link d21/d27/d50/d57/f97 d21/d27/d50/d57/db2/def/fff 0
2026-03-10T06:22:29.480 INFO:tasks.workunit.client.1.vm06.stdout:1/857: link d9/lcf d9/d35/d46/db0/lf8 0
2026-03-10T06:22:29.482 INFO:tasks.workunit.client.1.vm06.stdout:0/835: creat d0/dd/d14/d18/d85/dcc/db7/f116 x:0 0 0
2026-03-10T06:22:29.484 INFO:tasks.workunit.client.1.vm06.stdout:1/858: creat d9/d35/d46/db0/ff9 x:0 0 0
2026-03-10T06:22:29.487 INFO:tasks.workunit.client.1.vm06.stdout:7/819: symlink d19/d3b/d41/da9/dbd/l110 0
2026-03-10T06:22:29.488 INFO:tasks.workunit.client.1.vm06.stdout:6/836: truncate d6/dd/d25/d2c/f32 4854972 0
2026-03-10T06:22:29.489 INFO:tasks.workunit.client.1.vm06.stdout:7/820: fdatasync d19/d3b/d41/f49 0
2026-03-10T06:22:29.490 INFO:tasks.workunit.client.1.vm06.stdout:5/594: write d8/db/d54/d8a/f53 [2273183,67651] 0
2026-03-10T06:22:29.494 INFO:tasks.workunit.client.1.vm06.stdout:0/836: dwrite d0/da3/dd5/f101 [4194304,4194304] 0
2026-03-10T06:22:29.494 INFO:tasks.workunit.client.1.vm06.stdout:2/683: dread da/d13/d1c/d1d/d44/d53/d61/d68/f6b [0,4194304] 0
2026-03-10T06:22:29.495 INFO:tasks.workunit.client.1.vm06.stdout:2/684: readlink da/d13/d5e/l6c 0
2026-03-10T06:22:29.496 INFO:tasks.workunit.client.1.vm06.stdout:2/685: stat da/d13/f1f 0
2026-03-10T06:22:29.496 INFO:tasks.workunit.client.1.vm06.stdout:0/837: chown d0/dd/d14/d18/d85/dcc/d99/fe4 173 1
2026-03-10T06:22:29.497 INFO:tasks.workunit.client.1.vm06.stdout:2/686: chown da/d13/d1a/dc7/d86/fc8 6661100 1
2026-03-10T06:22:29.507 INFO:tasks.workunit.client.1.vm06.stdout:7/821: fsync d19/f99 0
2026-03-10T06:22:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/859: getdents d9/d35/d46/db0 0
2026-03-10T06:22:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/860: chown d9/d35/d46/f88 34 1
2026-03-10T06:22:29.521 INFO:tasks.workunit.client.1.vm06.stdout:1/861: dwrite d9/d35/d46/d38/dc6/dd4/ff6 [0,4194304] 0
2026-03-10T06:22:29.524 INFO:tasks.workunit.client.1.vm06.stdout:0/838: dwrite d0/f5 [0,4194304] 0
2026-03-10T06:22:29.529 INFO:tasks.workunit.client.1.vm06.stdout:5/595: write d8/d9/f14 [3447409,35718] 0
2026-03-10T06:22:29.532 INFO:tasks.workunit.client.1.vm06.stdout:6/837: dread d6/d79/fe2 [0,4194304] 0
2026-03-10T06:22:29.541 INFO:tasks.workunit.client.1.vm06.stdout:0/839: mknod d0/dd/d14/d18/d85/d107/c117 0
2026-03-10T06:22:29.544 INFO:tasks.workunit.client.1.vm06.stdout:7/822: dwrite d19/d3b/d41/f66 [0,4194304] 0
2026-03-10T06:22:29.554 INFO:tasks.workunit.client.1.vm06.stdout:6/838: link d6/dd/dc2/d10d/dd0/dc5/leb d6/dd/d25/d33/d5a/dd8/l121 0
2026-03-10T06:22:29.556 INFO:tasks.workunit.client.1.vm06.stdout:5/596: creat d8/db/d54/d8a/d74/fbe x:0 0 0
2026-03-10T06:22:29.556 INFO:tasks.workunit.client.1.vm06.stdout:0/840: creat d0/dd/d14/d18/d7e/dd0/f118 x:0 0 0
2026-03-10T06:22:29.564 INFO:tasks.workunit.client.1.vm06.stdout:1/862: truncate d9/d35/d89/f4f 1400128 0
2026-03-10T06:22:29.564 INFO:tasks.workunit.client.1.vm06.stdout:6/839: chown d6/c17 394865097 1
2026-03-10T06:22:29.570 INFO:tasks.workunit.client.1.vm06.stdout:0/841: link d0/dd/d14/d18/d85/dcc/d88/d47/f111 d0/dd/d14/d18/d85/dcc/d99/f119 0
2026-03-10T06:22:29.574 INFO:tasks.workunit.client.1.vm06.stdout:1/863: write d9/d62/f76 [4269084,44375] 0
2026-03-10T06:22:29.574 INFO:tasks.workunit.client.1.vm06.stdout:7/823: truncate d19/d3b/d41/da9/da5/fa6 428884 0
2026-03-10T06:22:29.577 INFO:tasks.workunit.client.1.vm06.stdout:1/864: mknod d9/d35/d46/d38/d63/cfa 0
2026-03-10T06:22:29.579 INFO:tasks.workunit.client.1.vm06.stdout:1/865: write d9/d35/d89/f14 [1419647,55561] 0
2026-03-10T06:22:29.579 INFO:tasks.workunit.client.1.vm06.stdout:7/824: symlink d19/d3b/d41/da9/dbd/dd2/l111 0
2026-03-10T06:22:29.582 INFO:tasks.workunit.client.1.vm06.stdout:7/825: dwrite d19/d3b/d41/d42/f91 [0,4194304] 0
2026-03-10T06:22:29.589 INFO:tasks.workunit.client.1.vm06.stdout:9/785: dread d21/da2/da7/d93/f94 [0,4194304] 0
2026-03-10T06:22:29.597 INFO:tasks.workunit.client.1.vm06.stdout:3/787: dread d6/d21/f31 [4194304,4194304] 0
2026-03-10T06:22:29.598 INFO:tasks.workunit.client.1.vm06.stdout:5/597: truncate d8/db/d54/d8a/d74/f85 3004199 0
2026-03-10T06:22:29.598 INFO:tasks.workunit.client.1.vm06.stdout:4/814: dread dd/f14 [0,4194304] 0
2026-03-10T06:22:29.599 INFO:tasks.workunit.client.1.vm06.stdout:7/826: mkdir d19/d3b/d41/d42/d52/d112 0
2026-03-10T06:22:29.600 INFO:tasks.workunit.client.1.vm06.stdout:6/840: getdents d6/dd/d25/d33/d5a/df1 0
2026-03-10T06:22:29.600 INFO:tasks.workunit.client.1.vm06.stdout:6/841: fsync d6/df/d70/fd9 0
2026-03-10T06:22:29.608 INFO:tasks.workunit.client.1.vm06.stdout:0/842: write d0/dd/d1c/ff7 [100318,102666] 0
2026-03-10T06:22:29.608 INFO:tasks.workunit.client.1.vm06.stdout:3/788: fsync d6/d21/f55 0
2026-03-10T06:22:29.609 INFO:tasks.workunit.client.1.vm06.stdout:4/815: fdatasync dd/d41/f52 0
2026-03-10T06:22:29.609 INFO:tasks.workunit.client.1.vm06.stdout:4/816: readlink dd/l1b 0
2026-03-10T06:22:29.611 INFO:tasks.workunit.client.1.vm06.stdout:6/842: sync
2026-03-10T06:22:29.616 INFO:tasks.workunit.client.1.vm06.stdout:5/598: creat d8/db/d54/d67/d46/d6e/fbf x:0 0 0
2026-03-10T06:22:29.617 INFO:tasks.workunit.client.1.vm06.stdout:5/599: fsync d8/db/d54/d8a/d74/fb3 0
2026-03-10T06:22:29.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:29 vm06.local ceph-mon[58974]: pgmap v13: 65 pgs: 65 active+clean; 1.4 GiB data, 4.8 GiB used, 115 GiB / 120 GiB avail; 42 MiB/s rd, 125 MiB/s wr, 260 op/s
2026-03-10T06:22:29.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:29 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:29.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:29 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:29.622 INFO:tasks.workunit.client.1.vm06.stdout:9/786: creat d21/da2/da7/d93/dda/df4/f100 x:0 0 0
2026-03-10T06:22:29.623 INFO:tasks.workunit.client.1.vm06.stdout:1/866: write d9/d1b/d20/fa7 [2254449,118058] 0
2026-03-10T06:22:29.627 INFO:tasks.workunit.client.1.vm06.stdout:1/867: stat d9/d35/d89/lf2 0
2026-03-10T06:22:29.628 INFO:tasks.workunit.client.1.vm06.stdout:4/817: rename dd/d24/d5d/l92 to dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/lec 0
2026-03-10T06:22:29.631 INFO:tasks.workunit.client.1.vm06.stdout:9/787: stat d21/l43 0
2026-03-10T06:22:29.634 INFO:tasks.workunit.client.1.vm06.stdout:6/843: mkdir d6/dd/d122 0
2026-03-10T06:22:29.635 INFO:tasks.workunit.client.1.vm06.stdout:4/818: rmdir dd/d24/d2d/d2f/d39/d71/dc3 39
2026-03-10T06:22:29.636 INFO:tasks.workunit.client.1.vm06.stdout:3/789: unlink d6/dc/d72/cfd 0
2026-03-10T06:22:29.643 INFO:tasks.workunit.client.1.vm06.stdout:1/868: dwrite d9/d1b/f81 [0,4194304] 0
2026-03-10T06:22:29.646 INFO:tasks.workunit.client.1.vm06.stdout:6/844: truncate d6/df/f1e 2116048 0
2026-03-10T06:22:29.647 INFO:tasks.workunit.client.1.vm06.stdout:9/788: dwrite d21/d27/f9a [0,4194304] 0
2026-03-10T06:22:29.648 INFO:tasks.workunit.client.1.vm06.stdout:6/845: read d6/dd/dc2/d10d/ffa [682990,2097] 0
2026-03-10T06:22:29.652 INFO:tasks.workunit.client.1.vm06.stdout:3/790: write d6/dc/d13/f9f [703848,66709] 0
2026-03-10T06:22:29.661 INFO:tasks.workunit.client.1.vm06.stdout:9/789: truncate d21/f3e 5166828 0
2026-03-10T06:22:29.666 INFO:tasks.workunit.client.1.vm06.stdout:7/827: dwrite d19/d3b/f7b [0,4194304] 0
2026-03-10T06:22:29.669 INFO:tasks.workunit.client.1.vm06.stdout:7/828: chown d19/d3b/d41/d42/d52/d83/d9d/da8/df4 8326 1
2026-03-10T06:22:29.672 INFO:tasks.workunit.client.1.vm06.stdout:1/869: dwrite d9/d35/d46/d38/d8c/fe2 [0,4194304] 0
2026-03-10T06:22:29.674 INFO:tasks.workunit.client.1.vm06.stdout:6/846: dread d6/d79/fc6 [0,4194304] 0
2026-03-10T06:22:29.675 INFO:tasks.workunit.client.1.vm06.stdout:4/819: link dd/d24/d2d/d2f/d34/d40/f8a dd/d33/d47/d97/db6/fed 0
2026-03-10T06:22:29.683 INFO:tasks.workunit.client.1.vm06.stdout:1/870: mknod d9/d35/d46/d38/d63/d83/d93/cfb 0
2026-03-10T06:22:29.686 INFO:tasks.workunit.client.1.vm06.stdout:7/829: creat d19/d3b/d41/d72/d104/f113 x:0 0 0
2026-03-10T06:22:29.686 INFO:tasks.workunit.client.1.vm06.stdout:7/830: dread - d19/d3b/d41/d4c/f85 zero size
2026-03-10T06:22:29.687 INFO:tasks.workunit.client.1.vm06.stdout:1/871: readlink d9/d1b/d20/l7d 0
2026-03-10T06:22:29.690 INFO:tasks.workunit.client.1.vm06.stdout:9/790: rename d21/d27/d3a/f83 to d21/d32/f101 0
2026-03-10T06:22:29.693 INFO:tasks.workunit.client.1.vm06.stdout:4/820: mknod dd/d24/d2d/d7c/cee 0
2026-03-10T06:22:29.693 INFO:tasks.workunit.client.1.vm06.stdout:4/821: write dd/d24/d5d/fd1 [1922120,93817] 0
2026-03-10T06:22:29.694 INFO:tasks.workunit.client.1.vm06.stdout:4/822: dread - dd/d18/d75/f91 zero size
2026-03-10T06:22:29.708 INFO:tasks.workunit.client.1.vm06.stdout:7/831: unlink d19/d3b/d41/d72/ff2 0
2026-03-10T06:22:29.708 INFO:tasks.workunit.client.1.vm06.stdout:1/872: creat d9/d35/d46/d38/dc6/dd4/ffc x:0 0 0
2026-03-10T06:22:29.710 INFO:tasks.workunit.client.1.vm06.stdout:9/791: rmdir d21/d27/d3a 39
2026-03-10T06:22:29.716 INFO:tasks.workunit.client.1.vm06.stdout:3/791: truncate d6/dc/d13/d35/d101/dd0/dd1/d90/fe9 2130692 0
2026-03-10T06:22:29.717 INFO:tasks.workunit.client.1.vm06.stdout:4/823: dwrite dd/d33/d47/fc7 [0,4194304] 0
2026-03-10T06:22:29.732 INFO:tasks.workunit.client.1.vm06.stdout:6/847: link d6/dd/dc2/d10d/dd0/dc5/ld2 d6/dd/dc2/d10d/dd0/dc5/d119/l123 0
2026-03-10T06:22:29.741 INFO:tasks.workunit.client.1.vm06.stdout:1/873: symlink d9/d35/d46/d38/d63/dd6/de8/lfd 0
2026-03-10T06:22:29.741 INFO:tasks.workunit.client.1.vm06.stdout:3/792: creat d6/d1a/d5b/dbd/f108 x:0 0 0
2026-03-10T06:22:29.743 INFO:tasks.workunit.client.1.vm06.stdout:4/824: write dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/dc6/dca/fd5 [999813,41980] 0
2026-03-10T06:22:29.749 INFO:tasks.workunit.client.1.vm06.stdout:1/874: symlink d9/d35/d46/d38/ddd/lfe 0
2026-03-10T06:22:29.751 INFO:tasks.workunit.client.1.vm06.stdout:3/793: rename d6/l9 to d6/dc/d13/d35/d101/d88/dde/l109 0
2026-03-10T06:22:29.751 INFO:tasks.workunit.client.1.vm06.stdout:8/656: dread d1/d7/f4f [0,4194304] 0
2026-03-10T06:22:29.751 INFO:tasks.workunit.client.1.vm06.stdout:9/792: getdents d21/d32/d4d/dd2 0
2026-03-10T06:22:29.751 INFO:tasks.workunit.client.1.vm06.stdout:6/848: dwrite d6/dd/f96 [0,4194304] 0
2026-03-10T06:22:29.752 INFO:tasks.workunit.client.1.vm06.stdout:4/825: creat dd/d33/da6/fef x:0 0 0
2026-03-10T06:22:29.757 INFO:tasks.workunit.client.1.vm06.stdout:3/794: mkdir d6/def/d10a 0
2026-03-10T06:22:29.758 INFO:tasks.workunit.client.1.vm06.stdout:1/875: chown d9/d35/d46/d38/d63/dd6/de8/lfd 1940080520 1
2026-03-10T06:22:29.758 INFO:tasks.workunit.client.1.vm06.stdout:3/795: write d6/d8/f52 [180938,80189] 0
2026-03-10T06:22:29.759 INFO:tasks.workunit.client.1.vm06.stdout:1/876: stat d9/l95 0
2026-03-10T06:22:29.760 INFO:tasks.workunit.client.1.vm06.stdout:3/796: chown d6/dc/d13/fca 3922 1
2026-03-10T06:22:29.760 INFO:tasks.workunit.client.1.vm06.stdout:9/793: truncate d21/d27/d50/d57/db2/d80/d95/d9b/dd0/ffe 954324 0
2026-03-10T06:22:29.767 INFO:tasks.workunit.client.1.vm06.stdout:9/794: write d21/d32/d4d/d51/ffc [892568,73260] 0
2026-03-10T06:22:29.769 INFO:tasks.workunit.client.1.vm06.stdout:8/657: symlink d1/df/d11/da1/dd2/ld9 0
2026-03-10T06:22:29.772 INFO:tasks.workunit.client.1.vm06.stdout:4/826: dread dd/fc2 [0,4194304] 0
2026-03-10T06:22:29.777 INFO:tasks.workunit.client.1.vm06.stdout:7/832: dwrite d19/d3b/d41/f54 [0,4194304] 0
2026-03-10T06:22:29.778 INFO:tasks.workunit.client.1.vm06.stdout:3/797: rename d6/dc/f3f to d6/dc/d13/d35/d101/dd0/dd1/d90/f10b 0
2026-03-10T06:22:29.778 INFO:tasks.workunit.client.1.vm06.stdout:6/849: fsync d6/dd/dc2/d10d/f10e 0
2026-03-10T06:22:29.784 INFO:tasks.workunit.client.1.vm06.stdout:1/877: link d9/lca d9/dd3/lff 0
2026-03-10T06:22:29.788 INFO:tasks.workunit.client.1.vm06.stdout:7/833: truncate d19/d3b/d41/d42/d62/d80/ffa 799165 0
2026-03-10T06:22:29.788 INFO:tasks.workunit.client.1.vm06.stdout:3/798: dread d6/dc/d13/d35/d101/dd0/dd1/f4c [0,4194304] 0
2026-03-10T06:22:29.788 INFO:tasks.workunit.client.1.vm06.stdout:6/850: chown d6/dd/d25/d33/d5a/dd8/c10b 52 1
2026-03-10T06:22:29.793 INFO:tasks.workunit.client.1.vm06.stdout:6/851: stat d6/dd/dc2/d10d/dd0/dc5/d119 0
2026-03-10T06:22:29.794 INFO:tasks.workunit.client.1.vm06.stdout:9/795: rename d21/d27/d50/d57/db2/d80/d95/d9b/fd9 to d21/d27/d50/d57/db2/d80/d95/f102 0
2026-03-10T06:22:29.798 INFO:tasks.workunit.client.1.vm06.stdout:1/878: symlink d9/d62/l100 0
2026-03-10T06:22:29.799 INFO:tasks.workunit.client.1.vm06.stdout:9/796: creat d21/d46/ded/f103 x:0 0 0
2026-03-10T06:22:29.800 INFO:tasks.workunit.client.1.vm06.stdout:3/799: symlink d6/d21/l10c 0
2026-03-10T06:22:29.802 INFO:tasks.workunit.client.1.vm06.stdout:8/658: sync
2026-03-10T06:22:29.802 INFO:tasks.workunit.client.1.vm06.stdout:6/852: creat d6/d79/d95/db4/dd4/df5/df3/f124 x:0 0 0
2026-03-10T06:22:29.804 INFO:tasks.workunit.client.1.vm06.stdout:7/834: dwrite d19/d3b/d41/d72/d97/fbc [0,4194304] 0
2026-03-10T06:22:29.804 INFO:tasks.workunit.client.1.vm06.stdout:9/797: mkdir d21/d27/d50/d57/dcd/de4/d104 0
2026-03-10T06:22:29.809 INFO:tasks.workunit.client.1.vm06.stdout:6/853: write d6/d79/d95/db4/dd4/df5/df3/f124 [502582,41576] 0
2026-03-10T06:22:29.818 INFO:tasks.workunit.client.1.vm06.stdout:9/798: creat d21/da2/de6/f105 x:0 0 0
2026-03-10T06:22:29.818 INFO:tasks.workunit.client.1.vm06.stdout:8/659: creat d1/df/d20/d21/d5e/d79/fda x:0 0 0
2026-03-10T06:22:29.820 INFO:tasks.workunit.client.1.vm06.stdout:7/835: symlink d19/d3b/d41/d72/l114 0
2026-03-10T06:22:29.828 INFO:tasks.workunit.client.1.vm06.stdout:1/879: creat d9/d35/f101 x:0 0 0
2026-03-10T06:22:29.828 INFO:tasks.workunit.client.1.vm06.stdout:4/827: fdatasync dd/d33/d47/fc7 0
2026-03-10T06:22:29.829 INFO:tasks.workunit.client.1.vm06.stdout:4/828: dread - dd/d18/d8e/fa4 zero size
2026-03-10T06:22:29.830 INFO:tasks.workunit.client.1.vm06.stdout:5/600: dread d8/db/d54/d8a/d74/f3b [0,4194304] 0
2026-03-10T06:22:29.838 INFO:tasks.workunit.client.1.vm06.stdout:5/601: write d8/db/f48 [4711380,53285] 0
2026-03-10T06:22:29.842 INFO:tasks.workunit.client.1.vm06.stdout:8/660: rmdir d1/df/d11/da1 39
2026-03-10T06:22:29.842 INFO:tasks.workunit.client.1.vm06.stdout:4/829: creat dd/d18/d8e/ff0 x:0 0 0
2026-03-10T06:22:29.842 INFO:tasks.workunit.client.1.vm06.stdout:7/836: rename d19/d3b/dde/lb1 to d19/d3b/d41/d4c/l115 0
2026-03-10T06:22:29.848 INFO:tasks.workunit.client.1.vm06.stdout:8/661: truncate d1/df/d20/d35/dac/dbf/fc9 561921 0
2026-03-10T06:22:29.848 INFO:tasks.workunit.client.1.vm06.stdout:8/662: chown d1/d3b/f98 47095410 1
2026-03-10T06:22:29.849 INFO:tasks.workunit.client.1.vm06.stdout:5/602: fdatasync d8/db/d54/d8a/d39/d72/f9a 0
2026-03-10T06:22:29.851 INFO:tasks.workunit.client.1.vm06.stdout:4/830: rmdir dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/dc6/dca 39
2026-03-10T06:22:29.862 INFO:tasks.workunit.client.1.vm06.stdout:8/663: dread - d1/df/d20/d21/f82 zero size
2026-03-10T06:22:29.868 INFO:tasks.workunit.client.1.vm06.stdout:6/854: dwrite d6/dd/dc2/d10d/f10e [0,4194304] 0
2026-03-10T06:22:29.868 INFO:tasks.workunit.client.1.vm06.stdout:7/837: dread d19/d3b/d41/d42/f78 [0,4194304] 0
2026-03-10T06:22:29.870 INFO:tasks.workunit.client.1.vm06.stdout:3/800: dwrite d6/dc/fc0 [0,4194304] 0
2026-03-10T06:22:29.870 INFO:tasks.workunit.client.1.vm06.stdout:1/880: dwrite d9/d1b/d20/f8e [4194304,4194304] 0
2026-03-10T06:22:29.879 INFO:tasks.workunit.client.1.vm06.stdout:5/603: symlink d8/db/lc0 0
2026-03-10T06:22:29.880 INFO:tasks.workunit.client.1.vm06.stdout:9/799: dread d21/d27/d50/d57/db2/d80/f86 [0,4194304] 0
2026-03-10T06:22:29.881 INFO:tasks.workunit.client.1.vm06.stdout:7/838: fsync d19/d3b/d41/d4c/f6e 0
2026-03-10T06:22:29.882 INFO:tasks.workunit.client.1.vm06.stdout:3/801: chown d6/dc/f1d 395731 1
2026-03-10T06:22:29.882 INFO:tasks.workunit.client.1.vm06.stdout:3/802: chown d6/d1a 88311930 1
2026-03-10T06:22:29.882 INFO:tasks.workunit.client.1.vm06.stdout:8/664: stat d1/df/d20/d21/d7e/d8d/c94 0
2026-03-10T06:22:29.890 INFO:tasks.workunit.client.1.vm06.stdout:6/855: symlink d6/d79/d95/db4/dd4/df5/l125 0
2026-03-10T06:22:29.896 INFO:tasks.workunit.client.1.vm06.stdout:3/803: dread - d6/dc/d13/d35/d101/dd0/dd1/d90/fa5 zero size
2026-03-10T06:22:29.899 INFO:tasks.workunit.client.1.vm06.stdout:1/881: dread d9/d35/d46/d38/d63/d83/d93/fe5 [0,4194304] 0
2026-03-10T06:22:29.899 INFO:tasks.workunit.client.1.vm06.stdout:4/831: link dd/d33/d36/la2 dd/d18/d8e/daa/lf1 0
2026-03-10T06:22:29.903 INFO:tasks.workunit.client.1.vm06.stdout:1/882: truncate d9/d35/d46/d38/d63/feb 1041651 0
2026-03-10T06:22:29.903 INFO:tasks.workunit.client.1.vm06.stdout:4/832: truncate dd/f12 2298313 0
2026-03-10T06:22:29.911 INFO:tasks.workunit.client.1.vm06.stdout:3/804: getdents d6/dc/d13/d35/d101/dd0/dd1 0
2026-03-10T06:22:29.912 INFO:tasks.workunit.client.1.vm06.stdout:5/604: dread d8/db/d54/d8a/f53 [0,4194304] 0
2026-03-10T06:22:29.912 INFO:tasks.workunit.client.1.vm06.stdout:3/805: truncate d6/dc/d13/d35/d101/f105 954804 0
2026-03-10T06:22:29.913 INFO:tasks.workunit.client.1.vm06.stdout:4/833: rename dd/fc2 to dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/dc6/ff2 0
2026-03-10T06:22:29.918 INFO:tasks.workunit.client.1.vm06.stdout:5/605: mkdir d8/db/d54/d67/d46/d68/dc1 0
2026-03-10T06:22:29.923 INFO:tasks.workunit.client.1.vm06.stdout:4/834: unlink dd/d24/d2d/d2f/d34/d40/f89 0
2026-03-10T06:22:29.925 INFO:tasks.workunit.client.1.vm06.stdout:9/800: dread d21/d27/f4b [0,4194304] 0
2026-03-10T06:22:29.925 INFO:tasks.workunit.client.1.vm06.stdout:5/606: mknod d8/db/d54/d67/d46/d68/dc1/cc2 0
2026-03-10T06:22:29.927 INFO:tasks.workunit.client.1.vm06.stdout:1/883: write d9/f2f [4732803,2077] 0
2026-03-10T06:22:29.927 INFO:tasks.workunit.client.1.vm06.stdout:8/665: write d1/d7/fa7 [1028603,110169] 0
2026-03-10T06:22:29.927 INFO:tasks.workunit.client.1.vm06.stdout:7/839: write d19/d3b/d41/d42/d62/d80/d82/fae [2160214,119396] 0
2026-03-10T06:22:29.933 INFO:tasks.workunit.client.1.vm06.stdout:4/835: rename dd/d41/f60 to dd/d18/d8e/daa/ff3 0
2026-03-10T06:22:29.934 INFO:tasks.workunit.client.1.vm06.stdout:5/607: creat d8/db/d57/d83/fc3 x:0 0 0
2026-03-10T06:22:29.936 INFO:tasks.workunit.client.1.vm06.stdout:8/666: mkdir d1/d3b/da9/ddb 0
2026-03-10T06:22:29.937 INFO:tasks.workunit.client.1.vm06.stdout:7/840: fdatasync d19/d3b/d41/d4c/f6e 0
2026-03-10T06:22:29.939 INFO:tasks.workunit.client.1.vm06.stdout:4/836: write dd/d24/d2d/d2f/f42 [7544594,58988] 0
2026-03-10T06:22:29.941 INFO:tasks.workunit.client.1.vm06.stdout:5/608: rmdir d8/db/d54/d8a/d74 39
2026-03-10T06:22:29.945 INFO:tasks.workunit.client.1.vm06.stdout:8/667: read d1/df/d20/d21/d7e/d8d/f9c [858643,17932] 0
2026-03-10T06:22:29.946 INFO:tasks.workunit.client.1.vm06.stdout:4/837: write dd/d33/da6/fef [138222,84164] 0
2026-03-10T06:22:29.951 INFO:tasks.workunit.client.1.vm06.stdout:3/806: dwrite d6/d1a/d5b/dbd/fc2 [4194304,4194304] 0
2026-03-10T06:22:29.955 INFO:tasks.workunit.client.1.vm06.stdout:9/801: dwrite d21/da2/de6/fc1 [4194304,4194304] 0
2026-03-10T06:22:29.955 INFO:tasks.workunit.client.1.vm06.stdout:1/884: link d9/d35/le3 d9/d35/d46/db0/l102 0
2026-03-10T06:22:29.957 INFO:tasks.workunit.client.1.vm06.stdout:5/609: sync
2026-03-10T06:22:29.958 INFO:tasks.workunit.client.1.vm06.stdout:8/668: sync
2026-03-10T06:22:29.966 INFO:tasks.workunit.client.1.vm06.stdout:1/885: dread - d9/d35/d46/d38/d63/fc2 zero size
2026-03-10T06:22:29.967 INFO:tasks.workunit.client.1.vm06.stdout:5/610: write d8/db/fbc [1448252,88139] 0
2026-03-10T06:22:29.967 
INFO:tasks.workunit.client.1.vm06.stdout:4/838: mknod dd/d41/cf4 0 2026-03-10T06:22:29.974 INFO:tasks.workunit.client.1.vm06.stdout:5/611: rmdir d8/db/d57/d83 39 2026-03-10T06:22:29.985 INFO:tasks.workunit.client.1.vm06.stdout:9/802: rename d21/d27/d50/d57/db2/def to d21/da2/da7/d93/dda/df4/d106 0 2026-03-10T06:22:29.985 INFO:tasks.workunit.client.1.vm06.stdout:1/886: mkdir d9/d1b/d20/d103 0 2026-03-10T06:22:29.985 INFO:tasks.workunit.client.1.vm06.stdout:1/887: dwrite d9/d62/f94 [0,4194304] 0 2026-03-10T06:22:29.994 INFO:tasks.workunit.client.1.vm06.stdout:5/612: mkdir d8/db/d54/d67/d46/dc4 0 2026-03-10T06:22:29.999 INFO:tasks.workunit.client.1.vm06.stdout:1/888: rmdir d9/d35/d46/d38/df3 0 2026-03-10T06:22:30.008 INFO:tasks.workunit.client.1.vm06.stdout:5/613: rmdir d8/db/d57/d83 39 2026-03-10T06:22:30.008 INFO:tasks.workunit.client.1.vm06.stdout:7/841: write f4 [7952851,4108] 0 2026-03-10T06:22:30.012 INFO:tasks.workunit.client.1.vm06.stdout:5/614: dread d8/db/d54/d8a/d39/d72/f9a [0,4194304] 0 2026-03-10T06:22:30.014 INFO:tasks.workunit.client.1.vm06.stdout:8/669: dwrite d1/df/d58/f86 [0,4194304] 0 2026-03-10T06:22:30.017 INFO:tasks.workunit.client.1.vm06.stdout:8/670: chown d1/df/d11 498568 1 2026-03-10T06:22:30.020 INFO:tasks.workunit.client.1.vm06.stdout:9/803: write d21/d27/d50/d57/db2/d80/f90 [645701,51632] 0 2026-03-10T06:22:30.020 INFO:tasks.workunit.client.1.vm06.stdout:5/615: fdatasync d8/db/d54/d8a/d74/f5a 0 2026-03-10T06:22:30.022 INFO:tasks.workunit.client.1.vm06.stdout:5/616: dread - d8/db/d54/d8a/d74/d90/fb5 zero size 2026-03-10T06:22:30.022 INFO:tasks.workunit.client.1.vm06.stdout:1/889: mknod d9/d35/d89/c104 0 2026-03-10T06:22:30.024 INFO:tasks.workunit.client.1.vm06.stdout:8/671: mkdir d1/d2c/d99/ddc 0 2026-03-10T06:22:30.037 INFO:tasks.workunit.client.1.vm06.stdout:9/804: truncate d21/f3e 2655450 0 2026-03-10T06:22:30.037 INFO:tasks.workunit.client.1.vm06.stdout:9/805: chown d21/ddb/fe9 3 1 2026-03-10T06:22:30.037 
INFO:tasks.workunit.client.1.vm06.stdout:8/672: dread - d1/df/d20/d21/d5e/d79/fda zero size 2026-03-10T06:22:30.038 INFO:tasks.workunit.client.1.vm06.stdout:6/856: dread d6/d79/d95/db4/dd4/df5/df3/f124 [0,4194304] 0 2026-03-10T06:22:30.039 INFO:tasks.workunit.client.1.vm06.stdout:9/806: symlink d21/d32/l107 0 2026-03-10T06:22:30.040 INFO:tasks.workunit.client.1.vm06.stdout:9/807: write d21/d27/d50/d57/db2/f8d [725163,127932] 0 2026-03-10T06:22:30.049 INFO:tasks.workunit.client.1.vm06.stdout:8/673: dwrite d1/df/d11/f47 [0,4194304] 0 2026-03-10T06:22:30.051 INFO:tasks.workunit.client.1.vm06.stdout:5/617: dwrite d8/db/d54/d8a/d39/f6a [0,4194304] 0 2026-03-10T06:22:30.057 INFO:tasks.workunit.client.1.vm06.stdout:9/808: symlink d21/d27/d56/df6/l108 0 2026-03-10T06:22:30.059 INFO:tasks.workunit.client.1.vm06.stdout:6/857: link d6/dd/dc2/d10d/db9/l111 d6/dd/dc2/d10d/dd0/dec/l126 0 2026-03-10T06:22:30.064 INFO:tasks.workunit.client.1.vm06.stdout:9/809: dwrite d21/da2/da7/f96 [4194304,4194304] 0 2026-03-10T06:22:30.067 INFO:tasks.workunit.client.1.vm06.stdout:9/810: dwrite d21/d27/d50/d57/db2/d80/d95/ff2 [0,4194304] 0 2026-03-10T06:22:30.073 INFO:tasks.workunit.client.1.vm06.stdout:8/674: dwrite d1/d2c/d5b/f7c [0,4194304] 0 2026-03-10T06:22:30.073 INFO:tasks.workunit.client.1.vm06.stdout:6/858: read - d6/df/fe9 zero size 2026-03-10T06:22:30.077 INFO:tasks.workunit.client.1.vm06.stdout:6/859: write d6/df/d40/d99/fb5 [244008,12942] 0 2026-03-10T06:22:30.089 INFO:tasks.workunit.client.1.vm06.stdout:5/618: creat d8/db/d54/d8a/d39/fc5 x:0 0 0 2026-03-10T06:22:30.095 INFO:tasks.workunit.client.1.vm06.stdout:8/675: dwrite d1/d3b/da9/fd8 [0,4194304] 0 2026-03-10T06:22:30.098 INFO:tasks.workunit.client.1.vm06.stdout:5/619: rmdir d8/db/d54/d55/d80 39 2026-03-10T06:22:30.104 INFO:tasks.workunit.client.1.vm06.stdout:7/842: fsync d19/d3b/d41/d42/f91 0 2026-03-10T06:22:30.111 INFO:tasks.workunit.client.1.vm06.stdout:6/860: dwrite d6/dd/d35/f2d [0,4194304] 0 2026-03-10T06:22:30.111 
INFO:tasks.workunit.client.1.vm06.stdout:7/843: mkdir d19/db0/d116 0 2026-03-10T06:22:30.111 INFO:tasks.workunit.client.1.vm06.stdout:7/844: rename d19/d3b/d41/d42/d62/dc4/df8 to d19/d3b/d41/d42/d62/d80/da1/d117 0 2026-03-10T06:22:30.114 INFO:tasks.workunit.client.1.vm06.stdout:6/861: dread d6/d79/d95/db4/dd4/fd5 [4194304,4194304] 0 2026-03-10T06:22:30.116 INFO:tasks.workunit.client.1.vm06.stdout:6/862: rmdir d6/df/d40/d99 39 2026-03-10T06:22:30.116 INFO:tasks.workunit.client.1.vm06.stdout:7/845: getdents d19/d3b/d5b 0 2026-03-10T06:22:30.117 INFO:tasks.workunit.client.1.vm06.stdout:7/846: chown d19/d3b/d41/d42/fd4 924 1 2026-03-10T06:22:30.117 INFO:tasks.workunit.client.1.vm06.stdout:6/863: chown d6/dd/d25/c8b 575 1 2026-03-10T06:22:30.118 INFO:tasks.workunit.client.1.vm06.stdout:7/847: chown d19/d3b/d41/d42/d62/f7c 197018 1 2026-03-10T06:22:30.133 INFO:tasks.workunit.client.1.vm06.stdout:4/839: dread dd/d24/d2d/d2f/f42 [0,4194304] 0 2026-03-10T06:22:30.135 INFO:tasks.workunit.client.1.vm06.stdout:7/848: creat d19/d3b/d41/d42/d62/d80/f118 x:0 0 0 2026-03-10T06:22:30.142 INFO:tasks.workunit.client.1.vm06.stdout:7/849: mknod d19/d3b/d41/da9/da5/c119 0 2026-03-10T06:22:30.147 INFO:tasks.workunit.client.1.vm06.stdout:7/850: rename d19/d3b/d5b/c5c to d19/d3b/d5b/c11a 0 2026-03-10T06:22:30.149 INFO:tasks.workunit.client.1.vm06.stdout:7/851: symlink d19/d3b/d41/da9/da5/l11b 0 2026-03-10T06:22:30.158 INFO:tasks.workunit.client.1.vm06.stdout:4/840: sync 2026-03-10T06:22:30.159 INFO:tasks.workunit.client.1.vm06.stdout:4/841: truncate dd/d24/d2d/fe5 650003 0 2026-03-10T06:22:30.169 INFO:tasks.workunit.client.1.vm06.stdout:4/842: dread dd/d33/d47/d97/db6/fed [0,4194304] 0 2026-03-10T06:22:30.172 INFO:tasks.workunit.client.1.vm06.stdout:4/843: dread dd/d24/d2d/d2f/d34/d83/fcc [0,4194304] 0 2026-03-10T06:22:30.174 INFO:tasks.workunit.client.1.vm06.stdout:4/844: chown dd/d18/f5f 5783 1 2026-03-10T06:22:30.198 INFO:tasks.workunit.client.1.vm06.stdout:8/676: dread d1/f26 
[0,4194304] 0 2026-03-10T06:22:30.199 INFO:tasks.workunit.client.1.vm06.stdout:8/677: readlink d1/df/d20/laf 0 2026-03-10T06:22:30.221 INFO:tasks.workunit.client.1.vm06.stdout:3/807: dread d6/d1a/d5b/dbd/fc2 [0,4194304] 0 2026-03-10T06:22:30.222 INFO:tasks.workunit.client.1.vm06.stdout:1/890: dread d9/d35/d46/d38/d63/f84 [0,4194304] 0 2026-03-10T06:22:30.230 INFO:tasks.workunit.client.1.vm06.stdout:5/620: dread d8/db/d54/d55/d80/fbb [0,4194304] 0 2026-03-10T06:22:30.237 INFO:tasks.workunit.client.1.vm06.stdout:4/845: dwrite dd/d24/d2d/d2f/f42 [0,4194304] 0 2026-03-10T06:22:30.239 INFO:tasks.workunit.client.1.vm06.stdout:6/864: dwrite d6/df/f1e [0,4194304] 0 2026-03-10T06:22:30.242 INFO:tasks.workunit.client.1.vm06.stdout:6/865: stat d6/df/d9f 0 2026-03-10T06:22:30.248 INFO:tasks.workunit.client.1.vm06.stdout:7/852: dwrite d19/d3b/d5b/f7f [0,4194304] 0 2026-03-10T06:22:30.263 INFO:tasks.workunit.client.1.vm06.stdout:4/846: stat dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/dc6/dca 0 2026-03-10T06:22:30.270 INFO:tasks.workunit.client.1.vm06.stdout:4/847: getdents dd/d24/d2d/d2f/d39/d71 0 2026-03-10T06:22:30.287 INFO:tasks.workunit.client.1.vm06.stdout:4/848: read dd/d33/d36/fc5 [942772,32544] 0 2026-03-10T06:22:30.308 INFO:tasks.workunit.client.1.vm06.stdout:4/849: sync 2026-03-10T06:22:30.316 INFO:tasks.workunit.client.1.vm06.stdout:9/811: dread d21/d46/fb9 [0,4194304] 0 2026-03-10T06:22:30.316 INFO:tasks.workunit.client.1.vm06.stdout:8/678: dread d1/df/d20/f51 [0,4194304] 0 2026-03-10T06:22:30.318 INFO:tasks.workunit.client.1.vm06.stdout:2/687: dread da/d13/d1a/dc7/daf/d56/db9/f77 [0,4194304] 0 2026-03-10T06:22:30.325 INFO:tasks.workunit.client.1.vm06.stdout:0/843: dread d0/f9 [0,4194304] 0 2026-03-10T06:22:30.326 INFO:tasks.workunit.client.1.vm06.stdout:5/621: truncate d8/db/d54/f88 3245917 0 2026-03-10T06:22:30.328 INFO:tasks.workunit.client.1.vm06.stdout:2/688: read da/d13/d1c/f7e [2204410,1963] 0 2026-03-10T06:22:30.328 INFO:tasks.workunit.client.1.vm06.stdout:4/850: 
dwrite dd/d24/d2d/f5a [0,4194304] 0 2026-03-10T06:22:30.329 INFO:tasks.workunit.client.1.vm06.stdout:5/622: chown d8/db/d54/d67/c5b 462209 1 2026-03-10T06:22:30.329 INFO:tasks.workunit.client.1.vm06.stdout:8/679: dwrite d1/df/d11/f45 [0,4194304] 0 2026-03-10T06:22:30.332 INFO:tasks.workunit.client.1.vm06.stdout:0/844: getdents d0/dd/d14 0 2026-03-10T06:22:30.334 INFO:tasks.workunit.client.1.vm06.stdout:0/845: readlink d0/dd/d14/d18/d85/dcc/d88/d98/lfb 0 2026-03-10T06:22:30.335 INFO:tasks.workunit.client.1.vm06.stdout:2/689: read da/d13/f1f [1715059,89466] 0 2026-03-10T06:22:30.335 INFO:tasks.workunit.client.1.vm06.stdout:3/808: rename d6/d21/fda to d6/dc/f10d 0 2026-03-10T06:22:30.339 INFO:tasks.workunit.client.1.vm06.stdout:0/846: dread - d0/dd/d14/d18/d7e/dd0/f118 zero size 2026-03-10T06:22:30.340 INFO:tasks.workunit.client.1.vm06.stdout:5/623: dread d8/db/d54/d8a/d39/f6a [0,4194304] 0 2026-03-10T06:22:30.346 INFO:tasks.workunit.client.1.vm06.stdout:1/891: rename d9/d35/d46/d38/d63/d83/dc5/dd5/fed to d9/d62/f105 0 2026-03-10T06:22:30.352 INFO:tasks.workunit.client.1.vm06.stdout:0/847: unlink d0/dd/d14/d18/d85/d107/c117 0 2026-03-10T06:22:30.352 INFO:tasks.workunit.client.1.vm06.stdout:2/690: creat da/d13/d1a/dc7/daf/d56/db9/fdc x:0 0 0 2026-03-10T06:22:30.356 INFO:tasks.workunit.client.1.vm06.stdout:3/809: mknod d6/def/d10a/c10e 0 2026-03-10T06:22:30.356 INFO:tasks.workunit.client.1.vm06.stdout:4/851: dwrite dd/fa7 [0,4194304] 0 2026-03-10T06:22:30.361 INFO:tasks.workunit.client.1.vm06.stdout:2/691: mkdir da/d13/d1a/dc7/daf/d56/ddd 0 2026-03-10T06:22:30.366 INFO:tasks.workunit.client.1.vm06.stdout:4/852: stat dd/d24/d2d/d2f/d39/d71/dc3/dd0/ce3 0 2026-03-10T06:22:30.367 INFO:tasks.workunit.client.1.vm06.stdout:9/812: rmdir d21/d27/d50/d57 39 2026-03-10T06:22:30.373 INFO:tasks.workunit.client.1.vm06.stdout:2/692: dread - da/fa7 zero size 2026-03-10T06:22:30.373 INFO:tasks.workunit.client.1.vm06.stdout:9/813: truncate fd 4997210 0 2026-03-10T06:22:30.373 
INFO:tasks.workunit.client.1.vm06.stdout:3/810: creat d6/d8/d7f/da1/dfe/f10f x:0 0 0 2026-03-10T06:22:30.374 INFO:tasks.workunit.client.1.vm06.stdout:0/848: link d0/dd/d14/d18/d85/dcc/d99/f9c d0/d3c/dc1/d7d/f11a 0 2026-03-10T06:22:30.374 INFO:tasks.workunit.client.1.vm06.stdout:2/693: write da/d13/d5e/f9a [4372807,102491] 0 2026-03-10T06:22:30.378 INFO:tasks.workunit.client.1.vm06.stdout:3/811: creat d6/dc/d13/d35/f110 x:0 0 0 2026-03-10T06:22:30.381 INFO:tasks.workunit.client.1.vm06.stdout:0/849: chown d0/dd/d14/d18/d7e/dd0/f118 466 1 2026-03-10T06:22:30.382 INFO:tasks.workunit.client.1.vm06.stdout:4/853: dread dd/d33/d47/f88 [0,4194304] 0 2026-03-10T06:22:30.382 INFO:tasks.workunit.client.1.vm06.stdout:3/812: unlink d6/dc/d13/f9f 0 2026-03-10T06:22:30.382 INFO:tasks.workunit.client.1.vm06.stdout:3/813: creat d6/d4f/f111 x:0 0 0 2026-03-10T06:22:30.385 INFO:tasks.workunit.client.1.vm06.stdout:9/814: getdents d21/d27/d50 0 2026-03-10T06:22:30.390 INFO:tasks.workunit.client.1.vm06.stdout:4/854: symlink dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/lf5 0 2026-03-10T06:22:30.390 INFO:tasks.workunit.client.1.vm06.stdout:3/814: mknod d6/d1a/d5b/c112 0 2026-03-10T06:22:30.393 INFO:tasks.workunit.client.1.vm06.stdout:4/855: mkdir dd/d24/d2d/d2f/d34/d40/df6 0 2026-03-10T06:22:30.393 INFO:tasks.workunit.client.1.vm06.stdout:3/815: write d6/dc/d13/d35/d101/d88/dde/fe8 [639661,55792] 0 2026-03-10T06:22:30.395 INFO:tasks.workunit.client.1.vm06.stdout:9/815: link d21/d27/d50/d57/l6d d21/d27/d50/d57/dcd/de4/dee/l109 0 2026-03-10T06:22:30.398 INFO:tasks.workunit.client.1.vm06.stdout:3/816: fdatasync d6/d8/d7f/da1/dfe/f10f 0 2026-03-10T06:22:30.399 INFO:tasks.workunit.client.1.vm06.stdout:4/856: creat dd/d18/ff7 x:0 0 0 2026-03-10T06:22:30.400 INFO:tasks.workunit.client.1.vm06.stdout:9/816: write d21/d46/fe0 [1070952,77850] 0 2026-03-10T06:22:30.403 INFO:tasks.workunit.client.1.vm06.stdout:3/817: fsync d6/dc/d13/d35/d101/f105 0 2026-03-10T06:22:30.404 
INFO:tasks.workunit.client.1.vm06.stdout:9/817: chown d21/d32/d4d/l66 395646 1 2026-03-10T06:22:30.408 INFO:tasks.workunit.client.1.vm06.stdout:3/818: chown d6/dc/d13/d9d/d54/fb8 20 1 2026-03-10T06:22:30.411 INFO:tasks.workunit.client.1.vm06.stdout:9/818: dread d21/d32/f101 [0,4194304] 0 2026-03-10T06:22:30.425 INFO:tasks.workunit.client.1.vm06.stdout:9/819: mkdir d21/d27/d50/d57/db2/d80/d95/d9b/dd0/d10a 0 2026-03-10T06:22:30.428 INFO:tasks.workunit.client.1.vm06.stdout:1/892: unlink d9/d35/f5c 0 2026-03-10T06:22:30.428 INFO:tasks.workunit.client.1.vm06.stdout:6/866: creat d6/dd/dc2/d10d/f127 x:0 0 0 2026-03-10T06:22:30.430 INFO:tasks.workunit.client.1.vm06.stdout:3/819: truncate d6/d21/fb4 3126624 0 2026-03-10T06:22:30.432 INFO:tasks.workunit.client.1.vm06.stdout:9/820: unlink d21/d27/d3a/cdf 0 2026-03-10T06:22:30.432 INFO:tasks.workunit.client.1.vm06.stdout:3/820: fsync d6/f63 0 2026-03-10T06:22:30.432 INFO:tasks.workunit.client.1.vm06.stdout:9/821: stat d21/d32/d4d/d51/cb8 0 2026-03-10T06:22:30.433 INFO:tasks.workunit.client.1.vm06.stdout:6/867: creat d6/dd/dc7/f128 x:0 0 0 2026-03-10T06:22:30.434 INFO:tasks.workunit.client.1.vm06.stdout:6/868: dread - d6/dd/d35/fca zero size 2026-03-10T06:22:30.436 INFO:tasks.workunit.client.1.vm06.stdout:3/821: symlink d6/dc/d13/d35/d101/dd0/dd1/d90/l113 0 2026-03-10T06:22:30.440 INFO:tasks.workunit.client.1.vm06.stdout:3/822: mknod d6/dc/de5/c114 0 2026-03-10T06:22:30.441 INFO:tasks.workunit.client.1.vm06.stdout:6/869: dwrite d6/d7/f2a [4194304,4194304] 0 2026-03-10T06:22:30.442 INFO:tasks.workunit.client.1.vm06.stdout:2/694: dread da/d13/d1a/dc7/daf/d56/db9/fc2 [0,4194304] 0 2026-03-10T06:22:30.443 INFO:tasks.workunit.client.1.vm06.stdout:9/822: dread d21/d32/fcf [0,4194304] 0 2026-03-10T06:22:30.448 INFO:tasks.workunit.client.1.vm06.stdout:9/823: stat d21/d32/d4d/l8f 0 2026-03-10T06:22:30.448 INFO:tasks.workunit.client.1.vm06.stdout:2/695: chown da/d13/d1a/dc7/daf/d56/db7 406732 1 2026-03-10T06:22:30.449 
INFO:tasks.workunit.client.1.vm06.stdout:6/870: symlink d6/df/d70/l129 0 2026-03-10T06:22:30.453 INFO:tasks.workunit.client.1.vm06.stdout:3/823: symlink d6/dc/l115 0 2026-03-10T06:22:30.456 INFO:tasks.workunit.client.1.vm06.stdout:2/696: mkdir da/d13/d1a/dc7/daf/d56/db7/dde 0 2026-03-10T06:22:30.460 INFO:tasks.workunit.client.1.vm06.stdout:3/824: mknod d6/dc/d13/d35/d101/d88/dae/c116 0 2026-03-10T06:22:30.462 INFO:tasks.workunit.client.1.vm06.stdout:2/697: fsync da/d13/d1c/f41 0 2026-03-10T06:22:30.466 INFO:tasks.workunit.client.1.vm06.stdout:6/871: truncate d6/dd/d25/d2c/f32 5543604 0 2026-03-10T06:22:30.467 INFO:tasks.workunit.client.1.vm06.stdout:9/824: getdents d21/d46/ded 0 2026-03-10T06:22:30.469 INFO:tasks.workunit.client.1.vm06.stdout:7/853: rename d19/d3b/d41/d72 to d19/d3b/d41/d42/d10b/d11c 0 2026-03-10T06:22:30.472 INFO:tasks.workunit.client.1.vm06.stdout:7/854: dwrite d19/d3b/d41/f66 [4194304,4194304] 0 2026-03-10T06:22:30.477 INFO:tasks.workunit.client.1.vm06.stdout:6/872: rmdir d6/dd/d25/d33/d5a/dae 39 2026-03-10T06:22:30.477 INFO:tasks.workunit.client.1.vm06.stdout:6/873: fdatasync d6/dd/d35/f97 0 2026-03-10T06:22:30.479 INFO:tasks.workunit.client.1.vm06.stdout:9/825: mknod d21/d46/ded/c10b 0 2026-03-10T06:22:30.485 INFO:tasks.workunit.client.1.vm06.stdout:5/624: truncate d8/db/fbc 852245 0 2026-03-10T06:22:30.486 INFO:tasks.workunit.client.1.vm06.stdout:5/625: write d8/db/d54/d8a/d74/f71 [1751214,115542] 0 2026-03-10T06:22:30.488 INFO:tasks.workunit.client.1.vm06.stdout:5/626: readlink d8/l1d 0 2026-03-10T06:22:30.488 INFO:tasks.workunit.client.1.vm06.stdout:0/850: dwrite d0/dd/f5b [0,4194304] 0 2026-03-10T06:22:30.489 INFO:tasks.workunit.client.1.vm06.stdout:5/627: chown d8/db/d57/la9 0 1 2026-03-10T06:22:30.489 INFO:tasks.workunit.client.1.vm06.stdout:5/628: stat d8/d9/lb1 0 2026-03-10T06:22:30.492 INFO:tasks.workunit.client.1.vm06.stdout:8/680: rename d1/d2c/l33 to d1/d3b/db3/ldd 0 2026-03-10T06:22:30.500 
INFO:tasks.workunit.client.1.vm06.stdout:4/857: truncate dd/d33/d36/fba 1534351 0 2026-03-10T06:22:30.500 INFO:tasks.workunit.client.1.vm06.stdout:6/874: creat d6/dd/dc2/f12a x:0 0 0 2026-03-10T06:22:30.500 INFO:tasks.workunit.client.1.vm06.stdout:9/826: mknod d21/d32/d4d/dd2/c10c 0 2026-03-10T06:22:30.501 INFO:tasks.workunit.client.1.vm06.stdout:0/851: symlink d0/da3/dd5/l11b 0 2026-03-10T06:22:30.503 INFO:tasks.workunit.client.1.vm06.stdout:1/893: truncate d9/d35/f57 4018052 0 2026-03-10T06:22:30.503 INFO:tasks.workunit.client.1.vm06.stdout:5/629: read - d8/db/d57/d83/fa6 zero size 2026-03-10T06:22:30.504 INFO:tasks.workunit.client.1.vm06.stdout:2/698: rename da/d13/d1c/d1d/d44/d53 to da/d13/d1c/d7d/ddf 0 2026-03-10T06:22:30.504 INFO:tasks.workunit.client.1.vm06.stdout:5/630: chown d8/db/c19 3 1 2026-03-10T06:22:30.511 INFO:tasks.workunit.client.1.vm06.stdout:5/631: chown d8/db/d54/d8a/d74/f42 16427286 1 2026-03-10T06:22:30.516 INFO:tasks.workunit.client.1.vm06.stdout:2/699: mkdir da/d13/d1c/d7d/ddf/d61/d68/de0 0 2026-03-10T06:22:30.516 INFO:tasks.workunit.client.1.vm06.stdout:0/852: readlink d0/d3c/dc1/l93 0 2026-03-10T06:22:30.517 INFO:tasks.workunit.client.1.vm06.stdout:8/681: sync 2026-03-10T06:22:30.518 INFO:tasks.workunit.client.1.vm06.stdout:9/827: creat d21/f10d x:0 0 0 2026-03-10T06:22:30.519 INFO:tasks.workunit.client.1.vm06.stdout:6/875: dread d6/dd/dc2/d10d/ffa [0,4194304] 0 2026-03-10T06:22:30.525 INFO:tasks.workunit.client.1.vm06.stdout:6/876: readlink d6/l9d 0 2026-03-10T06:22:30.525 INFO:tasks.workunit.client.1.vm06.stdout:2/700: creat da/d13/d1a/dc7/daf/d56/dd4/fe1 x:0 0 0 2026-03-10T06:22:30.526 INFO:tasks.workunit.client.1.vm06.stdout:9/828: dread d21/d27/d50/d57/fae [0,4194304] 0 2026-03-10T06:22:30.527 INFO:tasks.workunit.client.1.vm06.stdout:9/829: stat d21/d27/d50/d57 0 2026-03-10T06:22:30.528 INFO:tasks.workunit.client.1.vm06.stdout:3/825: dwrite d6/dc/d13/d35/d101/d88/f7d [0,4194304] 0 2026-03-10T06:22:30.529 
INFO:tasks.workunit.client.1.vm06.stdout:8/682: unlink d1/df/d20/cb1 0 2026-03-10T06:22:30.534 INFO:tasks.workunit.client.1.vm06.stdout:2/701: mkdir da/d13/d1c/d1d/d44/d46/de2 0 2026-03-10T06:22:30.537 INFO:tasks.workunit.client.1.vm06.stdout:8/683: dwrite d1/f13 [4194304,4194304] 0 2026-03-10T06:22:30.542 INFO:tasks.workunit.client.1.vm06.stdout:8/684: sync 2026-03-10T06:22:30.543 INFO:tasks.workunit.client.1.vm06.stdout:2/702: creat da/d13/d5e/fe3 x:0 0 0 2026-03-10T06:22:30.544 INFO:tasks.workunit.client.1.vm06.stdout:2/703: write da/d13/d1c/d7d/f81 [2383511,23578] 0 2026-03-10T06:22:30.544 INFO:tasks.workunit.client.1.vm06.stdout:2/704: readlink da/d13/d1c/d7d/ddf/d61/d68/lce 0 2026-03-10T06:22:30.548 INFO:tasks.workunit.client.1.vm06.stdout:6/877: rename d6/df/c86 to d6/df/d9f/c12b 0 2026-03-10T06:22:30.548 INFO:tasks.workunit.client.1.vm06.stdout:3/826: mkdir d6/dc/d13/d35/d101/d88/dae/dec/d117 0 2026-03-10T06:22:30.549 INFO:tasks.workunit.client.1.vm06.stdout:8/685: readlink d1/df/d58/db5/lc1 0 2026-03-10T06:22:30.551 INFO:tasks.workunit.client.1.vm06.stdout:9/830: read d21/d27/d50/d57/db2/d80/d95/fc4 [373680,25478] 0 2026-03-10T06:22:30.551 INFO:tasks.workunit.client.1.vm06.stdout:9/831: read ff [5022818,18525] 0 2026-03-10T06:22:30.552 INFO:tasks.workunit.client.1.vm06.stdout:6/878: mkdir d6/dd/d25/d2c/d12c 0 2026-03-10T06:22:30.553 INFO:tasks.workunit.client.1.vm06.stdout:9/832: creat d21/d32/d4d/d51/db0/f10e x:0 0 0 2026-03-10T06:22:30.556 INFO:tasks.workunit.client.1.vm06.stdout:8/686: symlink d1/df/d11/da1/lde 0 2026-03-10T06:22:30.556 INFO:tasks.workunit.client.1.vm06.stdout:3/827: creat d6/dc/d13/f118 x:0 0 0 2026-03-10T06:22:30.558 INFO:tasks.workunit.client.1.vm06.stdout:8/687: creat d1/df/d58/db5/fdf x:0 0 0 2026-03-10T06:22:30.560 INFO:tasks.workunit.client.1.vm06.stdout:9/833: rename d21/d27/c41 to d21/d27/d50/d57/db2/d80/c10f 0 2026-03-10T06:22:30.560 INFO:tasks.workunit.client.1.vm06.stdout:8/688: fdatasync d1/df/d11/da1/fa2 0 
2026-03-10T06:22:30.561 INFO:tasks.workunit.client.1.vm06.stdout:8/689: chown d1/df/d11/f59 12533 1 2026-03-10T06:22:30.564 INFO:tasks.workunit.client.1.vm06.stdout:3/828: symlink d6/dc/d13/d35/d101/d88/l119 0 2026-03-10T06:22:30.564 INFO:tasks.workunit.client.1.vm06.stdout:9/834: write f14 [5387402,112349] 0 2026-03-10T06:22:30.565 INFO:tasks.workunit.client.1.vm06.stdout:8/690: rmdir d1/d2c 39 2026-03-10T06:22:30.569 INFO:tasks.workunit.client.1.vm06.stdout:3/829: chown d6/dc/d13/d35/f4e 99286 1 2026-03-10T06:22:30.572 INFO:tasks.workunit.client.1.vm06.stdout:6/879: link d6/d79/d95/db4/dd4/df5/l125 d6/l12d 0 2026-03-10T06:22:30.572 INFO:tasks.workunit.client.1.vm06.stdout:8/691: rmdir d1/d2c 39 2026-03-10T06:22:30.573 INFO:tasks.workunit.client.1.vm06.stdout:3/830: rmdir d6/def/d10a 39 2026-03-10T06:22:30.573 INFO:tasks.workunit.client.1.vm06.stdout:3/831: chown d6/d21/dbc 2002595374 1 2026-03-10T06:22:30.581 INFO:tasks.workunit.client.1.vm06.stdout:3/832: creat d6/dc/d13/d9d/f11a x:0 0 0 2026-03-10T06:22:30.582 INFO:tasks.workunit.client.1.vm06.stdout:8/692: truncate d1/d2c/d90/fcb 474015 0 2026-03-10T06:22:30.582 INFO:tasks.workunit.client.1.vm06.stdout:8/693: chown d1/df/d11/f4a 429092 1 2026-03-10T06:22:30.583 INFO:tasks.workunit.client.1.vm06.stdout:6/880: dread d6/d79/fe2 [0,4194304] 0 2026-03-10T06:22:30.585 INFO:tasks.workunit.client.1.vm06.stdout:8/694: truncate d1/d3b/f98 8580677 0 2026-03-10T06:22:30.587 INFO:tasks.workunit.client.1.vm06.stdout:6/881: mkdir d6/dd/dc7/d12e 0 2026-03-10T06:22:30.587 INFO:tasks.workunit.client.1.vm06.stdout:8/695: symlink d1/d2c/d99/le0 0 2026-03-10T06:22:30.595 INFO:tasks.workunit.client.1.vm06.stdout:8/696: truncate d1/d3b/d5c/f6f 489535 0 2026-03-10T06:22:30.596 INFO:tasks.workunit.client.1.vm06.stdout:8/697: mknod d1/df/d20/d35/ce1 0 2026-03-10T06:22:30.598 INFO:tasks.workunit.client.1.vm06.stdout:8/698: dwrite d1/df/d20/d35/dac/dbf/fc9 [0,4194304] 0 2026-03-10T06:22:30.600 
INFO:tasks.workunit.client.1.vm06.stdout:6/882: getdents d6/d79/d95/db4/dd4/df5 0 2026-03-10T06:22:30.603 INFO:tasks.workunit.client.1.vm06.stdout:8/699: creat d1/d2c/d99/dc0/fe2 x:0 0 0 2026-03-10T06:22:30.605 INFO:tasks.workunit.client.1.vm06.stdout:8/700: mkdir d1/d3b/d5c/de3 0 2026-03-10T06:22:30.607 INFO:tasks.workunit.client.1.vm06.stdout:8/701: write d1/d2c/d90/fcb [1496210,78663] 0 2026-03-10T06:22:30.607 INFO:tasks.workunit.client.1.vm06.stdout:6/883: write d6/df/d40/d99/fb5 [1586104,123202] 0 2026-03-10T06:22:30.611 INFO:tasks.workunit.client.1.vm06.stdout:8/702: dwrite d1/df/d20/d21/d5e/f70 [0,4194304] 0 2026-03-10T06:22:30.616 INFO:tasks.workunit.client.1.vm06.stdout:6/884: dwrite d6/dd/d25/d4e/f5f [4194304,4194304] 0 2026-03-10T06:22:30.630 INFO:tasks.workunit.client.1.vm06.stdout:6/885: dwrite d6/dd/d35/f2d [8388608,4194304] 0 2026-03-10T06:22:30.639 INFO:tasks.workunit.client.1.vm06.stdout:6/886: chown d6/df/l116 45742486 1 2026-03-10T06:22:30.642 INFO:tasks.workunit.client.1.vm06.stdout:7/855: dwrite d19/d3b/d41/d42/fd4 [0,4194304] 0 2026-03-10T06:22:30.647 INFO:tasks.workunit.client.1.vm06.stdout:4/858: truncate dd/d18/f1f 7337304 0 2026-03-10T06:22:30.650 INFO:tasks.workunit.client.1.vm06.stdout:7/856: creat d19/f11d x:0 0 0 2026-03-10T06:22:30.651 INFO:tasks.workunit.client.1.vm06.stdout:1/894: dwrite d9/d35/f57 [4194304,4194304] 0 2026-03-10T06:22:30.659 INFO:tasks.workunit.client.1.vm06.stdout:1/895: dread d9/d35/f57 [4194304,4194304] 0 2026-03-10T06:22:30.665 INFO:tasks.workunit.client.1.vm06.stdout:7/857: dwrite d19/d3b/dde/fe5 [0,4194304] 0 2026-03-10T06:22:30.665 INFO:tasks.workunit.client.1.vm06.stdout:4/859: getdents dd/d24/d2d/d7c 0 2026-03-10T06:22:30.666 INFO:tasks.workunit.client.1.vm06.stdout:7/858: write d19/d3b/d41/d42/fd4 [280525,60310] 0 2026-03-10T06:22:30.673 INFO:tasks.workunit.client.1.vm06.stdout:4/860: mkdir dd/d18/df8 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:1/896: symlink d9/d1b/d20/db3/l106 0 
2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:7/859: creat d19/d3b/d41/d42/d52/d83/d9d/da8/df4/f11e x:0 0 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:4/861: mkdir dd/d18/df9 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:7/860: symlink d19/d3b/d41/d42/d10b/d11c/de0/de2/l11f 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:4/862: write dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/f67 [2094701,65445] 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:7/861: write d19/d3b/d41/da9/da5/fa6 [3999496,73025] 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:4/863: creat dd/d24/d2d/d2f/d39/ffa x:0 0 0 2026-03-10T06:22:30.684 INFO:tasks.workunit.client.1.vm06.stdout:7/862: symlink d19/d3b/d5b/l120 0 2026-03-10T06:22:30.685 INFO:tasks.workunit.client.1.vm06.stdout:4/864: dread - dd/d24/d2d/d2f/d34/d40/fbc zero size 2026-03-10T06:22:30.688 INFO:tasks.workunit.client.1.vm06.stdout:4/865: write dd/f81 [2424727,29150] 0 2026-03-10T06:22:30.690 INFO:tasks.workunit.client.1.vm06.stdout:1/897: dread d9/d62/f8a [0,4194304] 0 2026-03-10T06:22:30.697 INFO:tasks.workunit.client.1.vm06.stdout:1/898: creat d9/d35/d46/d38/ddd/f107 x:0 0 0 2026-03-10T06:22:30.707 INFO:tasks.workunit.client.1.vm06.stdout:7/863: dread d19/d3b/d41/d42/d62/f86 [0,4194304] 0 2026-03-10T06:22:30.707 INFO:tasks.workunit.client.1.vm06.stdout:4/866: dwrite dd/d24/d2d/fe5 [0,4194304] 0 2026-03-10T06:22:30.707 INFO:tasks.workunit.client.1.vm06.stdout:7/864: creat d19/d3b/d41/d42/d62/f121 x:0 0 0 2026-03-10T06:22:30.707 INFO:tasks.workunit.client.1.vm06.stdout:7/865: read d19/d3b/d41/d42/d62/f86 [1651646,20417] 0 2026-03-10T06:22:30.746 INFO:tasks.workunit.client.1.vm06.stdout:9/835: dread d21/d32/d4d/f64 [0,4194304] 0 2026-03-10T06:22:30.748 INFO:tasks.workunit.client.1.vm06.stdout:9/836: creat d21/d32/f110 x:0 0 0 2026-03-10T06:22:30.750 INFO:tasks.workunit.client.1.vm06.stdout:9/837: link d21/d32/d4d/fbd 
d21/d27/d50/d57/dcd/de4/f111 0 2026-03-10T06:22:30.752 INFO:tasks.workunit.client.1.vm06.stdout:9/838: rename d21/d27 to d21/d27/d50/d57/db2/d7f/d112 22 2026-03-10T06:22:30.752 INFO:tasks.workunit.client.1.vm06.stdout:9/839: fdatasync fe 0 2026-03-10T06:22:30.756 INFO:tasks.workunit.client.1.vm06.stdout:9/840: creat d21/d27/d50/d57/dcd/de4/dee/f113 x:0 0 0 2026-03-10T06:22:30.760 INFO:tasks.workunit.client.1.vm06.stdout:9/841: symlink d21/da2/de6/l114 0 2026-03-10T06:22:30.760 INFO:tasks.workunit.client.1.vm06.stdout:9/842: readlink d21/d32/d4d/l8f 0 2026-03-10T06:22:30.760 INFO:tasks.workunit.client.1.vm06.stdout:9/843: write d21/d27/d50/d57/db2/d80/d95/d9b/fcc [2134176,56229] 0 2026-03-10T06:22:30.760 INFO:tasks.workunit.client.1.vm06.stdout:9/844: write d21/d27/d50/d57/db2/d80/f90 [404617,72210] 0 2026-03-10T06:22:30.769 INFO:tasks.workunit.client.1.vm06.stdout:7/866: dread f15 [0,4194304] 0 2026-03-10T06:22:30.774 INFO:tasks.workunit.client.1.vm06.stdout:7/867: dread - d19/d3b/d41/d42/d10b/d11c/d97/fb3 zero size 2026-03-10T06:22:30.775 INFO:tasks.workunit.client.1.vm06.stdout:7/868: fdatasync d19/d3b/d41/f48 0 2026-03-10T06:22:30.776 INFO:tasks.workunit.client.1.vm06.stdout:7/869: write d19/d3b/d41/d42/f91 [1409984,91085] 0 2026-03-10T06:22:30.777 INFO:tasks.workunit.client.1.vm06.stdout:7/870: read d19/d3b/d41/d4c/ffc [1002167,117198] 0 2026-03-10T06:22:30.778 INFO:tasks.workunit.client.1.vm06.stdout:7/871: mknod d19/d3b/d41/d42/d10b/d11c/de0/c122 0 2026-03-10T06:22:30.782 INFO:tasks.workunit.client.1.vm06.stdout:7/872: dwrite d19/d3b/fee [0,4194304] 0 2026-03-10T06:22:30.789 INFO:tasks.workunit.client.1.vm06.stdout:7/873: mknod d19/d3b/d41/d42/d52/d83/c123 0 2026-03-10T06:22:30.790 INFO:tasks.workunit.client.1.vm06.stdout:7/874: write d19/d3b/dde/fdc [1533327,47662] 0 2026-03-10T06:22:30.793 INFO:tasks.workunit.client.1.vm06.stdout:7/875: rename d19/d3b/d5b/l76 to d19/d3b/d41/d42/d10b/d10d/l124 0 2026-03-10T06:22:30.793 
INFO:tasks.workunit.client.1.vm06.stdout:7/876: readlink d19/d3b/l57 0 2026-03-10T06:22:30.800 INFO:tasks.workunit.client.1.vm06.stdout:7/877: dwrite d19/d3b/d5b/f108 [0,4194304] 0 2026-03-10T06:22:30.800 INFO:tasks.workunit.client.1.vm06.stdout:7/878: chown d19/d3b/d41/c5a 4498 1 2026-03-10T06:22:30.810 INFO:tasks.workunit.client.1.vm06.stdout:7/879: getdents d19/d3b/d41/d4c 0 2026-03-10T06:22:30.821 INFO:tasks.workunit.client.1.vm06.stdout:0/853: dwrite d0/dd/d14/d18/d85/dcc/d99/f119 [0,4194304] 0 2026-03-10T06:22:30.827 INFO:tasks.workunit.client.1.vm06.stdout:8/703: read d1/df/d20/f63 [1503290,119955] 0 2026-03-10T06:22:30.828 INFO:tasks.workunit.client.1.vm06.stdout:8/704: write d1/df/d20/d21/f37 [860499,80198] 0 2026-03-10T06:22:30.835 INFO:tasks.workunit.client.1.vm06.stdout:5/632: truncate d8/db/d54/d8a/d39/d72/f8b 2914314 0 2026-03-10T06:22:30.837 INFO:tasks.workunit.client.1.vm06.stdout:5/633: write d8/db/d54/d8a/d74/f66 [1401401,29025] 0 2026-03-10T06:22:30.842 INFO:tasks.workunit.client.1.vm06.stdout:5/634: rmdir d8/db/d54/d8a/d39/d72 39 2026-03-10T06:22:30.842 INFO:tasks.workunit.client.1.vm06.stdout:2/705: dwrite da/d13/d1a/f3a [0,4194304] 0 2026-03-10T06:22:30.843 INFO:tasks.workunit.client.1.vm06.stdout:3/833: dwrite d6/d21/f7b [4194304,4194304] 0 2026-03-10T06:22:30.843 INFO:tasks.workunit.client.1.vm06.stdout:8/705: link d1/fa d1/df/d11/fe4 0 2026-03-10T06:22:30.844 INFO:tasks.workunit.client.1.vm06.stdout:0/854: dwrite d0/dd/d14/d18/d85/dcc/d5e/f86 [0,4194304] 0 2026-03-10T06:22:30.849 INFO:tasks.workunit.client.1.vm06.stdout:6/887: write d6/dd/d25/f69 [1050892,79215] 0 2026-03-10T06:22:30.867 INFO:tasks.workunit.client.1.vm06.stdout:5/635: dread d8/db/d54/d8a/d39/d6c/f91 [0,4194304] 0 2026-03-10T06:22:30.871 INFO:tasks.workunit.client.1.vm06.stdout:0/855: mkdir d0/d3c/dc1/d7d/d10e/d11c 0 2026-03-10T06:22:30.872 INFO:tasks.workunit.client.1.vm06.stdout:6/888: mknod d6/df/d40/c12f 0 2026-03-10T06:22:30.873 
INFO:tasks.workunit.client.1.vm06.stdout:2/706: rename da/d13/d1a/dc7/daf/d56/db9/f77 to da/d13/fe4 0 2026-03-10T06:22:30.875 INFO:tasks.workunit.client.1.vm06.stdout:5/636: mknod d8/db/d54/d8a/d39/cc6 0 2026-03-10T06:22:30.877 INFO:tasks.workunit.client.1.vm06.stdout:6/889: mkdir d6/dd/d25/d4e/d130 0 2026-03-10T06:22:30.877 INFO:tasks.workunit.client.1.vm06.stdout:0/856: truncate d0/dd/d14/d18/d85/dcc/d88/d98/fbc 435507 0 2026-03-10T06:22:30.877 INFO:tasks.workunit.client.1.vm06.stdout:2/707: fdatasync da/d13/d5e/fe3 0 2026-03-10T06:22:30.881 INFO:tasks.workunit.client.1.vm06.stdout:6/890: unlink d6/d7/d37/l50 0 2026-03-10T06:22:30.884 INFO:tasks.workunit.client.1.vm06.stdout:3/834: rename d6/dc/d13/d35/d101/dd0/d107 to d6/dc/d13/d35/d101/d11b 0 2026-03-10T06:22:30.886 INFO:tasks.workunit.client.1.vm06.stdout:6/891: mknod d6/dd/dc2/d10d/db9/c131 0 2026-03-10T06:22:30.886 INFO:tasks.workunit.client.1.vm06.stdout:0/857: dwrite d0/dd/d14/d18/d7e/fce [0,4194304] 0 2026-03-10T06:22:30.886 INFO:tasks.workunit.client.1.vm06.stdout:5/637: creat d8/db/d54/d8a/fc7 x:0 0 0 2026-03-10T06:22:30.890 INFO:tasks.workunit.client.1.vm06.stdout:5/638: read d8/db/d54/d8a/d39/f51 [2954298,48992] 0 2026-03-10T06:22:30.895 INFO:tasks.workunit.client.1.vm06.stdout:3/835: mknod d6/d1a/d5b/dbd/c11c 0 2026-03-10T06:22:30.899 INFO:tasks.workunit.client.1.vm06.stdout:0/858: dwrite d0/dd/d14/d18/d85/dcc/f5c [0,4194304] 0 2026-03-10T06:22:30.903 INFO:tasks.workunit.client.1.vm06.stdout:0/859: rename d0/dd/d14/d18/d85/dcc/f41 to d0/d3c/dc1/d3d/d50/d91/da7/f11d 0 2026-03-10T06:22:30.910 INFO:tasks.workunit.client.1.vm06.stdout:5/639: unlink d8/db/d54/d8a/c2d 0 2026-03-10T06:22:30.910 INFO:tasks.workunit.client.1.vm06.stdout:0/860: dwrite d0/dd/d14/d18/d85/dcc/d99/f10d [0,4194304] 0 2026-03-10T06:22:30.912 INFO:tasks.workunit.client.1.vm06.stdout:6/892: sync 2026-03-10T06:22:30.913 INFO:tasks.workunit.client.1.vm06.stdout:1/899: write d9/d1b/d20/f30 [1251938,51041] 0 2026-03-10T06:22:30.923 
INFO:tasks.workunit.client.1.vm06.stdout:6/893: truncate d6/dd/dc7/f128 982641 0 2026-03-10T06:22:30.923 INFO:tasks.workunit.client.1.vm06.stdout:4/867: dwrite dd/f14 [0,4194304] 0 2026-03-10T06:22:30.923 INFO:tasks.workunit.client.1.vm06.stdout:4/868: chown dd/f5c 239447 1 2026-03-10T06:22:30.924 INFO:tasks.workunit.client.1.vm06.stdout:0/861: creat d0/d3c/dc1/d7d/d10e/f11e x:0 0 0 2026-03-10T06:22:30.931 INFO:tasks.workunit.client.1.vm06.stdout:4/869: rmdir dd/d24/d9c 39 2026-03-10T06:22:30.931 INFO:tasks.workunit.client.1.vm06.stdout:3/836: getdents d6/dc/d41 0 2026-03-10T06:22:30.933 INFO:tasks.workunit.client.1.vm06.stdout:2/708: dread da/d13/d1c/d7d/f81 [0,4194304] 0 2026-03-10T06:22:30.936 INFO:tasks.workunit.client.1.vm06.stdout:3/837: mkdir d6/dc/de5/d11d 0 2026-03-10T06:22:30.936 INFO:tasks.workunit.client.1.vm06.stdout:4/870: chown dd/d33/d36/la2 27058 1 2026-03-10T06:22:30.937 INFO:tasks.workunit.client.1.vm06.stdout:3/838: fdatasync d6/d8/f97 0 2026-03-10T06:22:30.938 INFO:tasks.workunit.client.1.vm06.stdout:9/845: dread d21/da2/de6/fe5 [0,4194304] 0 2026-03-10T06:22:30.940 INFO:tasks.workunit.client.1.vm06.stdout:4/871: chown dd/d18/ce8 22 1 2026-03-10T06:22:30.951 INFO:tasks.workunit.client.1.vm06.stdout:4/872: symlink dd/d33/d47/lfb 0 2026-03-10T06:22:30.951 INFO:tasks.workunit.client.1.vm06.stdout:2/709: link da/d13/d1a/dc7/daf/d56/l72 da/da8/le5 0 2026-03-10T06:22:30.954 INFO:tasks.workunit.client.1.vm06.stdout:4/873: mknod dd/d18/d8e/daa/cfc 0 2026-03-10T06:22:30.957 INFO:tasks.workunit.client.1.vm06.stdout:5/640: dread d8/db/d54/d8a/d74/f37 [0,4194304] 0 2026-03-10T06:22:30.957 INFO:tasks.workunit.client.1.vm06.stdout:9/846: getdents d21/d27/d56 0 2026-03-10T06:22:30.957 INFO:tasks.workunit.client.1.vm06.stdout:4/874: mkdir dd/d18/d75/dfd 0 2026-03-10T06:22:30.961 INFO:tasks.workunit.client.1.vm06.stdout:9/847: creat d21/d27/d50/d57/dcd/de4/f115 x:0 0 0 2026-03-10T06:22:30.964 INFO:tasks.workunit.client.1.vm06.stdout:7/880: write d19/d3b/d41/fb2 
[45117,114936] 0 2026-03-10T06:22:30.968 INFO:tasks.workunit.client.1.vm06.stdout:9/848: write d21/d27/d50/d57/db2/f8d [1120284,45573] 0 2026-03-10T06:22:30.968 INFO:tasks.workunit.client.1.vm06.stdout:9/849: chown d21/d32/d4d/d51/db0/fe3 8250819 1 2026-03-10T06:22:30.968 INFO:tasks.workunit.client.1.vm06.stdout:4/875: getdents dd/d24/d2d/d2f/d34/d40 0 2026-03-10T06:22:30.972 INFO:tasks.workunit.client.1.vm06.stdout:7/881: truncate d19/d3b/d41/d4c/f55 971532 0 2026-03-10T06:22:30.974 INFO:tasks.workunit.client.1.vm06.stdout:9/850: dwrite d21/d32/d4d/dd2/ff1 [0,4194304] 0 2026-03-10T06:22:30.981 INFO:tasks.workunit.client.1.vm06.stdout:7/882: creat d19/d3b/d41/d42/f125 x:0 0 0 2026-03-10T06:22:30.985 INFO:tasks.workunit.client.1.vm06.stdout:9/851: getdents d21/d27/d50/d57/db2/d80/d95/d9b/dd0/d10a 0 2026-03-10T06:22:30.985 INFO:tasks.workunit.client.1.vm06.stdout:7/883: write d19/d3b/d5b/fa4 [1460466,81647] 0 2026-03-10T06:22:30.985 INFO:tasks.workunit.client.1.vm06.stdout:4/876: dwrite dd/d18/d75/f91 [0,4194304] 0 2026-03-10T06:22:30.990 INFO:tasks.workunit.client.1.vm06.stdout:9/852: chown d21/da2/de6/fe5 0 1 2026-03-10T06:22:30.992 INFO:tasks.workunit.client.1.vm06.stdout:7/884: truncate d19/db0/f100 441347 0 2026-03-10T06:22:30.992 INFO:tasks.workunit.client.1.vm06.stdout:7/885: write d19/f3f [3281912,120242] 0 2026-03-10T06:22:30.995 INFO:tasks.workunit.client.1.vm06.stdout:4/877: fdatasync dd/d33/f3f 0 2026-03-10T06:22:30.995 INFO:tasks.workunit.client.1.vm06.stdout:7/886: write d19/d3b/d41/d42/d52/d83/fd8 [1430957,28719] 0 2026-03-10T06:22:30.997 INFO:tasks.workunit.client.1.vm06.stdout:8/706: dread d1/f1c [0,4194304] 0 2026-03-10T06:22:31.004 INFO:tasks.workunit.client.1.vm06.stdout:4/878: creat dd/d33/de9/ffe x:0 0 0 2026-03-10T06:22:31.006 INFO:tasks.workunit.client.1.vm06.stdout:4/879: rename fa to dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66/fff 0 2026-03-10T06:22:31.010 INFO:tasks.workunit.client.1.vm06.stdout:7/887: getdents d19/d3b/d41/da9/dbd/dd2 0 
2026-03-10T06:22:31.010 INFO:tasks.workunit.client.1.vm06.stdout:7/888: rmdir d19/d3b/d41/d42/d52/d83 39 2026-03-10T06:22:31.010 INFO:tasks.workunit.client.1.vm06.stdout:7/889: readlink d19/d3b/d41/d42/d52/d9f/dc2/ldb 0 2026-03-10T06:22:31.014 INFO:tasks.workunit.client.1.vm06.stdout:7/890: truncate d19/f33 143912 0 2026-03-10T06:22:31.020 INFO:tasks.workunit.client.1.vm06.stdout:7/891: dwrite d19/d3b/d41/fb2 [0,4194304] 0 2026-03-10T06:22:31.020 INFO:tasks.workunit.client.1.vm06.stdout:8/707: read d1/df/d20/d21/f37 [8073363,90578] 0 2026-03-10T06:22:31.022 INFO:tasks.workunit.client.1.vm06.stdout:8/708: read - d1/df/d20/d21/f82 zero size 2026-03-10T06:22:31.029 INFO:tasks.workunit.client.1.vm06.stdout:7/892: dwrite d19/d3b/d41/d42/fd4 [4194304,4194304] 0 2026-03-10T06:22:31.032 INFO:tasks.workunit.client.1.vm06.stdout:7/893: mkdir d19/d3b/dde/d126 0 2026-03-10T06:22:31.033 INFO:tasks.workunit.client.1.vm06.stdout:5/641: read d8/db/d54/d67/d46/f98 [1233813,2742] 0 2026-03-10T06:22:31.034 INFO:tasks.workunit.client.1.vm06.stdout:7/894: creat d19/d3b/d41/d4c/f127 x:0 0 0 2026-03-10T06:22:31.036 INFO:tasks.workunit.client.1.vm06.stdout:5/642: truncate d8/db/d57/d83/fa6 957958 0 2026-03-10T06:22:31.037 INFO:tasks.workunit.client.1.vm06.stdout:7/895: rename d19 to d19/d3b/d41/d42/d52/d83/d128 22 2026-03-10T06:22:31.039 INFO:tasks.workunit.client.1.vm06.stdout:5/643: write d8/db/d54/d8a/d39/f69 [4899311,41362] 0 2026-03-10T06:22:31.043 INFO:tasks.workunit.client.1.vm06.stdout:7/896: dread - d19/d3b/d41/d42/d52/fc5 zero size 2026-03-10T06:22:31.111 INFO:tasks.workunit.client.1.vm06.stdout:1/900: write d9/d35/d46/d38/dc6/dd4/fe4 [2580832,8013] 0 2026-03-10T06:22:31.111 INFO:tasks.workunit.client.1.vm06.stdout:6/894: write d6/f6a [389569,54168] 0 2026-03-10T06:22:31.112 INFO:tasks.workunit.client.1.vm06.stdout:0/862: truncate d0/dd/d14/d18/d85/dcc/f54 3404449 0 2026-03-10T06:22:31.113 INFO:tasks.workunit.client.1.vm06.stdout:0/863: fsync d0/d3c/dc1/f2f 0 
2026-03-10T06:22:31.116 INFO:tasks.workunit.client.1.vm06.stdout:1/901: mkdir d9/d35/d46/d38/d63/d83/dc5/d108 0 2026-03-10T06:22:31.120 INFO:tasks.workunit.client.1.vm06.stdout:3/839: dwrite d6/dc/d13/fca [0,4194304] 0 2026-03-10T06:22:31.125 INFO:tasks.workunit.client.1.vm06.stdout:2/710: dwrite da/d13/d1c/d7d/ddf/f67 [0,4194304] 0 2026-03-10T06:22:31.132 INFO:tasks.workunit.client.1.vm06.stdout:2/711: truncate da/d13/d1c/d7d/ddf/f65 1373751 0 2026-03-10T06:22:31.133 INFO:tasks.workunit.client.1.vm06.stdout:6/895: creat d6/d7/f132 x:0 0 0 2026-03-10T06:22:31.134 INFO:tasks.workunit.client.1.vm06.stdout:2/712: chown da/d13/d1a/d39/d35/f74 32103 1 2026-03-10T06:22:31.136 INFO:tasks.workunit.client.1.vm06.stdout:9/853: dwrite d21/d27/d50/d57/db2/d80/d95/fc4 [0,4194304] 0 2026-03-10T06:22:31.136 INFO:tasks.workunit.client.1.vm06.stdout:6/896: creat d6/d7/d37/d43/f133 x:0 0 0 2026-03-10T06:22:31.140 INFO:tasks.workunit.client.1.vm06.stdout:6/897: rmdir d6/dd/dc2 39 2026-03-10T06:22:31.144 INFO:tasks.workunit.client.1.vm06.stdout:6/898: creat d6/d79/f134 x:0 0 0 2026-03-10T06:22:31.145 INFO:tasks.workunit.client.1.vm06.stdout:6/899: stat d6/dd/d35/dff/l120 0 2026-03-10T06:22:31.146 INFO:tasks.workunit.client.1.vm06.stdout:6/900: write d6/d79/d95/db4/dd4/fd5 [2540214,987] 0 2026-03-10T06:22:31.146 INFO:tasks.workunit.client.1.vm06.stdout:6/901: dread - d6/d79/f134 zero size 2026-03-10T06:22:31.148 INFO:tasks.workunit.client.1.vm06.stdout:6/902: write d6/dd/f114 [4577167,79029] 0 2026-03-10T06:22:31.150 INFO:tasks.workunit.client.1.vm06.stdout:9/854: dwrite d21/d27/d50/d57/db2/d80/d95/d9b/fcc [0,4194304] 0 2026-03-10T06:22:31.154 INFO:tasks.workunit.client.1.vm06.stdout:6/903: dread - d6/dd/d25/d33/d5a/dae/f104 zero size 2026-03-10T06:22:31.154 INFO:tasks.workunit.client.1.vm06.stdout:6/904: fsync d6/dd/f96 0 2026-03-10T06:22:31.160 INFO:tasks.workunit.client.1.vm06.stdout:9/855: dread d21/d27/fb3 [0,4194304] 0 2026-03-10T06:22:31.166 
INFO:tasks.workunit.client.1.vm06.stdout:9/856: chown d21/d32/la4 1 1 2026-03-10T06:22:31.166 INFO:tasks.workunit.client.1.vm06.stdout:6/905: rename d6/d79/d95/db4/dd4/df5/df3 to d6/dd/dc2/d10d/db9/d135 0 2026-03-10T06:22:31.170 INFO:tasks.workunit.client.1.vm06.stdout:6/906: rename d6/dd/d25/fa0 to d6/dd/d25/d2c/f136 0 2026-03-10T06:22:31.184 INFO:tasks.workunit.client.1.vm06.stdout:4/880: dwrite dd/d18/f5f [0,4194304] 0 2026-03-10T06:22:31.192 INFO:tasks.workunit.client.1.vm06.stdout:5/644: truncate d8/ff 441053 0 2026-03-10T06:22:31.192 INFO:tasks.workunit.client.1.vm06.stdout:7/897: write d19/d3b/d41/d42/d10b/d11c/d97/fb3 [442753,51971] 0 2026-03-10T06:22:31.193 INFO:tasks.workunit.client.1.vm06.stdout:7/898: chown d19/d3b/d41/d42/d62/d80/ffa 16119191 1 2026-03-10T06:22:31.194 INFO:tasks.workunit.client.1.vm06.stdout:4/881: creat dd/d33/d47/d97/db6/dbb/de2/f100 x:0 0 0 2026-03-10T06:22:31.195 INFO:tasks.workunit.client.1.vm06.stdout:7/899: chown d19/d3b/d41/d42/d62/d80/da1/f107 1 1 2026-03-10T06:22:31.195 INFO:tasks.workunit.client.1.vm06.stdout:4/882: chown dd/d41/c57 1964757 1 2026-03-10T06:22:31.218 INFO:tasks.workunit.client.1.vm06.stdout:0/864: write d0/dd/d14/d18/d85/dcc/d99/fc7 [713631,97315] 0 2026-03-10T06:22:31.220 INFO:tasks.workunit.client.1.vm06.stdout:0/865: write d0/dd/d14/d1d/d5d/dca/dd8/f10a [476037,115033] 0 2026-03-10T06:22:31.221 INFO:tasks.workunit.client.1.vm06.stdout:0/866: chown d0/dd/d14/d1d 1977 1 2026-03-10T06:22:31.225 INFO:tasks.workunit.client.1.vm06.stdout:3/840: dwrite d6/dc/d13/fc1 [0,4194304] 0 2026-03-10T06:22:31.229 INFO:tasks.workunit.client.1.vm06.stdout:5/645: mkdir d8/db/d54/d8a/d39/d6c/dc8 0 2026-03-10T06:22:31.233 INFO:tasks.workunit.client.1.vm06.stdout:0/867: mkdir d0/dd/d1c/da2/d11f 0 2026-03-10T06:22:31.235 INFO:tasks.workunit.client.1.vm06.stdout:0/868: mknod d0/dd/d1c/da2/d11f/c120 0 2026-03-10T06:22:31.237 INFO:tasks.workunit.client.1.vm06.stdout:5/646: creat d8/db/d54/fc9 x:0 0 0 2026-03-10T06:22:31.246 
INFO:tasks.workunit.client.1.vm06.stdout:5/647: symlink d8/db/lca 0 2026-03-10T06:22:31.280 INFO:tasks.workunit.client.1.vm06.stdout:2/713: dwrite da/d13/d5e/f9e [0,4194304] 0 2026-03-10T06:22:31.283 INFO:tasks.workunit.client.1.vm06.stdout:6/907: rename d6/dd/dc2/d10d to d6/d79/d95/d137 0 2026-03-10T06:22:31.285 INFO:tasks.workunit.client.1.vm06.stdout:2/714: dread - da/d13/d1c/d43/f91 zero size 2026-03-10T06:22:31.290 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:31 vm06.local ceph-mon[58974]: pgmap v14: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 67 MiB/s rd, 183 MiB/s wr, 402 op/s 2026-03-10T06:22:31.294 INFO:tasks.workunit.client.1.vm06.stdout:6/908: creat d6/dd/d25/d33/d5a/dae/f138 x:0 0 0 2026-03-10T06:22:31.294 INFO:tasks.workunit.client.1.vm06.stdout:6/909: readlink d6/df/d70/l129 0 2026-03-10T06:22:31.294 INFO:tasks.workunit.client.1.vm06.stdout:2/715: mknod da/ce6 0 2026-03-10T06:22:31.295 INFO:tasks.workunit.client.1.vm06.stdout:3/841: rename d6/d21/la2 to d6/dc/d13/d35/d101/dd0/dd1/d90/l11e 0 2026-03-10T06:22:31.305 INFO:tasks.workunit.client.1.vm06.stdout:2/716: dwrite da/d13/d1a/dc7/daf/d56/db9/fb6 [0,4194304] 0 2026-03-10T06:22:31.322 INFO:tasks.workunit.client.1.vm06.stdout:9/857: write d21/da2/da7/d93/dda/f7c [1243696,87376] 0 2026-03-10T06:22:31.337 INFO:tasks.workunit.client.1.vm06.stdout:2/717: mkdir da/d13/d1a/dc7/daf/d56/de7 0 2026-03-10T06:22:31.337 INFO:tasks.workunit.client.1.vm06.stdout:6/910: symlink d6/d79/d95/d11e/l139 0 2026-03-10T06:22:31.339 INFO:tasks.workunit.client.1.vm06.stdout:9/858: symlink d21/d27/d56/dc0/l116 0 2026-03-10T06:22:31.340 INFO:tasks.workunit.client.1.vm06.stdout:2/718: creat da/d13/d1c/d7d/fe8 x:0 0 0 2026-03-10T06:22:31.341 INFO:tasks.workunit.client.1.vm06.stdout:2/719: fsync da/d13/d1c/d1d/fbd 0 2026-03-10T06:22:31.341 INFO:tasks.workunit.client.1.vm06.stdout:3/842: link d6/dc/d13/d9d/d54/cd8 d6/dc/d41/c11f 0 2026-03-10T06:22:31.342 
INFO:tasks.workunit.client.1.vm06.stdout:3/843: fsync d6/dc/d13/d9d/d54/fcc 0 2026-03-10T06:22:31.349 INFO:tasks.workunit.client.1.vm06.stdout:6/911: dread d6/dd/d25/f69 [0,4194304] 0 2026-03-10T06:22:31.354 INFO:tasks.workunit.client.1.vm06.stdout:2/720: creat da/d13/d1c/d1d/fe9 x:0 0 0 2026-03-10T06:22:31.355 INFO:tasks.workunit.client.1.vm06.stdout:6/912: link d6/d7/f132 d6/d79/d95/d137/dd0/f13a 0 2026-03-10T06:22:31.358 INFO:tasks.workunit.client.1.vm06.stdout:1/902: dread d9/d35/d46/d38/d8c/f9a [0,4194304] 0 2026-03-10T06:22:31.359 INFO:tasks.workunit.client.1.vm06.stdout:6/913: creat d6/d79/d95/d137/dd0/dec/f13b x:0 0 0 2026-03-10T06:22:31.362 INFO:tasks.workunit.client.1.vm06.stdout:8/709: write d1/d3b/d5c/f6f [1300799,113233] 0 2026-03-10T06:22:31.364 INFO:tasks.workunit.client.1.vm06.stdout:1/903: sync 2026-03-10T06:22:31.367 INFO:tasks.workunit.client.1.vm06.stdout:9/859: sync 2026-03-10T06:22:31.368 INFO:tasks.workunit.client.1.vm06.stdout:6/914: mknod d6/d79/d95/c13c 0 2026-03-10T06:22:31.369 INFO:tasks.workunit.client.1.vm06.stdout:2/721: getdents da/d13/d5e 0 2026-03-10T06:22:31.378 INFO:tasks.workunit.client.1.vm06.stdout:6/915: rmdir d6/dd/d25/d33/d5a/dd8 39 2026-03-10T06:22:31.378 INFO:tasks.workunit.client.1.vm06.stdout:2/722: unlink da/d13/d1a/d39/d35/f4a 0 2026-03-10T06:22:31.385 INFO:tasks.workunit.client.1.vm06.stdout:4/883: dwrite dd/d18/d8e/daa/ff3 [0,4194304] 0 2026-03-10T06:22:31.393 INFO:tasks.workunit.client.1.vm06.stdout:8/710: link d1/df/d11/f1d d1/df/d11/da1/dd2/fe5 0 2026-03-10T06:22:31.394 INFO:tasks.workunit.client.1.vm06.stdout:0/869: write d0/dd/f49 [691151,77251] 0 2026-03-10T06:22:31.399 INFO:tasks.workunit.client.1.vm06.stdout:7/900: dwrite d19/d3b/d5b/f81 [0,4194304] 0 2026-03-10T06:22:31.404 INFO:tasks.workunit.client.1.vm06.stdout:7/901: write d19/d3b/d41/d42/d62/d80/ffa [1325220,51665] 0 2026-03-10T06:22:31.404 INFO:tasks.workunit.client.1.vm06.stdout:7/902: chown d19/d3b/dde/fe5 1 1 2026-03-10T06:22:31.407 
INFO:tasks.workunit.client.1.vm06.stdout:4/884: creat dd/d24/d2d/d2f/d39/d71/dc3/dd0/f101 x:0 0 0 2026-03-10T06:22:31.408 INFO:tasks.workunit.client.1.vm06.stdout:2/723: getdents da/d13/d5e 0 2026-03-10T06:22:31.408 INFO:tasks.workunit.client.1.vm06.stdout:4/885: stat dd/d24/d2d/d2f/f82 0 2026-03-10T06:22:31.409 INFO:tasks.workunit.client.1.vm06.stdout:8/711: mkdir d1/d3b/d5c/de6 0 2026-03-10T06:22:31.409 INFO:tasks.workunit.client.1.vm06.stdout:6/916: dwrite d6/d79/d95/db4/f110 [0,4194304] 0 2026-03-10T06:22:31.410 INFO:tasks.workunit.client.1.vm06.stdout:7/903: dwrite f4 [4194304,4194304] 0 2026-03-10T06:22:31.411 INFO:tasks.workunit.client.1.vm06.stdout:0/870: mknod d0/dd/d14/c121 0 2026-03-10T06:22:31.418 INFO:tasks.workunit.client.1.vm06.stdout:1/904: dread d9/d62/f87 [0,4194304] 0 2026-03-10T06:22:31.422 INFO:tasks.workunit.client.1.vm06.stdout:5/648: write d8/db/d54/d8a/d74/fb3 [88128,72037] 0 2026-03-10T06:22:31.427 INFO:tasks.workunit.client.1.vm06.stdout:5/649: dwrite d8/db/d54/d8a/d74/fb3 [0,4194304] 0 2026-03-10T06:22:31.458 INFO:tasks.workunit.client.1.vm06.stdout:4/886: rename dd/d24/d2d/d2f/d39/d71/dc3/dd0/le4 to dd/d33/l102 0 2026-03-10T06:22:31.458 INFO:tasks.workunit.client.1.vm06.stdout:6/917: creat d6/dd/d25/d33/d4d/f13d x:0 0 0 2026-03-10T06:22:31.458 INFO:tasks.workunit.client.1.vm06.stdout:8/712: creat d1/df/d20/d21/fe7 x:0 0 0 2026-03-10T06:22:31.458 INFO:tasks.workunit.client.1.vm06.stdout:5/650: chown d8/db/c19 3 1 2026-03-10T06:22:31.459 INFO:tasks.workunit.client.1.vm06.stdout:2/724: getdents da/d13/d1c 0 2026-03-10T06:22:31.460 INFO:tasks.workunit.client.1.vm06.stdout:4/887: readlink dd/d18/l1a 0 2026-03-10T06:22:31.461 INFO:tasks.workunit.client.1.vm06.stdout:5/651: readlink d8/db/d54/d8a/d74/d90/lad 0 2026-03-10T06:22:31.461 INFO:tasks.workunit.client.1.vm06.stdout:8/713: write d1/df/d58/f86 [1548021,119587] 0 2026-03-10T06:22:31.462 INFO:tasks.workunit.client.1.vm06.stdout:8/714: dread - d1/df/d20/d21/d5e/d79/fda zero size 
2026-03-10T06:22:31.466 INFO:tasks.workunit.client.1.vm06.stdout:6/918: creat d6/dd/d25/d2c/d12c/f13e x:0 0 0 2026-03-10T06:22:31.469 INFO:tasks.workunit.client.1.vm06.stdout:2/725: mkdir da/dea 0 2026-03-10T06:22:31.469 INFO:tasks.workunit.client.1.vm06.stdout:2/726: stat da/d13/d1c/d7d/ddf/d61/d68/de0 0 2026-03-10T06:22:31.470 INFO:tasks.workunit.client.1.vm06.stdout:2/727: stat da/d13/d1a/d39/d35/l7f 0 2026-03-10T06:22:31.470 INFO:tasks.workunit.client.1.vm06.stdout:8/715: symlink d1/df/d20/d35/dac/le8 0 2026-03-10T06:22:31.470 INFO:tasks.workunit.client.1.vm06.stdout:8/716: chown d1/d3b/da9/fd8 111375549 1 2026-03-10T06:22:31.475 INFO:tasks.workunit.client.1.vm06.stdout:5/652: symlink d8/db/d54/d8a/d39/lcb 0 2026-03-10T06:22:31.482 INFO:tasks.workunit.client.1.vm06.stdout:1/905: dread d9/d35/d46/d38/d63/d83/fa1 [0,4194304] 0 2026-03-10T06:22:31.482 INFO:tasks.workunit.client.1.vm06.stdout:1/906: stat d9/d35/d46/d38/dc6/dd4/fe4 0 2026-03-10T06:22:31.482 INFO:tasks.workunit.client.1.vm06.stdout:8/717: fdatasync d1/d2c/f8a 0 2026-03-10T06:22:31.482 INFO:tasks.workunit.client.1.vm06.stdout:4/888: dread dd/d24/d2d/d2f/d39/f61 [0,4194304] 0 2026-03-10T06:22:31.485 INFO:tasks.workunit.client.1.vm06.stdout:6/919: link d6/dd/d25/d2c/f9c d6/df/d40/d99/f13f 0 2026-03-10T06:22:31.487 INFO:tasks.workunit.client.1.vm06.stdout:0/871: dread d0/f61 [0,4194304] 0 2026-03-10T06:22:31.487 INFO:tasks.workunit.client.1.vm06.stdout:5/653: stat d8/db/d54/d8a/d39/d72 0 2026-03-10T06:22:31.491 INFO:tasks.workunit.client.1.vm06.stdout:1/907: rename d9/d1b/fcb to d9/d35/d46/d38/d63/dd6/de8/f109 0 2026-03-10T06:22:31.492 INFO:tasks.workunit.client.1.vm06.stdout:8/718: stat d1/df/f6b 0 2026-03-10T06:22:31.492 INFO:tasks.workunit.client.1.vm06.stdout:3/844: write d6/f63 [445413,27020] 0 2026-03-10T06:22:31.494 INFO:tasks.workunit.client.1.vm06.stdout:6/920: creat d6/dd/d25/d33/d5a/df1/f140 x:0 0 0 2026-03-10T06:22:31.494 INFO:tasks.workunit.client.1.vm06.stdout:1/908: fsync d9/f86 0 
2026-03-10T06:22:31.495 INFO:tasks.workunit.client.1.vm06.stdout:4/889: rename dd/d24/d2d/d2f/d34/d40/l46 to dd/d33/d36/l103 0 2026-03-10T06:22:31.495 INFO:tasks.workunit.client.1.vm06.stdout:1/909: readlink d9/d35/d46/db0/lba 0 2026-03-10T06:22:31.497 INFO:tasks.workunit.client.1.vm06.stdout:3/845: read d6/dc/d13/d9d/d54/fea [3762842,52415] 0 2026-03-10T06:22:31.497 INFO:tasks.workunit.client.1.vm06.stdout:3/846: fsync d6/dc/d13/fca 0 2026-03-10T06:22:31.498 INFO:tasks.workunit.client.1.vm06.stdout:8/719: creat d1/df/d58/db5/fe9 x:0 0 0 2026-03-10T06:22:31.498 INFO:tasks.workunit.client.1.vm06.stdout:8/720: chown d1/d3b/da9/dab/fb2 71 1 2026-03-10T06:22:31.500 INFO:tasks.workunit.client.1.vm06.stdout:8/721: chown d1/df/fd0 377902 1 2026-03-10T06:22:31.500 INFO:tasks.workunit.client.1.vm06.stdout:8/722: chown d1/d7/l1a 7 1 2026-03-10T06:22:31.502 INFO:tasks.workunit.client.1.vm06.stdout:5/654: truncate d8/db/d54/d67/d46/d68/f73 6953213 0 2026-03-10T06:22:31.502 INFO:tasks.workunit.client.1.vm06.stdout:3/847: creat d6/d8/d7f/f120 x:0 0 0 2026-03-10T06:22:31.502 INFO:tasks.workunit.client.1.vm06.stdout:8/723: creat d1/df/d58/db5/fea x:0 0 0 2026-03-10T06:22:31.503 INFO:tasks.workunit.client.1.vm06.stdout:3/848: stat d6/dc/d72 0 2026-03-10T06:22:31.503 INFO:tasks.workunit.client.1.vm06.stdout:1/910: sync 2026-03-10T06:22:31.504 INFO:tasks.workunit.client.1.vm06.stdout:4/890: truncate dd/f12 2059392 0 2026-03-10T06:22:31.505 INFO:tasks.workunit.client.1.vm06.stdout:9/860: write d21/d32/f73 [384344,94] 0 2026-03-10T06:22:31.505 INFO:tasks.workunit.client.1.vm06.stdout:1/911: sync 2026-03-10T06:22:31.505 INFO:tasks.workunit.client.1.vm06.stdout:4/891: fdatasync f0 0 2026-03-10T06:22:31.509 INFO:tasks.workunit.client.1.vm06.stdout:7/904: dread d19/d3b/d41/da9/da5/fa6 [0,4194304] 0 2026-03-10T06:22:31.510 INFO:tasks.workunit.client.1.vm06.stdout:1/912: truncate d9/d35/d46/d38/d63/d83/d93/fe5 4728905 0 2026-03-10T06:22:31.514 INFO:tasks.workunit.client.1.vm06.stdout:3/849: 
creat d6/d8/d7f/da1/f121 x:0 0 0 2026-03-10T06:22:31.516 INFO:tasks.workunit.client.1.vm06.stdout:4/892: symlink dd/d72/l104 0 2026-03-10T06:22:31.522 INFO:tasks.workunit.client.1.vm06.stdout:4/893: chown dd/d24/d2d/d2f/d39/c3e 20 1 2026-03-10T06:22:31.523 INFO:tasks.workunit.client.1.vm06.stdout:4/894: readlink dd/d33/l7b 0 2026-03-10T06:22:31.524 INFO:tasks.workunit.client.1.vm06.stdout:4/895: stat dd/d33/f70 0 2026-03-10T06:22:31.525 INFO:tasks.workunit.client.1.vm06.stdout:9/861: creat d21/d27/d50/d57/dcd/de4/f117 x:0 0 0 2026-03-10T06:22:31.526 INFO:tasks.workunit.client.1.vm06.stdout:6/921: link d6/d79/d95/d137/dd0/l11d d6/d7/l141 0 2026-03-10T06:22:31.527 INFO:tasks.workunit.client.1.vm06.stdout:1/913: write d9/d62/f87 [3637854,106129] 0 2026-03-10T06:22:31.527 INFO:tasks.workunit.client.1.vm06.stdout:3/850: creat d6/dc/d13/d35/d101/d11b/f122 x:0 0 0 2026-03-10T06:22:31.527 INFO:tasks.workunit.client.1.vm06.stdout:9/862: write d21/d32/d4d/d51/db0/f10e [542692,130545] 0 2026-03-10T06:22:31.530 INFO:tasks.workunit.client.1.vm06.stdout:7/905: rename d19/d3b/d41/d42/d10b to d19/d3b/d41/d42/d52/d83/d9d/da8/d129 0 2026-03-10T06:22:31.532 INFO:tasks.workunit.client.1.vm06.stdout:5/655: rmdir d8/db/d54/d8a/d39/d6c/dc8 0 2026-03-10T06:22:31.535 INFO:tasks.workunit.client.1.vm06.stdout:4/896: symlink dd/d72/l105 0 2026-03-10T06:22:31.535 INFO:tasks.workunit.client.1.vm06.stdout:5/656: readlink d8/db/d54/d67/d46/d6e/l93 0 2026-03-10T06:22:31.536 INFO:tasks.workunit.client.1.vm06.stdout:6/922: fsync d6/d79/d95/d137/ffa 0 2026-03-10T06:22:31.536 INFO:tasks.workunit.client.1.vm06.stdout:1/914: fsync d9/d35/f57 0 2026-03-10T06:22:31.541 INFO:tasks.workunit.client.1.vm06.stdout:8/724: getdents d1/df/d11 0 2026-03-10T06:22:31.541 INFO:tasks.workunit.client.1.vm06.stdout:6/923: chown d6/dd/d25/d33/d5a/df1/f140 1948675442 1 2026-03-10T06:22:31.541 INFO:tasks.workunit.client.1.vm06.stdout:8/725: chown d1/fa 4099451 1 2026-03-10T06:22:31.543 
INFO:tasks.workunit.client.1.vm06.stdout:6/924: write d6/dd/d25/d33/d5a/dae/ff6 [663289,99605] 0 2026-03-10T06:22:31.546 INFO:tasks.workunit.client.1.vm06.stdout:3/851: creat d6/dc/d13/d9d/d100/f123 x:0 0 0 2026-03-10T06:22:31.548 INFO:tasks.workunit.client.1.vm06.stdout:0/872: dwrite d0/dd/d14/d18/d85/dcc/d88/fa0 [0,4194304] 0 2026-03-10T06:22:31.550 INFO:tasks.workunit.client.1.vm06.stdout:1/915: dread d9/d35/d46/d38/dc6/dd4/fe4 [0,4194304] 0 2026-03-10T06:22:31.554 INFO:tasks.workunit.client.1.vm06.stdout:8/726: rmdir d1/d2c/d99/dc0 39 2026-03-10T06:22:31.554 INFO:tasks.workunit.client.1.vm06.stdout:5/657: mknod d8/db/d54/d8a/d39/ccc 0 2026-03-10T06:22:31.557 INFO:tasks.workunit.client.1.vm06.stdout:3/852: unlink d6/f25 0 2026-03-10T06:22:31.557 INFO:tasks.workunit.client.1.vm06.stdout:9/863: link d21/da2/da7/d93/dda/f6a d21/d32/d4d/dd2/f118 0 2026-03-10T06:22:31.557 INFO:tasks.workunit.client.1.vm06.stdout:0/873: creat d0/dd/d14/d18/f122 x:0 0 0 2026-03-10T06:22:31.558 INFO:tasks.workunit.client.1.vm06.stdout:6/925: symlink d6/l142 0 2026-03-10T06:22:31.558 INFO:tasks.workunit.client.1.vm06.stdout:4/897: dread dd/d24/d2d/d2f/d39/f62 [0,4194304] 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:6/926: chown d6/d79/d95/db4 51 1 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:5/658: creat d8/db/d54/d8a/d74/d90/fcd x:0 0 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:7/906: getdents d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d104 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:1/916: mkdir d9/d1b/d20/d103/d10a 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:0/874: readlink d0/dd/d14/d18/d85/dcc/db7/l10c 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:7/907: rmdir d19/db0 39 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:1/917: rmdir d9/d62/dc7 39 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:7/908: dread - 
d19/d3b/d41/d42/d52/d83/d9d/da8/df4/f11e zero size 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:3/853: link d6/dc/d13/d35/d101/dd0/dd1/f4c d6/dc/de5/f124 0 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:7/909: rmdir d19/d3b/d41/d42/d62/d80/d82 39 2026-03-10T06:22:31.569 INFO:tasks.workunit.client.1.vm06.stdout:0/875: chown d0/dd/d14/d18/d85/dcc/d88/d35/f51 711 1 2026-03-10T06:22:31.573 INFO:tasks.workunit.client.1.vm06.stdout:4/898: dwrite dd/d24/f45 [4194304,4194304] 0 2026-03-10T06:22:31.575 INFO:tasks.workunit.client.1.vm06.stdout:3/854: symlink d6/dc/d41/d6d/l125 0 2026-03-10T06:22:31.576 INFO:tasks.workunit.client.1.vm06.stdout:4/899: mknod dd/d18/d8e/daa/c106 0 2026-03-10T06:22:31.586 INFO:tasks.workunit.client.1.vm06.stdout:5/659: dwrite d8/db/d54/d8a/fc7 [0,4194304] 0 2026-03-10T06:22:31.590 INFO:tasks.workunit.client.1.vm06.stdout:0/876: getdents d0/dd/d14/d6b 0 2026-03-10T06:22:31.593 INFO:tasks.workunit.client.1.vm06.stdout:4/900: rename dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/d66 to dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107 0 2026-03-10T06:22:31.595 INFO:tasks.workunit.client.1.vm06.stdout:1/918: dread d9/d1b/d20/f24 [0,4194304] 0 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:4/901: symlink dd/d33/da6/l108 0 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:7/910: dwrite d19/d3b/d41/d42/d62/fef [0,4194304] 0 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:5/660: getdents d8/db/d54/d67 0 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:4/902: dread - dd/d33/d47/d97/db6/dbb/de2/f100 zero size 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:5/661: creat d8/db/d54/d67/d46/d68/dc1/fce x:0 0 0 2026-03-10T06:22:31.603 INFO:tasks.workunit.client.1.vm06.stdout:1/919: rename d9/d35/d46/d38/d63/d83/d93/cfb to d9/d35/d46/d38/d63/d83/dc5/d108/c10b 0 2026-03-10T06:22:31.606 INFO:tasks.workunit.client.1.vm06.stdout:7/911: truncate d19/d3b/f68 
4212128 0 2026-03-10T06:22:31.609 INFO:tasks.workunit.client.1.vm06.stdout:2/728: write f7 [900666,52731] 0 2026-03-10T06:22:31.612 INFO:tasks.workunit.client.1.vm06.stdout:2/729: write da/d13/d1a/f21 [179198,18944] 0 2026-03-10T06:22:31.614 INFO:tasks.workunit.client.1.vm06.stdout:7/912: getdents d19/d3b/d41/d42/d52/d83/d9d/da8/d129 0 2026-03-10T06:22:31.614 INFO:tasks.workunit.client.1.vm06.stdout:5/662: dwrite d8/db/d54/d8a/d74/d90/fb5 [0,4194304] 0 2026-03-10T06:22:31.621 INFO:tasks.workunit.client.1.vm06.stdout:7/913: creat d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d10d/f12a x:0 0 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/927: dread d6/d7/f2a [0,4194304] 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:7/914: dwrite d19/d3b/d41/d42/d62/f7c [0,4194304] 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/928: creat d6/d79/d95/d137/dd0/dc5/f143 x:0 0 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/929: mkdir d6/d79/d95/d144 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/930: fsync d6/d79/d95/db4/fbd 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:7/915: truncate d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d104/f113 329712 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/931: truncate d6/dd/d25/d2c/f32 789801 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/932: rename d6/dd/d25/f105 to d6/dd/d122/f145 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/933: write d6/dd/d25/d2c/f85 [1738443,105498] 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:7/916: dwrite d19/d3b/f7b [4194304,4194304] 0 2026-03-10T06:22:31.649 INFO:tasks.workunit.client.1.vm06.stdout:6/934: rename d6/d79/ca4 to d6/df/d40/d99/c146 0 2026-03-10T06:22:31.651 INFO:tasks.workunit.client.1.vm06.stdout:6/935: readlink d6/d79/d95/db4/dd4/df5/l125 0 2026-03-10T06:22:31.651 
INFO:tasks.workunit.client.1.vm06.stdout:7/917: symlink d19/db0/d116/l12b 0
2026-03-10T06:22:31.651 INFO:tasks.workunit.client.1.vm06.stdout:6/936: creat d6/d79/d95/d137/db9/f147 x:0 0 0
2026-03-10T06:22:31.654 INFO:tasks.workunit.client.1.vm06.stdout:6/937: dwrite d6/f6a [0,4194304] 0
2026-03-10T06:22:31.656 INFO:tasks.workunit.client.1.vm06.stdout:6/938: fdatasync d6/dd/d25/d4e/f83 0
2026-03-10T06:22:31.663 INFO:tasks.workunit.client.1.vm06.stdout:6/939: write d6/f62 [3261184,89442] 0
2026-03-10T06:22:31.668 INFO:tasks.workunit.client.1.vm06.stdout:6/940: write d6/dd/d25/d33/d5a/dae/ff6 [1773566,48945] 0
2026-03-10T06:22:31.669 INFO:tasks.workunit.client.1.vm06.stdout:6/941: truncate d6/d79/d95/d137/dd0/dec/f13b 63287 0
2026-03-10T06:22:31.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:31 vm04.local ceph-mon[51058]: pgmap v14: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 67 MiB/s rd, 183 MiB/s wr, 402 op/s
2026-03-10T06:22:31.720 INFO:tasks.workunit.client.1.vm06.stdout:8/727: sync
2026-03-10T06:22:31.720 INFO:tasks.workunit.client.1.vm06.stdout:8/728: fdatasync d1/d3b/db3/fcc 0
2026-03-10T06:22:31.725 INFO:tasks.workunit.client.1.vm06.stdout:8/729: creat d1/df/d11/da1/feb x:0 0 0
2026-03-10T06:22:31.726 INFO:tasks.workunit.client.1.vm06.stdout:8/730: read d1/d3b/da9/dab/fd4 [98383,48680] 0
2026-03-10T06:22:31.728 INFO:tasks.workunit.client.1.vm06.stdout:8/731: fsync d1/df/f6b 0
2026-03-10T06:22:31.731 INFO:tasks.workunit.client.1.vm06.stdout:8/732: rename d1/df/d11/l78 to d1/lec 0
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:8/733: read - d1/d3b/f49 zero size
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:8/734: truncate d1/df/d20/d35/f42 4357325 0
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:8/735: truncate d1/df/d20/f43 2116452 0
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:8/736: dwrite d1/d2c/f32 [0,4194304] 0
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:4/903: sync
2026-03-10T06:22:31.767 INFO:tasks.workunit.client.1.vm06.stdout:0/877: sync
2026-03-10T06:22:31.769 INFO:tasks.workunit.client.1.vm06.stdout:0/878: creat d0/d3c/f123 x:0 0 0
2026-03-10T06:22:32.094 INFO:tasks.workunit.client.1.vm06.stdout:7/918: dread d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d97/fa3 [0,4194304] 0
2026-03-10T06:22:32.095 INFO:tasks.workunit.client.1.vm06.stdout:7/919: readlink d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/l114 0
2026-03-10T06:22:32.097 INFO:tasks.workunit.client.1.vm06.stdout:7/920: getdents d19/d3b/d41/d42/d52 0
2026-03-10T06:22:32.099 INFO:tasks.workunit.client.1.vm06.stdout:7/921: creat d19/d3b/d41/da9/dbd/dd2/f12c x:0 0 0
2026-03-10T06:22:32.130 INFO:tasks.workunit.client.1.vm06.stdout:8/737: sync
2026-03-10T06:22:32.130 INFO:tasks.workunit.client.1.vm06.stdout:0/879: sync
2026-03-10T06:22:32.131 INFO:tasks.workunit.client.1.vm06.stdout:8/738: fdatasync d1/df/d20/d21/d7e/d8d/f95 0
2026-03-10T06:22:32.132 INFO:tasks.workunit.client.1.vm06.stdout:8/739: rename d1/df/d20/d35/dac/dbf to d1/df/d20/d35/dac/dbf/ded 22
2026-03-10T06:22:32.134 INFO:tasks.workunit.client.1.vm06.stdout:8/740: fdatasync d1/df/d20/d35/f42 0
2026-03-10T06:22:32.136 INFO:tasks.workunit.client.1.vm06.stdout:0/880: dread d0/dd/d1c/ff7 [0,4194304] 0
2026-03-10T06:22:32.137 INFO:tasks.workunit.client.1.vm06.stdout:0/881: mknod d0/dd/d14/d18/d7e/c124 0
2026-03-10T06:22:32.140 INFO:tasks.workunit.client.1.vm06.stdout:0/882: dread d0/dd/d14/d18/d85/dcc/f5c [0,4194304] 0
2026-03-10T06:22:32.149 INFO:tasks.workunit.client.1.vm06.stdout:0/883: dwrite d0/dd/d14/d18/d85/fe9 [0,4194304] 0
2026-03-10T06:22:32.150 INFO:tasks.workunit.client.1.vm06.stdout:0/884: readlink d0/d3c/dc1/d3d/l114 0
2026-03-10T06:22:32.157 INFO:tasks.workunit.client.1.vm06.stdout:0/885: creat d0/dd/d14/d1d/d5d/dca/f125 x:0 0 0
2026-03-10T06:22:32.159 INFO:tasks.workunit.client.1.vm06.stdout:0/886: dread d0/dd/d14/d18/d66/fcb [0,4194304] 0
2026-03-10T06:22:32.162 INFO:tasks.workunit.client.1.vm06.stdout:0/887: dwrite d0/dd/d1c/ff7 [0,4194304] 0
2026-03-10T06:22:32.168 INFO:tasks.workunit.client.1.vm06.stdout:0/888: creat d0/dd/d14/d18/d7e/f126 x:0 0 0
2026-03-10T06:22:32.301 INFO:tasks.workunit.client.1.vm06.stdout:9/864: dwrite d21/d27/d50/d57/f58 [0,4194304] 0
2026-03-10T06:22:32.329 INFO:tasks.workunit.client.1.vm06.stdout:3/855: write d6/dc/d13/d35/f95 [5521861,31878] 0
2026-03-10T06:22:32.334 INFO:tasks.workunit.client.1.vm06.stdout:3/856: symlink d6/dc/d13/d35/d101/dd0/dd1/l126 0
2026-03-10T06:22:32.341 INFO:tasks.workunit.client.1.vm06.stdout:4/904: rmdir dd/d24/d2d/d2f/d39/d71/dc3/dd0 39
2026-03-10T06:22:32.343 INFO:tasks.workunit.client.1.vm06.stdout:4/905: creat dd/d33/d47/d97/db6/dd7/f109 x:0 0 0
2026-03-10T06:22:32.344 INFO:tasks.workunit.client.1.vm06.stdout:4/906: rmdir dd/d18/d75 39
2026-03-10T06:22:32.358 INFO:tasks.workunit.client.1.vm06.stdout:1/920: dwrite d9/d35/d46/d38/d63/f84 [0,4194304] 0
2026-03-10T06:22:32.361 INFO:tasks.workunit.client.1.vm06.stdout:1/921: write d9/d62/f8a [711118,35512] 0
2026-03-10T06:22:32.363 INFO:tasks.workunit.client.1.vm06.stdout:1/922: unlink d9/d35/d89/c104 0
2026-03-10T06:22:32.365 INFO:tasks.workunit.client.1.vm06.stdout:1/923: creat d9/d1b/d20/d103/f10c x:0 0 0
2026-03-10T06:22:32.366 INFO:tasks.workunit.client.1.vm06.stdout:1/924: mknod d9/d35/d46/d38/dc6/c10d 0
2026-03-10T06:22:32.371 INFO:tasks.workunit.client.1.vm06.stdout:1/925: creat d9/d35/d46/d38/d63/d83/dc5/dd5/f10e x:0 0 0
2026-03-10T06:22:32.374 INFO:tasks.workunit.client.1.vm06.stdout:1/926: mknod d9/dd3/dbf/c10f 0
2026-03-10T06:22:32.376 INFO:tasks.workunit.client.1.vm06.stdout:1/927: stat d9/d35/d46/db0/lf8 0
2026-03-10T06:22:32.378 INFO:tasks.workunit.client.1.vm06.stdout:1/928: write d9/d35/d46/d38/d63/d83/d93/f9c [2248237,6434] 0
2026-03-10T06:22:32.386 INFO:tasks.workunit.client.1.vm06.stdout:1/929: dwrite d9/d35/d46/d38/d63/d83/dc5/dd5/fe1 [0,4194304] 0
2026-03-10T06:22:32.396 INFO:tasks.workunit.client.1.vm06.stdout:2/730: dwrite da/d13/d1c/d43/f7a [0,4194304] 0
2026-03-10T06:22:32.400 INFO:tasks.workunit.client.1.vm06.stdout:2/731: rename da/d13/d1a/dc7/d86/fc8 to da/d13/d1a/dc7/daf/d56/db9/feb 0
2026-03-10T06:22:32.405 INFO:tasks.workunit.client.1.vm06.stdout:2/732: dwrite da/d13/d5e/fbc [0,4194304] 0
2026-03-10T06:22:32.413 INFO:tasks.workunit.client.1.vm06.stdout:2/733: getdents da/d13/d1a/dc7/daf/d56/dd4 0
2026-03-10T06:22:32.417 INFO:tasks.workunit.client.1.vm06.stdout:5/663: write d8/fa5 [1257170,60712] 0
2026-03-10T06:22:32.417 INFO:tasks.workunit.client.1.vm06.stdout:2/734: dwrite da/d13/d1a/dc7/f9d [0,4194304] 0
2026-03-10T06:22:32.419 INFO:tasks.workunit.client.1.vm06.stdout:5/664: read d8/db/d54/d8a/f31 [2891816,65740] 0
2026-03-10T06:22:32.419 INFO:tasks.workunit.client.1.vm06.stdout:2/735: fdatasync da/d13/d1a/dc7/daf/d56/dd4/fe1 0
2026-03-10T06:22:32.422 INFO:tasks.workunit.client.1.vm06.stdout:5/665: mknod d8/db/d54/d8a/d74/d90/ccf 0
2026-03-10T06:22:32.437 INFO:tasks.workunit.client.1.vm06.stdout:2/736: symlink da/d13/d1a/dc7/lec 0
2026-03-10T06:22:32.484 INFO:tasks.workunit.client.1.vm06.stdout:6/942: truncate d6/dd/d25/d2c/f136 4046958 0
2026-03-10T06:22:32.505 INFO:tasks.workunit.client.1.vm06.stdout:7/922: write d19/f33 [533631,65247] 0
2026-03-10T06:22:32.516 INFO:tasks.workunit.client.1.vm06.stdout:6/943: dread d6/dd/d25/d2c/f85 [0,4194304] 0
2026-03-10T06:22:32.517 INFO:tasks.workunit.client.1.vm06.stdout:8/741: dwrite d1/df/d20/f64 [0,4194304] 0
2026-03-10T06:22:32.520 INFO:tasks.workunit.client.1.vm06.stdout:6/944: symlink d6/dd/d25/d33/d5a/df1/l148 0
2026-03-10T06:22:32.523 INFO:tasks.workunit.client.1.vm06.stdout:8/742: fdatasync d1/df/f6d 0
2026-03-10T06:22:32.525 INFO:tasks.workunit.client.1.vm06.stdout:6/945: creat d6/dd/d25/d33/d5a/dd8/f149 x:0 0 0
2026-03-10T06:22:32.527 INFO:tasks.workunit.client.1.vm06.stdout:6/946: fdatasync d6/d7/d37/d43/f59 0
2026-03-10T06:22:32.539 INFO:tasks.workunit.client.1.vm06.stdout:6/947: rename d6/dd/d25/d4e/c75 to d6/d7/d37/c14a 0
2026-03-10T06:22:32.544 INFO:tasks.workunit.client.1.vm06.stdout:6/948: creat d6/d79/d95/d137/dd0/dc5/f14b x:0 0 0
2026-03-10T06:22:32.544 INFO:tasks.workunit.client.1.vm06.stdout:6/949: fdatasync d6/dd/ffd 0
2026-03-10T06:22:32.567 INFO:tasks.workunit.client.1.vm06.stdout:6/950: link d6/dd/d25/d2c/c4b d6/df/d70/daa/dee/c14c 0
2026-03-10T06:22:32.579 INFO:tasks.workunit.client.1.vm06.stdout:6/951: dread d6/d79/d95/db4/fbd [0,4194304] 0
2026-03-10T06:22:32.588 INFO:tasks.workunit.client.1.vm06.stdout:6/952: mkdir d6/d79/d95/db4/dd4/d14d 0
2026-03-10T06:22:32.597 INFO:tasks.workunit.client.1.vm06.stdout:0/889: dwrite d0/dd/d14/d18/d85/dcc/d88/fae [0,4194304] 0
2026-03-10T06:22:32.600 INFO:tasks.workunit.client.1.vm06.stdout:0/890: dwrite d0/da3/fe5 [0,4194304] 0
2026-03-10T06:22:32.603 INFO:tasks.workunit.client.1.vm06.stdout:6/953: symlink d6/dd/d25/d2c/d12c/l14e 0
2026-03-10T06:22:32.624 INFO:tasks.workunit.client.1.vm06.stdout:0/891: dwrite d0/dd/d14/d18/d7e/f104 [0,4194304] 0
2026-03-10T06:22:32.626 INFO:tasks.workunit.client.1.vm06.stdout:0/892: stat d0/dd/f24 0
2026-03-10T06:22:32.628 INFO:tasks.workunit.client.1.vm06.stdout:6/954: creat d6/d79/d95/db4/dd4/df5/f14f x:0 0 0
2026-03-10T06:22:32.629 INFO:tasks.workunit.client.1.vm06.stdout:9/865: dwrite d21/d32/fcf [0,4194304] 0
2026-03-10T06:22:32.630 INFO:tasks.workunit.client.1.vm06.stdout:9/866: write d21/d27/d3a/fbb [3247837,51699] 0
2026-03-10T06:22:32.631 INFO:tasks.workunit.client.1.vm06.stdout:0/893: creat d0/d3c/dc1/d7d/d10e/f127 x:0 0 0
2026-03-10T06:22:32.642 INFO:tasks.workunit.client.1.vm06.stdout:6/955: rename d6/dd/d25/d4e/f5f to d6/df/d40/d99/f150 0
2026-03-10T06:22:32.645 INFO:tasks.workunit.client.1.vm06.stdout:3/857: dwrite d6/dc/d13/d9d/d54/fb8 [0,4194304] 0
2026-03-10T06:22:32.646 INFO:tasks.workunit.client.1.vm06.stdout:4/907: truncate f8 6313804 0
2026-03-10T06:22:32.646 INFO:tasks.workunit.client.1.vm06.stdout:0/894: rmdir d0/dd/d14/d18/d85/dcc/db7 39
2026-03-10T06:22:32.647 INFO:tasks.workunit.client.1.vm06.stdout:9/867: chown d21/d32/d4d/dd2/f118 148420561 1
2026-03-10T06:22:32.647 INFO:tasks.workunit.client.1.vm06.stdout:9/868: chown d21/d32/f101 31340786 1
2026-03-10T06:22:32.647 INFO:tasks.workunit.client.1.vm06.stdout:0/895: rename d0/d3c/dc1/d3d/d50 to d0/d3c/dc1/d3d/d50/d128 22
2026-03-10T06:22:32.656 INFO:tasks.workunit.client.1.vm06.stdout:3/858: creat d6/d1a/f127 x:0 0 0
2026-03-10T06:22:32.657 INFO:tasks.workunit.client.1.vm06.stdout:4/908: mkdir dd/d24/d9c/d10a 0
2026-03-10T06:22:32.658 INFO:tasks.workunit.client.1.vm06.stdout:9/869: stat d21/d46/ded/cf3 0
2026-03-10T06:22:32.659 INFO:tasks.workunit.client.1.vm06.stdout:9/870: chown f9 101 1
2026-03-10T06:22:32.660 INFO:tasks.workunit.client.1.vm06.stdout:0/896: rmdir d0/dd/d1c/da2/d11f 39
2026-03-10T06:22:32.662 INFO:tasks.workunit.client.1.vm06.stdout:4/909: dwrite dd/d24/f45 [0,4194304] 0
2026-03-10T06:22:32.666 INFO:tasks.workunit.client.1.vm06.stdout:6/956: creat d6/d79/d95/db4/dd4/df5/d113/f151 x:0 0 0
2026-03-10T06:22:32.667 INFO:tasks.workunit.client.1.vm06.stdout:4/910: symlink dd/d24/d2d/d2f/l10b 0
2026-03-10T06:22:32.668 INFO:tasks.workunit.client.1.vm06.stdout:4/911: chown dd/d24/d2d/d2f/d34/d40/df6 35610 1
2026-03-10T06:22:32.668 INFO:tasks.workunit.client.1.vm06.stdout:3/859: mknod d6/d21/c128 0
2026-03-10T06:22:32.674 INFO:tasks.workunit.client.1.vm06.stdout:6/957: unlink d6/dd/d25/d33/d5a/dae/fe8 0
2026-03-10T06:22:32.675 INFO:tasks.workunit.client.1.vm06.stdout:6/958: readlink d6/dd/d25/d33/lcf 0
2026-03-10T06:22:32.675 INFO:tasks.workunit.client.1.vm06.stdout:9/871: link f20 d21/d27/d50/d57/dcd/de4/dee/f119 0
2026-03-10T06:22:32.676 INFO:tasks.workunit.client.1.vm06.stdout:0/897: mkdir d0/dd/d14/d18/d85/dcc/d88/d47/dd1/d129 0
2026-03-10T06:22:32.678 INFO:tasks.workunit.client.1.vm06.stdout:3/860: symlink d6/dc/d13/d35/d101/d88/dae/l129 0
2026-03-10T06:22:32.679 INFO:tasks.workunit.client.1.vm06.stdout:3/861: stat d6/d1a/d5b/fe7 0
2026-03-10T06:22:32.680 INFO:tasks.workunit.client.1.vm06.stdout:6/959: rmdir d6/d79/d95/dea 39
2026-03-10T06:22:32.680 INFO:tasks.workunit.client.1.vm06.stdout:0/898: mkdir d0/dd/d14/d1d/d5d/dca/dd8/d12a 0
2026-03-10T06:22:32.680 INFO:tasks.workunit.client.1.vm06.stdout:6/960: write d6/dd/dc2/f12a [960913,57214] 0
2026-03-10T06:22:32.681 INFO:tasks.workunit.client.1.vm06.stdout:6/961: write d6/d79/fe2 [842703,62538] 0
2026-03-10T06:22:32.683 INFO:tasks.workunit.client.1.vm06.stdout:0/899: fsync d0/da3/dd5/ddc/fe3 0
2026-03-10T06:22:32.688 INFO:tasks.workunit.client.1.vm06.stdout:0/900: chown d0/dd/d14/f70 1834 1
2026-03-10T06:22:32.688 INFO:tasks.workunit.client.1.vm06.stdout:9/872: dread d21/d27/d50/d57/db2/d80/f86 [0,4194304] 0
2026-03-10T06:22:32.689 INFO:tasks.workunit.client.1.vm06.stdout:0/901: readlink d0/d3c/dc1/d7d/lfd 0
2026-03-10T06:22:32.690 INFO:tasks.workunit.client.1.vm06.stdout:0/902: write d0/dd/d14/d18/deb/ffe [156658,6709] 0
2026-03-10T06:22:32.691 INFO:tasks.workunit.client.1.vm06.stdout:9/873: mknod d21/d27/d3a/c11a 0
2026-03-10T06:22:32.692 INFO:tasks.workunit.client.1.vm06.stdout:9/874: write f9 [5675258,34593] 0
2026-03-10T06:22:32.694 INFO:tasks.workunit.client.1.vm06.stdout:9/875: dread d21/d32/fcf [0,4194304] 0
2026-03-10T06:22:32.694 INFO:tasks.workunit.client.1.vm06.stdout:0/903: symlink d0/dd/d14/d18/d85/dcc/d99/l12b 0
2026-03-10T06:22:32.703 INFO:tasks.workunit.client.1.vm06.stdout:3/862: link d6/d1a/d5b/c112 d6/dc/d41/c12a 0
2026-03-10T06:22:32.708 INFO:tasks.workunit.client.1.vm06.stdout:6/962: getdents d6/df/d40/d10c 0
2026-03-10T06:22:32.711 INFO:tasks.workunit.client.1.vm06.stdout:9/876: dwrite f20 [0,4194304] 0
2026-03-10T06:22:32.712 INFO:tasks.workunit.client.1.vm06.stdout:9/877: write d21/d27/d50/d57/db2/d80/d95/fc4 [51129,19326] 0
2026-03-10T06:22:32.717 INFO:tasks.workunit.client.1.vm06.stdout:3/863: rename d6/d21/f55 to d6/d8/d7f/f12b 0
2026-03-10T06:22:32.720 INFO:tasks.workunit.client.1.vm06.stdout:9/878: dwrite f14 [0,4194304] 0
2026-03-10T06:22:32.723 INFO:tasks.workunit.client.1.vm06.stdout:9/879: dread d21/d27/d50/d57/db2/d80/d95/d9b/dd0/ffe [0,4194304] 0
2026-03-10T06:22:32.724 INFO:tasks.workunit.client.1.vm06.stdout:9/880: write d21/f49 [8992185,82433] 0
2026-03-10T06:22:32.729 INFO:tasks.workunit.client.1.vm06.stdout:6/963: fdatasync d6/dd/d25/d4e/f60 0
2026-03-10T06:22:32.774 INFO:tasks.workunit.client.1.vm06.stdout:3/864: creat d6/d8/f12c x:0 0 0
2026-03-10T06:22:32.774 INFO:tasks.workunit.client.1.vm06.stdout:0/904: link d0/dd/lbd d0/dd/d1c/l12c 0
2026-03-10T06:22:32.775 INFO:tasks.workunit.client.1.vm06.stdout:6/964: rename d6/d79/d95/d137/c9a to d6/d79/d95/d11e/c152 0
2026-03-10T06:22:32.776 INFO:tasks.workunit.client.1.vm06.stdout:1/930: truncate d9/d35/d46/d38/d63/d83/d93/fb5 465696 0
2026-03-10T06:22:32.778 INFO:tasks.workunit.client.1.vm06.stdout:3/865: stat d6/d21/l9b 0
2026-03-10T06:22:32.782 INFO:tasks.workunit.client.1.vm06.stdout:3/866: mkdir d6/d12d 0
2026-03-10T06:22:32.783 INFO:tasks.workunit.client.1.vm06.stdout:3/867: chown d6/d21/f58 98187758 1
2026-03-10T06:22:32.799 INFO:tasks.workunit.client.1.vm06.stdout:0/905: rename d0/dd/d1c/c21 to d0/dd/d1c/da2/d11f/c12d 0
2026-03-10T06:22:32.800 INFO:tasks.workunit.client.1.vm06.stdout:3/868: dread - d6/dc/d13/d51/fd2 zero size
2026-03-10T06:22:32.800 INFO:tasks.workunit.client.1.vm06.stdout:0/906: chown d0/dd/d14/d18/d7e/f104 133997 1
2026-03-10T06:22:32.802 INFO:tasks.workunit.client.1.vm06.stdout:9/881: link d21/d32/f8b d21/f11b 0
2026-03-10T06:22:32.802 INFO:tasks.workunit.client.1.vm06.stdout:0/907: chown d0/dd/d14/d18/d85/dcc/d88/d9e 12659 1
2026-03-10T06:22:32.804 INFO:tasks.workunit.client.1.vm06.stdout:9/882: truncate d21/d27/d50/d57/dcd/de4/dee/f113 534877 0
2026-03-10T06:22:32.806 INFO:tasks.workunit.client.1.vm06.stdout:1/931: creat d9/d35/d89/f110 x:0 0 0
2026-03-10T06:22:32.816 INFO:tasks.workunit.client.1.vm06.stdout:5/666: truncate d8/db/d54/d8a/d39/f3d 115554 0
2026-03-10T06:22:32.817 INFO:tasks.workunit.client.1.vm06.stdout:2/737: write da/d13/d1a/d39/f3c [4911842,34374] 0
2026-03-10T06:22:32.818 INFO:tasks.workunit.client.1.vm06.stdout:5/667: write d8/db/d54/d8a/d74/d90/fb5 [1999107,28368] 0
2026-03-10T06:22:32.819 INFO:tasks.workunit.client.1.vm06.stdout:5/668: fsync d8/db/d54/d8a/d39/f52 0
2026-03-10T06:22:32.822 INFO:tasks.workunit.client.1.vm06.stdout:1/932: mknod d9/d35/d46/db0/c111 0
2026-03-10T06:22:32.826 INFO:tasks.workunit.client.1.vm06.stdout:9/883: symlink d21/d27/d50/d57/dcd/de4/d104/l11c 0
2026-03-10T06:22:32.836 INFO:tasks.workunit.client.1.vm06.stdout:1/933: creat d9/d35/d46/d38/d63/d83/dc5/d108/f112 x:0 0 0
2026-03-10T06:22:32.836 INFO:tasks.workunit.client.1.vm06.stdout:2/738: dread da/d13/d1c/d7d/ddf/f67 [0,4194304] 0
2026-03-10T06:22:32.837 INFO:tasks.workunit.client.1.vm06.stdout:2/739: truncate f7 1804936 0
2026-03-10T06:22:32.840 INFO:tasks.workunit.client.1.vm06.stdout:5/669: dread d8/db/d54/d8a/d74/f4c [0,4194304] 0
2026-03-10T06:22:32.849 INFO:tasks.workunit.client.1.vm06.stdout:3/869: link d6/dc/d13/l85 d6/dc/d13/d35/d101/dd0/dd1/d90/dc7/l12e 0
2026-03-10T06:22:32.852 INFO:tasks.workunit.client.1.vm06.stdout:7/923: dwrite d19/fb7 [0,4194304] 0
2026-03-10T06:22:32.853 INFO:tasks.workunit.client.1.vm06.stdout:0/908: creat d0/dd/d14/d18/d85/dcc/f12e x:0 0 0
2026-03-10T06:22:32.856 INFO:tasks.workunit.client.1.vm06.stdout:9/884: mknod d21/d32/d4d/d51/dcb/c11d 0
2026-03-10T06:22:32.856 INFO:tasks.workunit.client.1.vm06.stdout:7/924: stat d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d104 0
2026-03-10T06:22:32.857 INFO:tasks.workunit.client.1.vm06.stdout:7/925: readlink d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/de0/de2/l11f 0
2026-03-10T06:22:32.862 INFO:tasks.workunit.client.1.vm06.stdout:7/926: dwrite d19/d3b/d41/d42/f125 [0,4194304] 0
2026-03-10T06:22:32.863 INFO:tasks.workunit.client.1.vm06.stdout:2/740: symlink da/d13/d1c/d7d/ddf/d61/d68/led 0
2026-03-10T06:22:32.863 INFO:tasks.workunit.client.1.vm06.stdout:1/934: read d9/d1b/f51 [5251361,63460] 0
2026-03-10T06:22:32.866 INFO:tasks.workunit.client.1.vm06.stdout:5/670: stat d8/db/d54/d8a/d39/fa1 0
2026-03-10T06:22:32.868 INFO:tasks.workunit.client.1.vm06.stdout:0/909: mkdir d0/dd/d14/d18/d7e/d12f 0
2026-03-10T06:22:32.868 INFO:tasks.workunit.client.1.vm06.stdout:9/885: mkdir d21/d27/d50/d57/db2/d7f/d11e 0
2026-03-10T06:22:32.872 INFO:tasks.workunit.client.1.vm06.stdout:1/935: dwrite d9/d35/d89/f9b [0,4194304] 0
2026-03-10T06:22:32.876 INFO:tasks.workunit.client.1.vm06.stdout:5/671: dread - d8/db/d54/d67/d46/d68/f81 zero size
2026-03-10T06:22:32.877 INFO:tasks.workunit.client.1.vm06.stdout:2/741: chown da/d13/d1a/lad 0 1
2026-03-10T06:22:32.879 INFO:tasks.workunit.client.1.vm06.stdout:7/927: truncate d19/d3b/d41/d4c/f55 1388444 0
2026-03-10T06:22:32.883 INFO:tasks.workunit.client.1.vm06.stdout:9/886: truncate d21/d32/d4d/fbd 945395 0
2026-03-10T06:22:32.883 INFO:tasks.workunit.client.1.vm06.stdout:9/887: dread - d21/da2/da7/d93/dda/df4/f100 zero size
2026-03-10T06:22:32.883 INFO:tasks.workunit.client.1.vm06.stdout:1/936: creat d9/d1b/f113 x:0 0 0
2026-03-10T06:22:32.884 INFO:tasks.workunit.client.1.vm06.stdout:9/888: symlink d21/d27/d50/d57/db2/d7f/l11f 0
2026-03-10T06:22:32.884 INFO:tasks.workunit.client.1.vm06.stdout:7/928: chown d19/d3b/d41/d42/d52/d83/ccd 122844956 1
2026-03-10T06:22:32.885 INFO:tasks.workunit.client.1.vm06.stdout:2/742: creat da/d13/d1a/dc7/daf/fee x:0 0 0
2026-03-10T06:22:32.892 INFO:tasks.workunit.client.1.vm06.stdout:1/937: symlink d9/dd3/dbf/l114 0
2026-03-10T06:22:32.893 INFO:tasks.workunit.client.1.vm06.stdout:7/929: creat d19/d3b/d41/d42/d62/d80/da1/d117/f12d x:0 0 0
2026-03-10T06:22:32.895 INFO:tasks.workunit.client.1.vm06.stdout:1/938: truncate d9/f34 221925 0
2026-03-10T06:22:32.895 INFO:tasks.workunit.client.1.vm06.stdout:9/889: dwrite d21/da2/da7/d93/dda/df4/f100 [0,4194304] 0
2026-03-10T06:22:32.897 INFO:tasks.workunit.client.1.vm06.stdout:7/930: chown d19/d3b/d41/d42/d62/l6f 6359292 1
2026-03-10T06:22:32.897 INFO:tasks.workunit.client.1.vm06.stdout:2/743: dwrite da/d13/d1a/dc7/daf/d56/db9/fd1 [0,4194304] 0
2026-03-10T06:22:32.899 INFO:tasks.workunit.client.1.vm06.stdout:2/744: truncate da/d13/d1a/dc7/daf/fd9 303087 0
2026-03-10T06:22:32.902 INFO:tasks.workunit.client.1.vm06.stdout:2/745: stat da/d13/d1a/dc7/dc5/fd0 0
2026-03-10T06:22:32.902 INFO:tasks.workunit.client.1.vm06.stdout:1/939: rename d9/d35/d89/f9b to d9/d35/d46/d38/dc6/dd4/f115 0
2026-03-10T06:22:32.905 INFO:tasks.workunit.client.1.vm06.stdout:1/940: truncate d9/d35/d89/f110 180474 0
2026-03-10T06:22:32.913 INFO:tasks.workunit.client.1.vm06.stdout:8/743: write d1/df/d20/d21/d5e/f65 [290992,88789] 0
2026-03-10T06:22:32.920 INFO:tasks.workunit.client.1.vm06.stdout:2/746: rename f7 to da/d13/d1c/fef 0
2026-03-10T06:22:32.920 INFO:tasks.workunit.client.1.vm06.stdout:8/744: truncate d1/df/f71 3672690 0
2026-03-10T06:22:32.940 INFO:tasks.workunit.client.1.vm06.stdout:5/672: sync
2026-03-10T06:22:32.951 INFO:tasks.workunit.client.1.vm06.stdout:1/941: sync
2026-03-10T06:22:32.951 INFO:tasks.workunit.client.1.vm06.stdout:8/745: sync
2026-03-10T06:22:32.954 INFO:tasks.workunit.client.1.vm06.stdout:8/746: dwrite d1/df/d20/d21/d7e/d8d/f95 [0,4194304] 0
2026-03-10T06:22:32.959 INFO:tasks.workunit.client.1.vm06.stdout:1/942: rename d9/d35/d46/db0/cd9 to d9/d62/c116 0
2026-03-10T06:22:32.959 INFO:tasks.workunit.client.1.vm06.stdout:1/943: write d9/d35/d89/f14 [5102481,43109] 0
2026-03-10T06:22:32.962 INFO:tasks.workunit.client.1.vm06.stdout:1/944: dwrite d9/d35/d46/d38/dc6/dd4/ff6 [0,4194304] 0
2026-03-10T06:22:32.976 INFO:tasks.workunit.client.1.vm06.stdout:8/747: dread d1/d2c/d5b/f7c [0,4194304] 0
2026-03-10T06:22:32.990 INFO:tasks.workunit.client.1.vm06.stdout:8/748: mkdir d1/d7/dee 0
2026-03-10T06:22:32.997 INFO:tasks.workunit.client.1.vm06.stdout:8/749: creat d1/df/d20/d21/d7e/d8d/fef x:0 0 0
2026-03-10T06:22:32.997 INFO:tasks.workunit.client.1.vm06.stdout:8/750: stat d1/d2c/f8a 0
2026-03-10T06:22:32.997 INFO:tasks.workunit.client.1.vm06.stdout:8/751: chown d1/df/d20/c23 1584 1
2026-03-10T06:22:32.998 INFO:tasks.workunit.client.1.vm06.stdout:8/752: stat d1/f18 0
2026-03-10T06:22:33.000 INFO:tasks.workunit.client.1.vm06.stdout:8/753: truncate d1/f1c 2764052 0
2026-03-10T06:22:33.009 INFO:tasks.workunit.client.1.vm06.stdout:8/754: dread d1/df/d11/fe4 [4194304,4194304] 0
2026-03-10T06:22:33.078 INFO:tasks.workunit.client.1.vm06.stdout:4/912: dwrite dd/d24/d5d/f94 [0,4194304] 0
2026-03-10T06:22:33.088 INFO:tasks.workunit.client.1.vm06.stdout:4/913: truncate dd/d33/d47/fc7 2379952 0
2026-03-10T06:22:33.092 INFO:tasks.workunit.client.1.vm06.stdout:4/914: symlink dd/d18/d75/dfd/l10c 0
2026-03-10T06:22:33.092 INFO:tasks.workunit.client.1.vm06.stdout:4/915: readlink dd/l85 0
2026-03-10T06:22:33.092 INFO:tasks.workunit.client.1.vm06.stdout:4/916: chown dd/d18/d8e/fa4 105586808 1
2026-03-10T06:22:33.093 INFO:tasks.workunit.client.1.vm06.stdout:4/917: write dd/d24/d5d/f94 [642041,38508] 0
2026-03-10T06:22:33.144 INFO:tasks.workunit.client.1.vm06.stdout:5/673: truncate d8/db/d54/d8a/d74/f29 5617989 0
2026-03-10T06:22:33.229 INFO:tasks.workunit.client.1.vm06.stdout:6/965: truncate d6/dd/d25/d4e/f60 1861377 0
2026-03-10T06:22:33.233 INFO:tasks.workunit.client.1.vm06.stdout:6/966: mknod d6/dd/d25/d33/c153 0
2026-03-10T06:22:33.238 INFO:tasks.workunit.client.1.vm06.stdout:6/967: dwrite d6/d7/ffc [0,4194304] 0
2026-03-10T06:22:33.244 INFO:tasks.workunit.client.1.vm06.stdout:3/870: dwrite d6/dc/d41/f82 [0,4194304] 0
2026-03-10T06:22:33.245 INFO:tasks.workunit.client.1.vm06.stdout:0/910: dwrite d0/dd/f95 [0,4194304] 0
2026-03-10T06:22:33.246 INFO:tasks.workunit.client.1.vm06.stdout:0/911: fsync d0/dd/d14/d18/f122 0
2026-03-10T06:22:33.249 INFO:tasks.workunit.client.1.vm06.stdout:0/912: chown d0/d3c/dc1/d7d/c106 0 1
2026-03-10T06:22:33.249 INFO:tasks.workunit.client.1.vm06.stdout:0/913: read - d0/dd/d14/d18/d7e/f126 zero size
2026-03-10T06:22:33.275 INFO:tasks.workunit.client.1.vm06.stdout:3/871: mknod d6/d8/d7f/da1/c12f 0
2026-03-10T06:22:33.275 INFO:tasks.workunit.client.1.vm06.stdout:7/931: rmdir d19/d3b/d41/d42 39
2026-03-10T06:22:33.275 INFO:tasks.workunit.client.1.vm06.stdout:7/932: readlink d19/l2c 0
2026-03-10T06:22:33.289 INFO:tasks.workunit.client.1.vm06.stdout:0/914: write d0/d3c/dc1/fee [1644282,13798] 0
2026-03-10T06:22:33.289 INFO:tasks.workunit.client.1.vm06.stdout:9/890: write d21/f3e [1188968,44487] 0
2026-03-10T06:22:33.291 INFO:tasks.workunit.client.1.vm06.stdout:3/872: dread d6/d21/f58 [0,4194304] 0
2026-03-10T06:22:33.291 INFO:tasks.workunit.client.1.vm06.stdout:3/873: chown d6/d4f 84213225 1
2026-03-10T06:22:33.291 INFO:tasks.workunit.client.1.vm06.stdout:3/874: chown d6/dc/d13/d35/f5a 944 1
2026-03-10T06:22:33.292 INFO:tasks.workunit.client.1.vm06.stdout:3/875: write d6/dc/d13/d9d/fe1 [365551,121307] 0
2026-03-10T06:22:33.307 INFO:tasks.workunit.client.1.vm06.stdout:9/891: stat fd 0
2026-03-10T06:22:33.318 INFO:tasks.workunit.client.1.vm06.stdout:0/915: rename d0/d3c/dc1/d7d/f11a to d0/dd/d14/d18/deb/f130 0
2026-03-10T06:22:33.319 INFO:tasks.workunit.client.1.vm06.stdout:3/876: creat d6/dc/d13/d35/d101/d88/dae/dec/d117/f130 x:0 0 0
2026-03-10T06:22:33.325 INFO:tasks.workunit.client.1.vm06.stdout:7/933: creat d19/d3b/d41/d42/d62/d80/d82/f12e x:0 0 0
2026-03-10T06:22:33.326 INFO:tasks.workunit.client.1.vm06.stdout:0/916: read d0/dd/d14/d18/d85/dcc/d88/d47/f111 [1515071,123663] 0
2026-03-10T06:22:33.327 INFO:tasks.workunit.client.1.vm06.stdout:0/917: dread - d0/d3c/dc1/d3d/d50/d91/f115 zero size
2026-03-10T06:22:33.328 INFO:tasks.workunit.client.1.vm06.stdout:1/945: write d9/f91 [682959,64283] 0
2026-03-10T06:22:33.334 INFO:tasks.workunit.client.1.vm06.stdout:8/755: getdents d1/df/d20/d21/d7e/d8d 0
2026-03-10T06:22:33.336 INFO:tasks.workunit.client.1.vm06.stdout:9/892: rename d21/da2/da7/cdc to d21/d27/d50/d57/db2/d80/d95/d9b/dd0/c120 0
2026-03-10T06:22:33.337 INFO:tasks.workunit.client.1.vm06.stdout:9/893: fdatasync d21/da2/da7/d93/f94 0
2026-03-10T06:22:33.346 INFO:tasks.workunit.client.1.vm06.stdout:3/877: creat d6/dc/d13/d35/d101/d88/dae/dec/d117/f131 x:0 0 0
2026-03-10T06:22:33.346 INFO:tasks.workunit.client.1.vm06.stdout:3/878: chown d6/dc/d13/d51/c60 60 1
2026-03-10T06:22:33.346 INFO:tasks.workunit.client.1.vm06.stdout:3/879: stat d6/dc/d41/d6d/l125 0
2026-03-10T06:22:33.353 INFO:tasks.workunit.client.1.vm06.stdout:8/756: fdatasync f0 0
2026-03-10T06:22:33.354 INFO:tasks.workunit.client.1.vm06.stdout:8/757: readlink d1/d3b/l97 0
2026-03-10T06:22:33.360 INFO:tasks.workunit.client.1.vm06.stdout:0/918: rename d0/dd/d14/d18/d7e/f104 to d0/dd/d14/d18/d85/dcc/d88/d9e/f131 0
2026-03-10T06:22:33.360 INFO:tasks.workunit.client.1.vm06.stdout:0/919: stat d0/dd/d14/d1d/d5d/dca/dd8/f10a 0
2026-03-10T06:22:33.363 INFO:tasks.workunit.client.1.vm06.stdout:3/880: mkdir d6/dc/d13/d35/d101/d88/dae/d132 0
2026-03-10T06:22:33.364 INFO:tasks.workunit.client.1.vm06.stdout:4/918: dwrite dd/d33/d36/fba [0,4194304] 0
2026-03-10T06:22:33.368 INFO:tasks.workunit.client.1.vm06.stdout:1/946: dread d9/d1b/d20/f25 [0,4194304] 0
2026-03-10T06:22:33.374 INFO:tasks.workunit.client.1.vm06.stdout:0/920: mknod d0/d3c/dc1/d3d/d50/d91/da7/c132 0
2026-03-10T06:22:33.375 INFO:tasks.workunit.client.1.vm06.stdout:0/921: dread - d0/d3c/dc1/d3d/d50/d91/f115 zero size
2026-03-10T06:22:33.379 INFO:tasks.workunit.client.1.vm06.stdout:1/947: dread d9/d35/d46/d38/dc6/dd4/ff6 [0,4194304] 0
2026-03-10T06:22:33.383 INFO:tasks.workunit.client.1.vm06.stdout:4/919: rename dd/d18 to dd/d24/d2d/d2f/d34/d40/d10d 0
2026-03-10T06:22:33.384 INFO:tasks.workunit.client.1.vm06.stdout:9/894: creat d21/d27/d50/d57/f121 x:0 0 0
2026-03-10T06:22:33.385 INFO:tasks.workunit.client.1.vm06.stdout:1/948: stat d9/d35/d46/d38/dc6/lcd 0
2026-03-10T06:22:33.392 INFO:tasks.workunit.client.1.vm06.stdout:9/895: mknod d21/d27/d56/dc0/c122 0
2026-03-10T06:22:33.406 INFO:tasks.workunit.client.1.vm06.stdout:1/949: mknod d9/d35/d46/d38/d63/d83/d93/c117 0
2026-03-10T06:22:33.406 INFO:tasks.workunit.client.1.vm06.stdout:8/758: getdents d1/d3b 0
2026-03-10T06:22:33.406 INFO:tasks.workunit.client.1.vm06.stdout:8/759: chown d1/df/f71 447157 1
2026-03-10T06:22:33.406 INFO:tasks.workunit.client.1.vm06.stdout:9/896: getdents d21/d27/d3a 0
2026-03-10T06:22:33.406 INFO:tasks.workunit.client.1.vm06.stdout:8/760: creat d1/df/d20/d21/ff0 x:0 0 0
2026-03-10T06:22:33.412 INFO:tasks.workunit.client.1.vm06.stdout:0/922: sync
2026-03-10T06:22:33.412 INFO:tasks.workunit.client.1.vm06.stdout:1/950: sync
2026-03-10T06:22:33.412 INFO:tasks.workunit.client.1.vm06.stdout:0/923: chown d0/dd/d1c/da2/fb9 593 1
2026-03-10T06:22:33.413 INFO:tasks.workunit.client.1.vm06.stdout:1/951: truncate d9/d35/d46/d38/ddd/f107 221578 0
2026-03-10T06:22:33.418 INFO:tasks.workunit.client.1.vm06.stdout:0/924: mknod d0/da3/dd5/ddc/c133 0
2026-03-10T06:22:33.419 INFO:tasks.workunit.client.1.vm06.stdout:8/761: creat d1/df/d11/ff1 x:0 0 0
2026-03-10T06:22:33.421 INFO:tasks.workunit.client.1.vm06.stdout:8/762: write d1/df/d20/d35/f42 [4971111,74911] 0
2026-03-10T06:22:33.425 INFO:tasks.workunit.client.1.vm06.stdout:1/952: link d9/d35/d46/db0/ff9 d9/d35/d46/d38/d63/f118 0
2026-03-10T06:22:33.425 INFO:tasks.workunit.client.1.vm06.stdout:0/925: dwrite d0/dd/d14/d18/d85/dcc/d88/fae [0,4194304] 0
2026-03-10T06:22:33.428 INFO:tasks.workunit.client.1.vm06.stdout:0/926: truncate d0/dd/d14/d18/d7e/f126 86211 0
2026-03-10T06:22:33.431 INFO:tasks.workunit.client.1.vm06.stdout:0/927: write d0/dd/d14/d18/d7e/f126 [1132602,130483] 0
2026-03-10T06:22:33.434 INFO:tasks.workunit.client.1.vm06.stdout:1/953: dwrite d9/f1a [0,4194304] 0
2026-03-10T06:22:33.435 INFO:tasks.workunit.client.1.vm06.stdout:0/928: write d0/dd/d14/d18/f90 [1680493,62029] 0
2026-03-10T06:22:33.443 INFO:tasks.workunit.client.1.vm06.stdout:0/929: dread d0/dd/d14/d18/d85/f105 [0,4194304] 0
2026-03-10T06:22:33.444 INFO:tasks.workunit.client.1.vm06.stdout:8/763: unlink d1/df/d20/d21/d5e/f70 0
2026-03-10T06:22:33.476 INFO:tasks.workunit.client.1.vm06.stdout:5/674: dwrite d8/db/d57/f75 [0,4194304] 0
2026-03-10T06:22:33.476 INFO:tasks.workunit.client.1.vm06.stdout:5/675: chown d8/db/d54/d8a/l56 821179 1
2026-03-10T06:22:33.481 INFO:tasks.workunit.client.1.vm06.stdout:1/954: read - d9/fac zero size
2026-03-10T06:22:33.484 INFO:tasks.workunit.client.1.vm06.stdout:5/676: dwrite d8/db/d54/d67/d46/d68/dc1/fce [0,4194304] 0
2026-03-10T06:22:33.486 INFO:tasks.workunit.client.1.vm06.stdout:5/677: stat d8/db/d54/d67/d46/d6e/c84 0
2026-03-10T06:22:33.487 INFO:tasks.workunit.client.1.vm06.stdout:5/678: chown d8/db/d54/d67/d46/f98 2047839 1
2026-03-10T06:22:33.488 INFO:tasks.workunit.client.1.vm06.stdout:0/930: rmdir d0/dd/d14/d18/d85/dcc/d88/d47 39
2026-03-10T06:22:33.489 INFO:tasks.workunit.client.1.vm06.stdout:0/931: write d0/dd/d14/d18/d7e/dd0/f118 [251221,121182] 0
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: pgmap v15: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 45 MiB/s rd, 125 MiB/s wr, 275 op/s
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all mgr
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]': finished
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]': finished
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all rgw
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all rbd-mirror
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all cephfs-mirror
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all iscsi
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all nfs
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: Upgrade: Setting container_image for all nvmeof
2026-03-10T06:22:33.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: pgmap v15: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 45 MiB/s rd, 125 MiB/s wr, 275 op/s
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all mgr
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.exdvdb"}]': finished
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.wwotdr"}]': finished
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all rgw
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:33.516 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all iscsi 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all nfs 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:33.517 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: Upgrade: Setting container_image for all nvmeof 2026-03-10T06:22:33.517 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:33.541 INFO:tasks.workunit.client.1.vm06.stdout:5/679: creat d8/db/d54/d55/d80/fd0 x:0 0 0 2026-03-10T06:22:33.548 INFO:tasks.workunit.client.1.vm06.stdout:0/932: fdatasync d0/dd/d14/d18/d85/dcc/db7/f116 0 2026-03-10T06:22:33.567 INFO:tasks.workunit.client.1.vm06.stdout:2/747: truncate da/d13/d1c/f7e 1044731 0 2026-03-10T06:22:33.567 INFO:tasks.workunit.client.1.vm06.stdout:8/764: link d1/df/d11/fe4 d1/df/d20/d35/ff2 0 2026-03-10T06:22:33.568 INFO:tasks.workunit.client.1.vm06.stdout:8/765: stat d1/df/d20/d21/fe7 0 2026-03-10T06:22:33.584 INFO:tasks.workunit.client.1.vm06.stdout:7/934: write f10 [3072413,90296] 0 2026-03-10T06:22:33.605 INFO:tasks.workunit.client.1.vm06.stdout:6/968: truncate d6/dd/d25/d33/d5a/dae/fcb 436327 0 2026-03-10T06:22:33.605 INFO:tasks.workunit.client.1.vm06.stdout:1/955: creat d9/d35/d89/f119 x:0 0 0 2026-03-10T06:22:33.608 INFO:tasks.workunit.client.1.vm06.stdout:6/969: dwrite d6/dd/f114 [0,4194304] 0 2026-03-10T06:22:33.610 INFO:tasks.workunit.client.1.vm06.stdout:6/970: chown d6/d79/d95/db4/dd4/df5/d113/f151 3 1 2026-03-10T06:22:33.611 INFO:tasks.workunit.client.1.vm06.stdout:5/680: getdents d8/db/d54/d67/d46/dc4 0 2026-03-10T06:22:33.611 INFO:tasks.workunit.client.1.vm06.stdout:5/681: chown d8/db/d54/d8a/d74/f5a 51026 1 2026-03-10T06:22:33.614 INFO:tasks.workunit.client.1.vm06.stdout:6/971: read d6/dd/d25/d2c/f9c [4412512,104512] 0 2026-03-10T06:22:33.614 INFO:tasks.workunit.client.1.vm06.stdout:5/682: dread 
d8/db/d54/d67/d46/d68/dc1/fce [0,4194304] 0 2026-03-10T06:22:33.632 INFO:tasks.workunit.client.1.vm06.stdout:1/956: dwrite d9/d35/d46/d38/dc6/dd4/ff6 [0,4194304] 0 2026-03-10T06:22:33.632 INFO:tasks.workunit.client.1.vm06.stdout:1/957: chown d9/d35/d46/d38/laa 379 1 2026-03-10T06:22:33.634 INFO:tasks.workunit.client.1.vm06.stdout:3/881: write d6/d1a/f1f [1229769,13189] 0 2026-03-10T06:22:33.639 INFO:tasks.workunit.client.1.vm06.stdout:6/972: unlink d6/d79/d95/d137/db9/c131 0 2026-03-10T06:22:33.651 INFO:tasks.workunit.client.1.vm06.stdout:0/933: rename d0/dd/d14/d18/d7e/cff to d0/dd/d14/d18/d85/dcc/c134 0 2026-03-10T06:22:33.653 INFO:tasks.workunit.client.1.vm06.stdout:1/958: unlink d9/d1b/d20/fa7 0 2026-03-10T06:22:33.653 INFO:tasks.workunit.client.1.vm06.stdout:1/959: chown d9/d35/d89/lf2 12 1 2026-03-10T06:22:33.657 INFO:tasks.workunit.client.1.vm06.stdout:8/766: dread d1/d3b/f98 [4194304,4194304] 0 2026-03-10T06:22:33.657 INFO:tasks.workunit.client.1.vm06.stdout:8/767: dread - d1/df/d20/d21/fe7 zero size 2026-03-10T06:22:33.658 INFO:tasks.workunit.client.1.vm06.stdout:4/920: write dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107/fff [4999105,75042] 0 2026-03-10T06:22:33.661 INFO:tasks.workunit.client.1.vm06.stdout:6/973: rmdir d6/d79/d95/db4/dd4/df5 39 2026-03-10T06:22:33.663 INFO:tasks.workunit.client.1.vm06.stdout:1/960: sync 2026-03-10T06:22:33.663 INFO:tasks.workunit.client.1.vm06.stdout:6/974: sync 2026-03-10T06:22:33.664 INFO:tasks.workunit.client.1.vm06.stdout:1/961: write d9/d35/d46/d38/d63/feb [1633010,47779] 0 2026-03-10T06:22:33.665 INFO:tasks.workunit.client.1.vm06.stdout:9/897: write d21/d27/d56/f74 [3891234,117044] 0 2026-03-10T06:22:33.682 INFO:tasks.workunit.client.1.vm06.stdout:1/962: dread d9/d35/d46/d38/d63/d83/fa1 [0,4194304] 0 2026-03-10T06:22:33.682 INFO:tasks.workunit.client.1.vm06.stdout:8/768: creat d1/df/d20/d35/ff3 x:0 0 0 2026-03-10T06:22:33.683 INFO:tasks.workunit.client.1.vm06.stdout:8/769: write d1/df/d20/d21/d7e/d8d/fef 
[568593,78830] 0 2026-03-10T06:22:33.683 INFO:tasks.workunit.client.1.vm06.stdout:3/882: mkdir d6/dc/d13/d133 0 2026-03-10T06:22:33.686 INFO:tasks.workunit.client.1.vm06.stdout:6/975: fdatasync d6/d79/d95/d137/db9/d135/f124 0 2026-03-10T06:22:33.690 INFO:tasks.workunit.client.1.vm06.stdout:6/976: dwrite d6/df/fe9 [0,4194304] 0 2026-03-10T06:22:33.705 INFO:tasks.workunit.client.1.vm06.stdout:8/770: creat d1/d3b/d5c/ff4 x:0 0 0 2026-03-10T06:22:33.709 INFO:tasks.workunit.client.1.vm06.stdout:2/748: truncate da/d13/d1c/f7e 1449164 0 2026-03-10T06:22:33.710 INFO:tasks.workunit.client.1.vm06.stdout:3/883: creat d6/dc/d13/d35/d101/d88/dde/f134 x:0 0 0 2026-03-10T06:22:33.711 INFO:tasks.workunit.client.1.vm06.stdout:9/898: mkdir d21/d32/d4d/d51/d123 0 2026-03-10T06:22:33.713 INFO:tasks.workunit.client.1.vm06.stdout:4/921: dread dd/d33/f53 [0,4194304] 0 2026-03-10T06:22:33.715 INFO:tasks.workunit.client.1.vm06.stdout:6/977: chown d6/d79/d95/d137/dd0/dc5/leb 449 1 2026-03-10T06:22:33.716 INFO:tasks.workunit.client.1.vm06.stdout:6/978: read - d6/d79/d95/d137/db9/f147 zero size 2026-03-10T06:22:33.716 INFO:tasks.workunit.client.1.vm06.stdout:6/979: chown d6/c17 243165217 1 2026-03-10T06:22:33.720 INFO:tasks.workunit.client.1.vm06.stdout:7/935: rename d19/d3b/d41/d42/d52/d9f to d19/d3b/d41/d12f 0 2026-03-10T06:22:33.720 INFO:tasks.workunit.client.1.vm06.stdout:0/934: link d0/dd/d14/d18/f122 d0/dd/d14/f135 0 2026-03-10T06:22:33.721 INFO:tasks.workunit.client.1.vm06.stdout:8/771: fsync d1/df/d11/f45 0 2026-03-10T06:22:33.722 INFO:tasks.workunit.client.1.vm06.stdout:7/936: write d19/d3b/d41/f54 [2959747,58041] 0 2026-03-10T06:22:33.727 INFO:tasks.workunit.client.1.vm06.stdout:9/899: fdatasync d21/da2/de6/f2f 0 2026-03-10T06:22:33.728 INFO:tasks.workunit.client.1.vm06.stdout:9/900: read - d21/d27/d50/d57/dcd/de4/f115 zero size 2026-03-10T06:22:33.730 INFO:tasks.workunit.client.1.vm06.stdout:3/884: dread d6/d21/f99 [0,4194304] 0 2026-03-10T06:22:33.744 
INFO:tasks.workunit.client.1.vm06.stdout:5/683: dwrite d8/db/d54/d8a/f4d [0,4194304] 0 2026-03-10T06:22:33.765 INFO:tasks.workunit.client.1.vm06.stdout:1/963: dwrite d9/d35/d46/d38/f82 [4194304,4194304] 0 2026-03-10T06:22:33.765 INFO:tasks.workunit.client.1.vm06.stdout:1/964: fsync d9/d35/d46/d38/d63/d83/dc5/dd5/f10e 0 2026-03-10T06:22:33.772 INFO:tasks.workunit.client.1.vm06.stdout:1/965: dwrite d9/d35/d46/d38/d63/feb [0,4194304] 0 2026-03-10T06:22:33.773 INFO:tasks.workunit.client.1.vm06.stdout:1/966: chown d9 356221580 1 2026-03-10T06:22:33.774 INFO:tasks.workunit.client.1.vm06.stdout:1/967: chown d9/d35/d46/d38/d8c/f92 9570339 1 2026-03-10T06:22:33.777 INFO:tasks.workunit.client.1.vm06.stdout:1/968: dread d9/d35/d46/f71 [0,4194304] 0 2026-03-10T06:22:33.816 INFO:tasks.workunit.client.1.vm06.stdout:0/935: unlink d0/dd/d1c/cbf 0 2026-03-10T06:22:33.818 INFO:tasks.workunit.client.1.vm06.stdout:8/772: truncate d1/df/d20/d21/f69 3873280 0 2026-03-10T06:22:33.825 INFO:tasks.workunit.client.1.vm06.stdout:4/922: fsync dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/f6a 0 2026-03-10T06:22:33.831 INFO:tasks.workunit.client.1.vm06.stdout:3/885: symlink d6/dc/d13/d9d/d54/l135 0 2026-03-10T06:22:33.832 INFO:tasks.workunit.client.1.vm06.stdout:1/969: write d9/d35/d46/d38/d63/dd6/de8/f109 [440515,27063] 0 2026-03-10T06:22:33.838 INFO:tasks.workunit.client.1.vm06.stdout:2/749: mknod da/d13/d1c/d1d/d44/dc4/cf0 0 2026-03-10T06:22:33.857 INFO:tasks.workunit.client.1.vm06.stdout:9/901: mkdir d21/d32/d4d/d51/d124 0 2026-03-10T06:22:33.870 INFO:tasks.workunit.client.1.vm06.stdout:3/886: symlink d6/dc/d13/d35/d101/d11b/l136 0 2026-03-10T06:22:33.870 INFO:tasks.workunit.client.1.vm06.stdout:3/887: write d6/dc/d13/d9d/d54/fb8 [3864443,5175] 0 2026-03-10T06:22:33.875 INFO:tasks.workunit.client.1.vm06.stdout:0/936: mknod d0/dd/d14/d18/d85/dcc/d5e/c136 0 2026-03-10T06:22:33.876 INFO:tasks.workunit.client.1.vm06.stdout:7/937: mknod d19/d3b/d41/d42/d52/d83/d9d/da8/d129/c130 0 2026-03-10T06:22:33.882 
INFO:tasks.workunit.client.1.vm06.stdout:8/773: rename d1/df/d11/f59 to d1/d7/dee/ff5 0 2026-03-10T06:22:33.884 INFO:tasks.workunit.client.1.vm06.stdout:7/938: sync 2026-03-10T06:22:33.885 INFO:tasks.workunit.client.1.vm06.stdout:9/902: fsync d21/d32/d4d/dd2/fd5 0 2026-03-10T06:22:33.892 INFO:tasks.workunit.client.1.vm06.stdout:5/684: mkdir d8/db/d7e/dd1 0 2026-03-10T06:22:33.893 INFO:tasks.workunit.client.1.vm06.stdout:7/939: dread d19/d3b/d41/d42/fd4 [4194304,4194304] 0 2026-03-10T06:22:33.893 INFO:tasks.workunit.client.1.vm06.stdout:7/940: write d19/d3b/d41/f65 [95054,112020] 0 2026-03-10T06:22:33.894 INFO:tasks.workunit.client.1.vm06.stdout:7/941: chown d19/d3b/f6b 0 1 2026-03-10T06:22:33.895 INFO:tasks.workunit.client.1.vm06.stdout:7/942: chown d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/de0/de2 9 1 2026-03-10T06:22:33.903 INFO:tasks.workunit.client.1.vm06.stdout:0/937: mkdir d0/dd/d14/d18/d7e/d137 0 2026-03-10T06:22:33.903 INFO:tasks.workunit.client.1.vm06.stdout:0/938: chown d0/da3/dd5/ddc/c133 88 1 2026-03-10T06:22:33.904 INFO:tasks.workunit.client.1.vm06.stdout:2/750: dread da/d13/d1c/d7d/ddf/f78 [0,4194304] 0 2026-03-10T06:22:33.905 INFO:tasks.workunit.client.1.vm06.stdout:6/980: truncate d6/d7/f2a 5825477 0 2026-03-10T06:22:33.912 INFO:tasks.workunit.client.1.vm06.stdout:8/774: creat d1/df/d58/ff6 x:0 0 0 2026-03-10T06:22:33.912 INFO:tasks.workunit.client.1.vm06.stdout:8/775: chown d1/d2c/d5b/lc6 1 1 2026-03-10T06:22:33.925 INFO:tasks.workunit.client.1.vm06.stdout:4/923: dwrite dd/d24/d2d/d2f/d34/d40/fb8 [0,4194304] 0 2026-03-10T06:22:33.926 INFO:tasks.workunit.client.1.vm06.stdout:4/924: read - dd/d24/d2d/d2f/d34/d40/d10d/d8e/fa4 zero size 2026-03-10T06:22:33.939 INFO:tasks.workunit.client.1.vm06.stdout:7/943: dread d19/f30 [0,4194304] 0 2026-03-10T06:22:33.945 INFO:tasks.workunit.client.1.vm06.stdout:8/776: creat d1/d7/dee/ff7 x:0 0 0 2026-03-10T06:22:33.958 INFO:tasks.workunit.client.1.vm06.stdout:3/888: write d6/dc/f94 [3749060,126523] 0 
2026-03-10T06:22:33.959 INFO:tasks.workunit.client.1.vm06.stdout:1/970: getdents d9/d35/d46/d38/d63/d83 0 2026-03-10T06:22:33.960 INFO:tasks.workunit.client.1.vm06.stdout:7/944: creat d19/d3b/d41/da9/dbd/f131 x:0 0 0 2026-03-10T06:22:33.962 INFO:tasks.workunit.client.1.vm06.stdout:0/939: rmdir d0/dd/d14/d1d/d73 39 2026-03-10T06:22:33.965 INFO:tasks.workunit.client.1.vm06.stdout:7/945: dwrite d19/d3b/d41/d42/d52/d83/d9d/fbf [0,4194304] 0 2026-03-10T06:22:33.967 INFO:tasks.workunit.client.1.vm06.stdout:9/903: rename d21/d32/d4d/f64 to d21/d27/d50/d57/db2/d80/d95/d9b/f125 0 2026-03-10T06:22:33.970 INFO:tasks.workunit.client.1.vm06.stdout:8/777: write d1/fb9 [176257,39945] 0 2026-03-10T06:22:33.972 INFO:tasks.workunit.client.1.vm06.stdout:5/685: link d8/db/d54/d8a/d74/f78 d8/db/fd2 0 2026-03-10T06:22:33.974 INFO:tasks.workunit.client.1.vm06.stdout:3/889: creat d6/d8/d7f/da1/dfe/f137 x:0 0 0 2026-03-10T06:22:33.977 INFO:tasks.workunit.client.1.vm06.stdout:1/971: dread d9/f2f [0,4194304] 0 2026-03-10T06:22:34.001 INFO:tasks.workunit.client.1.vm06.stdout:7/946: mkdir d19/d3b/d41/da9/d132 0 2026-03-10T06:22:34.005 INFO:tasks.workunit.client.1.vm06.stdout:9/904: fdatasync f11 0 2026-03-10T06:22:34.018 INFO:tasks.workunit.client.1.vm06.stdout:2/751: rename da/d13/d1c/d1d/d44/d46/lab to da/d13/d1a/d39/d35/lf1 0 2026-03-10T06:22:34.018 INFO:tasks.workunit.client.1.vm06.stdout:8/778: truncate d1/f26 229328 0 2026-03-10T06:22:34.020 INFO:tasks.workunit.client.1.vm06.stdout:5/686: fdatasync d8/db/d54/d8a/d74/f78 0 2026-03-10T06:22:34.020 INFO:tasks.workunit.client.1.vm06.stdout:1/972: rmdir d9/d35 39 2026-03-10T06:22:34.021 INFO:tasks.workunit.client.1.vm06.stdout:6/981: link d6/d7/l2e d6/d79/d95/d137/db9/l154 0 2026-03-10T06:22:34.023 INFO:tasks.workunit.client.1.vm06.stdout:8/779: dread d1/df/d11/f45 [0,4194304] 0 2026-03-10T06:22:34.023 INFO:tasks.workunit.client.1.vm06.stdout:0/940: creat d0/dd/d14/d18/d85/dcc/d88/d47/f138 x:0 0 0 2026-03-10T06:22:34.026 
INFO:tasks.workunit.client.1.vm06.stdout:7/947: symlink d19/d3b/dde/l133 0 2026-03-10T06:22:34.027 INFO:tasks.workunit.client.1.vm06.stdout:7/948: dread - d19/d3b/d41/da9/dbd/f131 zero size 2026-03-10T06:22:34.028 INFO:tasks.workunit.client.1.vm06.stdout:7/949: fdatasync d19/d3b/d41/d42/d62/fbb 0 2026-03-10T06:22:34.028 INFO:tasks.workunit.client.1.vm06.stdout:6/982: dwrite d6/d79/d95/d137/dd0/dc5/f14b [0,4194304] 0 2026-03-10T06:22:34.029 INFO:tasks.workunit.client.1.vm06.stdout:7/950: chown d19/d3b/d5b/f69 7218503 1 2026-03-10T06:22:34.030 INFO:tasks.workunit.client.1.vm06.stdout:6/983: read - d6/dd/d25/d33/d5a/df1/f140 zero size 2026-03-10T06:22:34.033 INFO:tasks.workunit.client.1.vm06.stdout:7/951: dread d19/fb7 [0,4194304] 0 2026-03-10T06:22:34.060 INFO:tasks.workunit.client.1.vm06.stdout:9/905: rmdir d21 39 2026-03-10T06:22:34.072 INFO:tasks.workunit.client.1.vm06.stdout:3/890: dwrite d6/f91 [0,4194304] 0 2026-03-10T06:22:34.073 INFO:tasks.workunit.client.1.vm06.stdout:2/752: unlink da/d13/d1a/dc7/daf/f7c 0 2026-03-10T06:22:34.081 INFO:tasks.workunit.client.1.vm06.stdout:6/984: dread d6/dd/dc2/f12a [0,4194304] 0 2026-03-10T06:22:34.089 INFO:tasks.workunit.client.1.vm06.stdout:4/925: getdents dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb 0 2026-03-10T06:22:34.089 INFO:tasks.workunit.client.1.vm06.stdout:4/926: write dd/d24/d2d/d2f/d34/d40/d10d/f5f [1437985,33441] 0 2026-03-10T06:22:34.090 INFO:tasks.workunit.client.1.vm06.stdout:4/927: write dd/d24/d5d/f9d [1437141,32588] 0 2026-03-10T06:22:34.103 INFO:tasks.workunit.client.1.vm06.stdout:1/973: dwrite d9/d35/d46/d38/d63/d83/dc5/dd5/f10e [0,4194304] 0 2026-03-10T06:22:34.103 INFO:tasks.workunit.client.1.vm06.stdout:1/974: stat d9/d35/f101 0 2026-03-10T06:22:34.105 INFO:tasks.workunit.client.1.vm06.stdout:5/687: read d8/db/d54/d8a/d39/d72/f8b [954948,65687] 0 2026-03-10T06:22:34.105 INFO:tasks.workunit.client.1.vm06.stdout:8/780: mkdir d1/d7/df8 0 2026-03-10T06:22:34.123 
INFO:tasks.workunit.client.1.vm06.stdout:7/952: rmdir d19/d3b/d41/d42/d62/d80/da1/d117 39 2026-03-10T06:22:34.123 INFO:tasks.workunit.client.1.vm06.stdout:7/953: write d19/f20 [5987278,41726] 0 2026-03-10T06:22:34.126 INFO:tasks.workunit.client.1.vm06.stdout:3/891: creat d6/dc/d13/d51/f138 x:0 0 0 2026-03-10T06:22:34.126 INFO:tasks.workunit.client.1.vm06.stdout:2/753: creat da/d13/d1c/d7d/ddf/ff2 x:0 0 0 2026-03-10T06:22:34.128 INFO:tasks.workunit.client.1.vm06.stdout:6/985: readlink d6/d79/d95/d137/dd0/dec/l126 0 2026-03-10T06:22:34.130 INFO:tasks.workunit.client.1.vm06.stdout:6/986: truncate d6/dd/d25/d33/d5a/dae/f104 600962 0 2026-03-10T06:22:34.133 INFO:tasks.workunit.client.1.vm06.stdout:8/781: unlink d1/df/d20/c23 0 2026-03-10T06:22:34.134 INFO:tasks.workunit.client.1.vm06.stdout:8/782: dread - d1/df/d58/db5/fe9 zero size 2026-03-10T06:22:34.139 INFO:tasks.workunit.client.1.vm06.stdout:3/892: dread d6/dc/f10d [0,4194304] 0 2026-03-10T06:22:34.140 INFO:tasks.workunit.client.1.vm06.stdout:3/893: stat d6/dc/d13/d35/d101/d88/dae/f103 0 2026-03-10T06:22:34.144 INFO:tasks.workunit.client.1.vm06.stdout:4/928: getdents dd/d24/d9c/d10a 0 2026-03-10T06:22:34.146 INFO:tasks.workunit.client.1.vm06.stdout:1/975: mknod d9/d62/c11a 0 2026-03-10T06:22:34.147 INFO:tasks.workunit.client.1.vm06.stdout:2/754: dwrite da/f28 [0,4194304] 0 2026-03-10T06:22:34.155 INFO:tasks.workunit.client.1.vm06.stdout:1/976: sync 2026-03-10T06:22:34.155 INFO:tasks.workunit.client.1.vm06.stdout:1/977: chown d9/d62/f90 11 1 2026-03-10T06:22:34.156 INFO:tasks.workunit.client.1.vm06.stdout:0/941: truncate d0/dd/f5b 2010221 0 2026-03-10T06:22:34.157 INFO:tasks.workunit.client.1.vm06.stdout:5/688: write d8/db/d54/d55/fa3 [647809,30205] 0 2026-03-10T06:22:34.158 INFO:tasks.workunit.client.1.vm06.stdout:4/929: dread dd/d33/f3f [0,4194304] 0 2026-03-10T06:22:34.160 INFO:tasks.workunit.client.1.vm06.stdout:4/930: chown dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107/dc6 0 1 2026-03-10T06:22:34.162 
INFO:tasks.workunit.client.1.vm06.stdout:1/978: dread d9/d35/d46/d38/d63/dd6/de8/f109 [0,4194304] 0 2026-03-10T06:22:34.166 INFO:tasks.workunit.client.1.vm06.stdout:9/906: mkdir d21/d32/d4d/d126 0 2026-03-10T06:22:34.168 INFO:tasks.workunit.client.1.vm06.stdout:3/894: dread d6/dc/f7e [0,4194304] 0 2026-03-10T06:22:34.177 INFO:tasks.workunit.client.1.vm06.stdout:9/907: dread d21/d27/d50/d57/db2/d80/d95/d9b/f125 [0,4194304] 0 2026-03-10T06:22:34.178 INFO:tasks.workunit.client.1.vm06.stdout:9/908: write d21/da2/da7/d93/f94 [256733,10927] 0 2026-03-10T06:22:34.185 INFO:tasks.workunit.client.1.vm06.stdout:2/755: write da/d13/d1c/d43/f91 [796642,127681] 0 2026-03-10T06:22:34.186 INFO:tasks.workunit.client.1.vm06.stdout:2/756: write da/d13/d5e/fe3 [896777,55063] 0 2026-03-10T06:22:34.202 INFO:tasks.workunit.client.1.vm06.stdout:5/689: symlink d8/ld3 0 2026-03-10T06:22:34.207 INFO:tasks.workunit.client.1.vm06.stdout:8/783: write d1/f18 [5124870,73759] 0 2026-03-10T06:22:34.211 INFO:tasks.workunit.client.1.vm06.stdout:7/954: dwrite d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/fb9 [0,4194304] 0 2026-03-10T06:22:34.211 INFO:tasks.workunit.client.1.vm06.stdout:7/955: chown d19/d3b/d41/d42/d62/f121 7493 1 2026-03-10T06:22:34.212 INFO:tasks.workunit.client.1.vm06.stdout:7/956: read - d19/d3b/d41/da9/dbd/f131 zero size 2026-03-10T06:22:34.224 INFO:tasks.workunit.client.1.vm06.stdout:3/895: dread d6/dc/d13/d35/d101/dd0/dd1/d90/f9e [0,4194304] 0 2026-03-10T06:22:34.226 INFO:tasks.workunit.client.1.vm06.stdout:3/896: dwrite d6/d1a/f127 [0,4194304] 0 2026-03-10T06:22:34.228 INFO:tasks.workunit.client.1.vm06.stdout:9/909: creat d21/d27/d56/dc0/f127 x:0 0 0 2026-03-10T06:22:34.229 INFO:tasks.workunit.client.1.vm06.stdout:6/987: getdents d6/dd/d25/d33 0 2026-03-10T06:22:34.230 INFO:tasks.workunit.client.1.vm06.stdout:0/942: creat d0/dd/d14/d1d/d73/f139 x:0 0 0 2026-03-10T06:22:34.235 INFO:tasks.workunit.client.1.vm06.stdout:9/910: dwrite d21/da2/ff8 [0,4194304] 0 2026-03-10T06:22:34.235 
INFO:tasks.workunit.client.1.vm06.stdout:4/931: creat dd/d72/dcd/f10e x:0 0 0 2026-03-10T06:22:34.236 INFO:tasks.workunit.client.1.vm06.stdout:9/911: write d21/d32/f110 [575670,130762] 0 2026-03-10T06:22:34.249 INFO:tasks.workunit.client.1.vm06.stdout:7/957: symlink d19/d3b/dde/l134 0 2026-03-10T06:22:34.249 INFO:tasks.workunit.client.1.vm06.stdout:1/979: mknod d9/d1b/de7/c11b 0 2026-03-10T06:22:34.255 INFO:tasks.workunit.client.1.vm06.stdout:3/897: unlink d6/dc/d13/d35/d101/d88/l119 0 2026-03-10T06:22:34.255 INFO:tasks.workunit.client.1.vm06.stdout:6/988: creat d6/dd/d35/dff/f155 x:0 0 0 2026-03-10T06:22:34.257 INFO:tasks.workunit.client.1.vm06.stdout:0/943: write d0/d3c/dc1/d3d/d50/d91/da7/f11d [3812324,2667] 0 2026-03-10T06:22:34.258 INFO:tasks.workunit.client.1.vm06.stdout:0/944: chown d0/d3c/dc1/d3d/d50/cb0 143 1 2026-03-10T06:22:34.261 INFO:tasks.workunit.client.1.vm06.stdout:3/898: dwrite d6/d8/fb [0,4194304] 0 2026-03-10T06:22:34.262 INFO:tasks.workunit.client.1.vm06.stdout:3/899: chown d6/dc/f1d 1164776967 1 2026-03-10T06:22:34.262 INFO:tasks.workunit.client.1.vm06.stdout:3/900: fsync d6/dc/d13/fca 0 2026-03-10T06:22:34.268 INFO:tasks.workunit.client.1.vm06.stdout:1/980: rename d9/d35/d89/ld2 to d9/d35/d46/d38/dc6/dd4/l11c 0 2026-03-10T06:22:34.271 INFO:tasks.workunit.client.1.vm06.stdout:6/989: creat d6/df/d40/d99/f156 x:0 0 0 2026-03-10T06:22:34.271 INFO:tasks.workunit.client.1.vm06.stdout:0/945: mkdir d0/dd/d14/d18/d85/dcc/d88/d98/d13a 0 2026-03-10T06:22:34.271 INFO:tasks.workunit.client.1.vm06.stdout:3/901: creat d6/dc/d13/d35/d101/d11b/f139 x:0 0 0 2026-03-10T06:22:34.271 INFO:tasks.workunit.client.1.vm06.stdout:3/902: stat d6/dc/d41/d6d/ff4 0 2026-03-10T06:22:34.271 INFO:tasks.workunit.client.1.vm06.stdout:4/932: getdents dd/d24/d2d/d2f/d34/d40/df6 0 2026-03-10T06:22:34.272 INFO:tasks.workunit.client.1.vm06.stdout:7/958: creat d19/d3b/d41/d42/d62/d80/da1/de3/d106/f135 x:0 0 0 2026-03-10T06:22:34.278 INFO:tasks.workunit.client.1.vm06.stdout:0/946: 
creat d0/d3c/dc1/d3d/d50/f13b x:0 0 0 2026-03-10T06:22:34.283 INFO:tasks.workunit.client.1.vm06.stdout:4/933: creat dd/d33/d47/d97/db6/dbb/de2/f10f x:0 0 0 2026-03-10T06:22:34.283 INFO:tasks.workunit.client.1.vm06.stdout:7/959: unlink d19/d3b/d41/d42/d52/d83/d9d/fbf 0 2026-03-10T06:22:34.284 INFO:tasks.workunit.client.1.vm06.stdout:6/990: rename d6/l9 to d6/d79/d95/d11e/l157 0 2026-03-10T06:22:34.284 INFO:tasks.workunit.client.1.vm06.stdout:0/947: truncate d0/dd/d14/d18/d85/dcc/fa1 816813 0 2026-03-10T06:22:34.286 INFO:tasks.workunit.client.1.vm06.stdout:8/784: getdents d1/df/d11/da1 0 2026-03-10T06:22:34.286 INFO:tasks.workunit.client.1.vm06.stdout:3/903: symlink d6/dc/l13a 0 2026-03-10T06:22:34.287 INFO:tasks.workunit.client.1.vm06.stdout:9/912: getdents d21/ddb 0 2026-03-10T06:22:34.287 INFO:tasks.workunit.client.1.vm06.stdout:4/934: fsync dd/d33/d36/f8d 0 2026-03-10T06:22:34.291 INFO:tasks.workunit.client.1.vm06.stdout:7/960: creat d19/d3b/d41/d42/d62/d80/d82/f136 x:0 0 0 2026-03-10T06:22:34.292 INFO:tasks.workunit.client.1.vm06.stdout:7/961: stat d19/d3b/d5b/cb8 0 2026-03-10T06:22:34.292 INFO:tasks.workunit.client.1.vm06.stdout:1/981: sync 2026-03-10T06:22:34.292 INFO:tasks.workunit.client.1.vm06.stdout:6/991: sync 2026-03-10T06:22:34.296 INFO:tasks.workunit.client.1.vm06.stdout:0/948: dread d0/dd/d14/d18/d85/dcc/fb6 [0,4194304] 0 2026-03-10T06:22:34.298 INFO:tasks.workunit.client.1.vm06.stdout:8/785: rmdir d1/d7 39 2026-03-10T06:22:34.299 INFO:tasks.workunit.client.1.vm06.stdout:0/949: fsync d0/dd/d14/d18/f90 0 2026-03-10T06:22:34.300 INFO:tasks.workunit.client.1.vm06.stdout:3/904: read - d6/dc/d13/d35/d101/dd0/dd1/d90/ff8 zero size 2026-03-10T06:22:34.301 INFO:tasks.workunit.client.1.vm06.stdout:9/913: unlink d21/d46/ded/c10b 0 2026-03-10T06:22:34.303 INFO:tasks.workunit.client.1.vm06.stdout:7/962: readlink d19/d3b/d41/d42/d52/l84 0 2026-03-10T06:22:34.303 INFO:tasks.workunit.client.1.vm06.stdout:7/963: stat d19/db0/d116/l12b 0 2026-03-10T06:22:34.304 
INFO:tasks.workunit.client.1.vm06.stdout:6/992: rename d6/df/d9f/le6 to d6/d79/d95/d11e/l158 0 2026-03-10T06:22:34.304 INFO:tasks.workunit.client.1.vm06.stdout:8/786: unlink d1/d2c/c72 0 2026-03-10T06:22:34.306 INFO:tasks.workunit.client.1.vm06.stdout:0/950: creat d0/da3/dd5/f13c x:0 0 0 2026-03-10T06:22:34.310 INFO:tasks.workunit.client.1.vm06.stdout:7/964: chown d19/d3b/d41/d4c/f55 3 1 2026-03-10T06:22:34.310 INFO:tasks.workunit.client.1.vm06.stdout:1/982: dread d9/d35/d46/d38/d63/d83/dc5/dd5/fe1 [0,4194304] 0 2026-03-10T06:22:34.312 INFO:tasks.workunit.client.1.vm06.stdout:7/965: fsync d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d104/f113 0 2026-03-10T06:22:34.313 INFO:tasks.workunit.client.1.vm06.stdout:1/983: sync 2026-03-10T06:22:34.323 INFO:tasks.workunit.client.1.vm06.stdout:8/787: creat d1/d2c/d99/ddc/ff9 x:0 0 0 2026-03-10T06:22:34.323 INFO:tasks.workunit.client.1.vm06.stdout:7/966: rmdir d19/d3b/d41/d42/d52 39 2026-03-10T06:22:34.325 INFO:tasks.workunit.client.1.vm06.stdout:9/914: dread d21/da2/de6/fc1 [4194304,4194304] 0 2026-03-10T06:22:34.336 INFO:tasks.workunit.client.1.vm06.stdout:3/905: dread d6/dc/d13/d35/d101/dd0/dd1/d90/fe9 [0,4194304] 0 2026-03-10T06:22:34.336 INFO:tasks.workunit.client.1.vm06.stdout:3/906: fdatasync d6/f63 0 2026-03-10T06:22:34.338 INFO:tasks.workunit.client.1.vm06.stdout:1/984: chown d9/d62/c116 28884 1 2026-03-10T06:22:34.340 INFO:tasks.workunit.client.1.vm06.stdout:2/757: truncate da/d13/d1c/d43/f7a 2135200 0 2026-03-10T06:22:34.340 INFO:tasks.workunit.client.1.vm06.stdout:2/758: write da/d13/d5e/fbc [4824062,125339] 0 2026-03-10T06:22:34.343 INFO:tasks.workunit.client.1.vm06.stdout:3/907: dwrite d6/d1a/d5b/dbd/f108 [0,4194304] 0 2026-03-10T06:22:34.348 INFO:tasks.workunit.client.1.vm06.stdout:2/759: dread da/d13/d1c/d7d/ddf/d61/f89 [0,4194304] 0 2026-03-10T06:22:34.349 INFO:tasks.workunit.client.1.vm06.stdout:5/690: dwrite d8/f9c [0,4194304] 0 2026-03-10T06:22:34.356 INFO:tasks.workunit.client.1.vm06.stdout:9/915: dread - 
d21/d27/d50/d57/db2/ff5 zero size 2026-03-10T06:22:34.365 INFO:tasks.workunit.client.1.vm06.stdout:5/691: sync 2026-03-10T06:22:34.370 INFO:tasks.workunit.client.1.vm06.stdout:8/788: mknod d1/d7/dee/cfa 0 2026-03-10T06:22:34.371 INFO:tasks.workunit.client.1.vm06.stdout:8/789: write d1/df/d58/ff6 [423432,36360] 0 2026-03-10T06:22:34.377 INFO:tasks.workunit.client.1.vm06.stdout:7/967: mkdir d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/de0/d137 0 2026-03-10T06:22:34.385 INFO:tasks.workunit.client.1.vm06.stdout:5/692: mknod d8/db/d54/d8a/d74/d90/cd4 0 2026-03-10T06:22:34.385 INFO:tasks.workunit.client.1.vm06.stdout:4/935: dwrite dd/d33/f70 [0,4194304] 0 2026-03-10T06:22:34.387 INFO:tasks.workunit.client.1.vm06.stdout:4/936: dread - dd/d33/de9/ffe zero size 2026-03-10T06:22:34.398 INFO:tasks.workunit.client.1.vm06.stdout:8/790: mkdir d1/d7/dfb 0 2026-03-10T06:22:34.401 INFO:tasks.workunit.client.1.vm06.stdout:0/951: write d0/d3c/dc1/d3d/f82 [4904126,111060] 0 2026-03-10T06:22:34.407 INFO:tasks.workunit.client.1.vm06.stdout:6/993: write d6/fa2 [374841,31051] 0 2026-03-10T06:22:34.408 INFO:tasks.workunit.client.1.vm06.stdout:7/968: read d19/d3b/d41/d42/d62/f7c [1522801,48675] 0 2026-03-10T06:22:34.408 INFO:tasks.workunit.client.1.vm06.stdout:7/969: write d19/d3b/d41/da9/dbd/dd2/f12c [855507,113005] 0 2026-03-10T06:22:34.409 INFO:tasks.workunit.client.1.vm06.stdout:9/916: mknod d21/d32/d4d/d51/d123/c128 0 2026-03-10T06:22:34.410 INFO:tasks.workunit.client.1.vm06.stdout:9/917: chown d21/d27/d50/d57/db2/d80/d95/fc4 23972902 1 2026-03-10T06:22:34.410 INFO:tasks.workunit.client.1.vm06.stdout:7/970: fsync d19/d3b/d41/d42/d62/d80/da1/de3/d106/f135 0 2026-03-10T06:22:34.413 INFO:tasks.workunit.client.1.vm06.stdout:7/971: dread d19/d3b/d41/d42/d62/f7c [0,4194304] 0 2026-03-10T06:22:34.417 INFO:tasks.workunit.client.1.vm06.stdout:1/985: creat d9/f11d x:0 0 0 2026-03-10T06:22:34.417 INFO:tasks.workunit.client.1.vm06.stdout:7/972: sync 2026-03-10T06:22:34.420 
INFO:tasks.workunit.client.1.vm06.stdout:7/973: dwrite f10 [0,4194304] 0 2026-03-10T06:22:34.430 INFO:tasks.workunit.client.1.vm06.stdout:5/693: mknod d8/db/d54/d8a/d39/d72/cd5 0 2026-03-10T06:22:34.434 INFO:tasks.workunit.client.1.vm06.stdout:0/952: symlink d0/d3c/dc1/d3d/l13d 0 2026-03-10T06:22:34.434 INFO:tasks.workunit.client.1.vm06.stdout:3/908: write d6/dc/f10d [2500720,70076] 0 2026-03-10T06:22:34.436 INFO:tasks.workunit.client.1.vm06.stdout:6/994: unlink d6/dd/l56 0 2026-03-10T06:22:34.442 INFO:tasks.workunit.client.1.vm06.stdout:0/953: sync 2026-03-10T06:22:34.444 INFO:tasks.workunit.client.1.vm06.stdout:1/986: creat d9/d35/d46/d38/d63/d83/dc5/dd5/f11e x:0 0 0 2026-03-10T06:22:34.444 INFO:tasks.workunit.client.1.vm06.stdout:0/954: dread d0/da3/dd5/fd7 [0,4194304] 0 2026-03-10T06:22:34.444 INFO:tasks.workunit.client.1.vm06.stdout:0/955: stat d0/da3/dd5/ddc 0 2026-03-10T06:22:34.446 INFO:tasks.workunit.client.1.vm06.stdout:7/974: rmdir d19/d3b/d41/d42/d52/d83/d9d/da8/df4 39 2026-03-10T06:22:34.447 INFO:tasks.workunit.client.1.vm06.stdout:7/975: readlink d19/d3b/d41/d42/d62/l6f 0 2026-03-10T06:22:34.448 INFO:tasks.workunit.client.1.vm06.stdout:5/694: unlink d8/db/d54/d8a/d74/fb3 0 2026-03-10T06:22:34.450 INFO:tasks.workunit.client.1.vm06.stdout:4/937: dread dd/d33/d47/fc7 [0,4194304] 0 2026-03-10T06:22:34.450 INFO:tasks.workunit.client.1.vm06.stdout:6/995: read d6/dd/d25/d2c/fc3 [45130,55787] 0 2026-03-10T06:22:34.453 INFO:tasks.workunit.client.1.vm06.stdout:6/996: dwrite d6/d79/fe2 [0,4194304] 0 2026-03-10T06:22:34.457 INFO:tasks.workunit.client.1.vm06.stdout:2/760: dwrite da/d13/d1c/f7e [0,4194304] 0 2026-03-10T06:22:34.459 INFO:tasks.workunit.client.1.vm06.stdout:1/987: unlink d9/d35/d46/d38/d63/d83/la4 0 2026-03-10T06:22:34.459 INFO:tasks.workunit.client.1.vm06.stdout:0/956: write d0/dd/f4c [1082795,65831] 0 2026-03-10T06:22:34.460 INFO:tasks.workunit.client.1.vm06.stdout:0/957: chown d0/dd/d14/d18 939 1 2026-03-10T06:22:34.461 
INFO:tasks.workunit.client.1.vm06.stdout:8/791: dread d1/f89 [0,4194304] 0 2026-03-10T06:22:34.461 INFO:tasks.workunit.client.1.vm06.stdout:8/792: stat d1/df/d58/caa 0 2026-03-10T06:22:34.465 INFO:tasks.workunit.client.1.vm06.stdout:3/909: dread d6/dc/d13/d35/d101/dd0/dd1/f89 [0,4194304] 0 2026-03-10T06:22:34.470 INFO:tasks.workunit.client.1.vm06.stdout:5/695: write d8/db/d54/d8a/d74/f37 [5456034,72635] 0 2026-03-10T06:22:34.474 INFO:tasks.workunit.client.1.vm06.stdout:9/918: dwrite d21/d32/fd8 [0,4194304] 0 2026-03-10T06:22:34.479 INFO:tasks.workunit.client.1.vm06.stdout:5/696: dwrite d8/d9/f4b [4194304,4194304] 0 2026-03-10T06:22:34.491 INFO:tasks.workunit.client.1.vm06.stdout:9/919: dread d21/d27/d50/d57/fb7 [0,4194304] 0 2026-03-10T06:22:34.493 INFO:tasks.workunit.client.1.vm06.stdout:9/920: dwrite f14 [0,4194304] 0 2026-03-10T06:22:34.497 INFO:tasks.workunit.client.1.vm06.stdout:9/921: stat d21/d27/d50/l8a 0 2026-03-10T06:22:34.500 INFO:tasks.workunit.client.1.vm06.stdout:2/761: mkdir da/d13/d1c/d7d/ddf/d61/df3 0 2026-03-10T06:22:34.506 INFO:tasks.workunit.client.1.vm06.stdout:7/976: symlink d19/d3b/dde/d126/l138 0 2026-03-10T06:22:34.506 INFO:tasks.workunit.client.1.vm06.stdout:7/977: dwrite d19/d3b/d41/d42/d52/d83/f8f [0,4194304] 0 2026-03-10T06:22:34.507 INFO:tasks.workunit.client.1.vm06.stdout:3/910: creat d6/dc/d13/d35/d101/dd0/f13b x:0 0 0 2026-03-10T06:22:34.507 INFO:tasks.workunit.client.1.vm06.stdout:7/978: stat d19/f25 0 2026-03-10T06:22:34.518 INFO:tasks.workunit.client.1.vm06.stdout:1/988: mkdir d9/d62/d11f 0 2026-03-10T06:22:34.520 INFO:tasks.workunit.client.1.vm06.stdout:0/958: mkdir d0/dd/d14/d18/d7e/d12f/d13e 0 2026-03-10T06:22:34.521 INFO:tasks.workunit.client.1.vm06.stdout:0/959: fsync d0/dd/d14/d18/f90 0 2026-03-10T06:22:34.521 INFO:tasks.workunit.client.1.vm06.stdout:1/989: dread - d9/d35/d46/d38/d63/d83/dc5/d108/f112 zero size 2026-03-10T06:22:34.521 INFO:tasks.workunit.client.1.vm06.stdout:3/911: creat d6/dc/d13/d35/f13c x:0 0 0 
2026-03-10T06:22:34.521 INFO:tasks.workunit.client.1.vm06.stdout:7/979: mkdir d19/d3b/d41/da9/da5/d139 0 2026-03-10T06:22:34.524 INFO:tasks.workunit.client.1.vm06.stdout:6/997: getdents d6/df/d9f 0 2026-03-10T06:22:34.525 INFO:tasks.workunit.client.1.vm06.stdout:6/998: dread - d6/d7/d37/d43/f133 zero size 2026-03-10T06:22:34.525 INFO:tasks.workunit.client.1.vm06.stdout:5/697: rename d8/ld3 to d8/db/d54/d8a/d74/ld6 0 2026-03-10T06:22:34.526 INFO:tasks.workunit.client.1.vm06.stdout:5/698: chown d8/db/d54/d8a/d39 174065 1 2026-03-10T06:22:34.528 INFO:tasks.workunit.client.1.vm06.stdout:0/960: mkdir d0/d3c/dc1/d7d/d13f 0 2026-03-10T06:22:34.528 INFO:tasks.workunit.client.1.vm06.stdout:5/699: read d8/db/d54/d8a/f4d [1450867,68954] 0 2026-03-10T06:22:34.530 INFO:tasks.workunit.client.1.vm06.stdout:2/762: link da/d13/d1a/dc7/daf/d56/db9/fd1 da/d13/d1a/dc7/daf/ff4 0 2026-03-10T06:22:34.531 INFO:tasks.workunit.client.1.vm06.stdout:5/700: read d8/db/d54/d8a/d39/fae [1125646,74367] 0 2026-03-10T06:22:34.533 INFO:tasks.workunit.client.1.vm06.stdout:9/922: rename d21/d27/d50/d57 to d21/da2/da7/d93/dda/df4/d106/d129 0 2026-03-10T06:22:34.534 INFO:tasks.workunit.client.1.vm06.stdout:9/923: stat d21/d46/fb9 0 2026-03-10T06:22:34.536 INFO:tasks.workunit.client.1.vm06.stdout:6/999: dwrite d6/dd/d25/d33/d5a/df1/f140 [0,4194304] 0 2026-03-10T06:22:34.536 INFO:tasks.workunit.client.1.vm06.stdout:0/961: fsync d0/dd/d14/d18/d85/dcc/d88/d35/d74/fb4 0 2026-03-10T06:22:34.537 INFO:tasks.workunit.client.1.vm06.stdout:0/962: truncate d0/dd/d14/d18/d85/fe9 4601323 0 2026-03-10T06:22:34.540 INFO:tasks.workunit.client.1.vm06.stdout:1/990: dread d9/d35/d46/d38/fab [0,4194304] 0 2026-03-10T06:22:34.543 INFO:tasks.workunit.client.1.vm06.stdout:9/924: write d21/d46/fb9 [496683,53438] 0 2026-03-10T06:22:34.543 INFO:tasks.workunit.client.1.vm06.stdout:2/763: write da/d13/d1c/fef [450855,21491] 0 2026-03-10T06:22:34.547 INFO:tasks.workunit.client.1.vm06.stdout:9/925: truncate d21/d27/d3a/fbb 4437469 0 
2026-03-10T06:22:34.549 INFO:tasks.workunit.client.1.vm06.stdout:5/701: dwrite d8/db/d54/d8a/f53 [0,4194304] 0 2026-03-10T06:22:34.552 INFO:tasks.workunit.client.1.vm06.stdout:7/980: dread d19/d3b/d41/fb2 [0,4194304] 0 2026-03-10T06:22:34.554 INFO:tasks.workunit.client.1.vm06.stdout:9/926: fsync d21/da2/da7/d93/dda/feb 0 2026-03-10T06:22:34.554 INFO:tasks.workunit.client.1.vm06.stdout:0/963: creat d0/dd/d14/d18/d85/dcc/d88/d47/dd1/d129/f140 x:0 0 0 2026-03-10T06:22:34.554 INFO:tasks.workunit.client.1.vm06.stdout:0/964: stat d0/dd/d14/d18/d85/dcc/d99/l12b 0 2026-03-10T06:22:34.559 INFO:tasks.workunit.client.1.vm06.stdout:5/702: mkdir d8/db/d54/d67/dd7 0 2026-03-10T06:22:34.570 INFO:tasks.workunit.client.1.vm06.stdout:5/703: dread - d8/db/d54/d55/d80/fd0 zero size 2026-03-10T06:22:34.570 INFO:tasks.workunit.client.1.vm06.stdout:0/965: dwrite d0/f5 [0,4194304] 0 2026-03-10T06:22:34.570 INFO:tasks.workunit.client.1.vm06.stdout:9/927: rename l3 to d21/da2/da7/d93/dda/df4/d106/d129/dcd/de4/dee/l12a 0 2026-03-10T06:22:34.570 INFO:tasks.workunit.client.1.vm06.stdout:9/928: read d21/da2/ff8 [2179745,60916] 0 2026-03-10T06:22:34.574 INFO:tasks.workunit.client.1.vm06.stdout:5/704: mknod d8/db/d54/d67/d46/d6e/da2/cd8 0 2026-03-10T06:22:34.576 INFO:tasks.workunit.client.1.vm06.stdout:0/966: dread - d0/d3c/dc1/d3d/d50/d91/da7/fdb zero size 2026-03-10T06:22:34.577 INFO:tasks.workunit.client.1.vm06.stdout:0/967: dread - d0/d3c/f123 zero size 2026-03-10T06:22:34.580 INFO:tasks.workunit.client.1.vm06.stdout:2/764: link da/d13/d1c/d1d/cae da/d13/cf5 0 2026-03-10T06:22:34.582 INFO:tasks.workunit.client.1.vm06.stdout:5/705: dwrite d8/db/d54/d67/d46/fb9 [0,4194304] 0 2026-03-10T06:22:34.583 INFO:tasks.workunit.client.1.vm06.stdout:0/968: mknod d0/d3c/dc1/c141 0 2026-03-10T06:22:34.586 INFO:tasks.workunit.client.1.vm06.stdout:0/969: symlink d0/dd/d14/d18/d85/dcc/d88/d9e/l142 0 2026-03-10T06:22:34.589 INFO:tasks.workunit.client.1.vm06.stdout:2/765: creat da/d13/d1c/d7d/ddf/d61/d68/de0/ff6 
x:0 0 0 2026-03-10T06:22:34.589 INFO:tasks.workunit.client.1.vm06.stdout:2/766: fdatasync da/d13/d1a/dc7/daf/d56/db9/fdc 0 2026-03-10T06:22:34.590 INFO:tasks.workunit.client.1.vm06.stdout:0/970: fdatasync d0/dd/d14/d18/d66/fcb 0 2026-03-10T06:22:34.591 INFO:tasks.workunit.client.1.vm06.stdout:2/767: mkdir da/d13/d5e/df7 0 2026-03-10T06:22:34.592 INFO:tasks.workunit.client.1.vm06.stdout:2/768: fsync da/d13/d5e/fbc 0 2026-03-10T06:22:34.592 INFO:tasks.workunit.client.1.vm06.stdout:9/929: dread f9 [4194304,4194304] 0 2026-03-10T06:22:34.593 INFO:tasks.workunit.client.1.vm06.stdout:1/991: rmdir d9/d35/d46/d38/d63/d83/dc5/dd5 39 2026-03-10T06:22:34.595 INFO:tasks.workunit.client.1.vm06.stdout:1/992: write d9/d35/f7e [1006386,127661] 0 2026-03-10T06:22:34.595 INFO:tasks.workunit.client.1.vm06.stdout:2/769: truncate da/d13/d1a/dc7/daf/d56/dd4/fe1 364169 0 2026-03-10T06:22:34.602 INFO:tasks.workunit.client.1.vm06.stdout:9/930: mknod d21/da2/da7/d93/dda/df4/d106/d129/dcd/de4/dee/c12b 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/993: symlink d9/d35/d46/l120 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:2/770: creat da/d13/d5e/ff8 x:0 0 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/994: mknod d9/d35/d46/d38/d63/dd6/de8/c121 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/995: dread - d9/d35/d46/d38/dc6/dd4/ffc zero size 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/996: rename d9/dd3/cdf to d9/dd3/dbf/c122 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:2/771: symlink da/d13/d1a/dc7/daf/d56/db7/dde/lf9 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:2/772: mkdir da/d13/d1a/dc7/d86/dfa 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/997: mknod d9/d62/dc7/c123 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:2/773: mknod da/da8/cfb 0 2026-03-10T06:22:34.614 
INFO:tasks.workunit.client.1.vm06.stdout:1/998: rename d9/d35/faf to d9/d35/d46/d38/f124 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:1/999: write d9/d1b/d20/f30 [67520,14975] 0 2026-03-10T06:22:34.614 INFO:tasks.workunit.client.1.vm06.stdout:2/774: link da/d13/d1a/dc7/daf/d56/dd4/fe1 da/d13/d1c/d1d/d44/d46/ffc 0 2026-03-10T06:22:34.615 INFO:tasks.workunit.client.1.vm06.stdout:2/775: symlink da/d13/d1a/dc7/daf/lfd 0 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: Upgrade: Updating node-exporter.vm04 (1/2) 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: Deploying daemon node-exporter.vm04 on vm04 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: Standby manager daemon vm06.wwotdr restarted 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:22:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:34 vm06.local ceph-mon[58974]: from='mgr.? 
192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:22:34.622 INFO:tasks.workunit.client.1.vm06.stdout:0/971: sync 2026-03-10T06:22:34.622 INFO:tasks.workunit.client.1.vm06.stdout:9/931: sync 2026-03-10T06:22:34.624 INFO:tasks.workunit.client.1.vm06.stdout:0/972: chown d0/d3c/dc1/d3d/c83 10 1 2026-03-10T06:22:34.626 INFO:tasks.workunit.client.1.vm06.stdout:2/776: dwrite da/d13/d1c/d1d/d44/d46/fd7 [0,4194304] 0 2026-03-10T06:22:34.628 INFO:tasks.workunit.client.1.vm06.stdout:9/932: dwrite d21/da2/da7/d93/dda/df4/f100 [0,4194304] 0 2026-03-10T06:22:34.629 INFO:tasks.workunit.client.1.vm06.stdout:0/973: creat d0/d3c/f143 x:0 0 0 2026-03-10T06:22:34.647 INFO:tasks.workunit.client.1.vm06.stdout:3/912: rmdir d6/dc/d13/d35 39 2026-03-10T06:22:34.652 INFO:tasks.workunit.client.1.vm06.stdout:5/706: write d8/f49 [4240807,28797] 0 2026-03-10T06:22:34.657 INFO:tasks.workunit.client.1.vm06.stdout:8/793: dwrite d1/d2c/f67 [0,4194304] 0 2026-03-10T06:22:34.660 INFO:tasks.workunit.client.1.vm06.stdout:4/938: dwrite f2 [0,4194304] 0 2026-03-10T06:22:34.665 INFO:tasks.workunit.client.1.vm06.stdout:0/974: creat d0/dd/d14/d18/d85/dcc/d88/d47/f144 x:0 0 0 2026-03-10T06:22:34.667 INFO:tasks.workunit.client.1.vm06.stdout:7/981: dwrite d19/d3b/d41/da9/dbd/dd2/fe8 [0,4194304] 0 2026-03-10T06:22:34.667 INFO:tasks.workunit.client.1.vm06.stdout:5/707: chown d8/db/d54/d8a/d39/ccc 0 1 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: Upgrade: Updating node-exporter.vm04 (1/2) 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: Deploying daemon node-exporter.vm04 on vm04 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: Standby manager daemon vm06.wwotdr restarted 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:22:34 vm04.local ceph-mon[51058]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:22:34.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:34 vm04.local ceph-mon[51058]: from='mgr.? 192.168.123.106:0/2530202142' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:22:34.678 INFO:tasks.workunit.client.1.vm06.stdout:2/777: dread da/d13/d1a/dc7/daf/d56/f58 [0,4194304] 0 2026-03-10T06:22:34.679 INFO:tasks.workunit.client.1.vm06.stdout:8/794: unlink d1/df/d20/d21/l5f 0 2026-03-10T06:22:34.682 INFO:tasks.workunit.client.1.vm06.stdout:3/913: sync 2026-03-10T06:22:34.687 INFO:tasks.workunit.client.1.vm06.stdout:7/982: symlink d19/d3b/dde/l13a 0 2026-03-10T06:22:34.692 INFO:tasks.workunit.client.1.vm06.stdout:7/983: dwrite d19/d3b/d41/f54 [0,4194304] 0 2026-03-10T06:22:34.699 INFO:tasks.workunit.client.1.vm06.stdout:7/984: dwrite d19/d3b/d41/d42/d52/d83/d9d/da8/f10a [0,4194304] 0 2026-03-10T06:22:34.702 INFO:tasks.workunit.client.1.vm06.stdout:0/975: creat d0/d3c/dc1/d3d/d50/f145 x:0 0 0 2026-03-10T06:22:34.703 INFO:tasks.workunit.client.1.vm06.stdout:2/778: rename da/d13/d1a/dc7/daf/d56/db9/fb6 to da/d13/d1a/dc7/dc5/ffe 0 
2026-03-10T06:22:34.703 INFO:tasks.workunit.client.1.vm06.stdout:0/976: write d0/dd/f24 [476343,107099] 0 2026-03-10T06:22:34.704 INFO:tasks.workunit.client.1.vm06.stdout:8/795: fdatasync d1/df/fc7 0 2026-03-10T06:22:34.705 INFO:tasks.workunit.client.1.vm06.stdout:0/977: write d0/dd/d14/d18/d85/dcc/f12e [63910,127194] 0 2026-03-10T06:22:34.706 INFO:tasks.workunit.client.1.vm06.stdout:3/914: mkdir d6/dc/de5/d13d 0 2026-03-10T06:22:34.707 INFO:tasks.workunit.client.1.vm06.stdout:9/933: creat d21/d32/f12c x:0 0 0 2026-03-10T06:22:34.709 INFO:tasks.workunit.client.1.vm06.stdout:5/708: unlink d8/db/d54/d8a/d74/l92 0 2026-03-10T06:22:34.713 INFO:tasks.workunit.client.1.vm06.stdout:7/985: dwrite d19/d3b/dde/fdc [4194304,4194304] 0 2026-03-10T06:22:34.716 INFO:tasks.workunit.client.1.vm06.stdout:0/978: dwrite d0/dd/d14/d18/d7e/f126 [0,4194304] 0 2026-03-10T06:22:34.725 INFO:tasks.workunit.client.1.vm06.stdout:7/986: dread - d19/d3b/d41/d42/d62/d80/d82/f12e zero size 2026-03-10T06:22:34.727 INFO:tasks.workunit.client.1.vm06.stdout:9/934: dread d21/da2/da7/d93/dda/df4/f100 [0,4194304] 0 2026-03-10T06:22:34.730 INFO:tasks.workunit.client.1.vm06.stdout:2/779: creat da/d13/d1a/dc7/daf/d56/db9/d9b/fff x:0 0 0 2026-03-10T06:22:34.730 INFO:tasks.workunit.client.1.vm06.stdout:3/915: symlink d6/l13e 0 2026-03-10T06:22:34.732 INFO:tasks.workunit.client.1.vm06.stdout:5/709: mknod d8/cd9 0 2026-03-10T06:22:34.770 INFO:tasks.workunit.client.1.vm06.stdout:9/935: rename d21/d27/d3a/l85 to d21/d27/d3a/l12d 0 2026-03-10T06:22:34.770 INFO:tasks.workunit.client.1.vm06.stdout:8/796: getdents d1/d7/dfb 0 2026-03-10T06:22:34.771 INFO:tasks.workunit.client.1.vm06.stdout:5/710: unlink d8/db/d54/d8a/d74/d90/ccf 0 2026-03-10T06:22:34.771 INFO:tasks.workunit.client.1.vm06.stdout:3/916: read - d6/dc/d13/d35/d101/d88/dae/dec/d117/f131 zero size 2026-03-10T06:22:34.772 INFO:tasks.workunit.client.1.vm06.stdout:3/917: readlink d6/d21/l10c 0 2026-03-10T06:22:34.772 
INFO:tasks.workunit.client.1.vm06.stdout:7/987: mkdir d19/d3b/d41/d42/d52/d83/d9d/da8/d13b 0 2026-03-10T06:22:34.773 INFO:tasks.workunit.client.1.vm06.stdout:3/918: unlink d6/d8/l14 0 2026-03-10T06:22:34.775 INFO:tasks.workunit.client.1.vm06.stdout:2/780: rmdir da/d13/d1c/d7d/ddf/d61/df3 0 2026-03-10T06:22:34.775 INFO:tasks.workunit.client.1.vm06.stdout:0/979: getdents d0/dd/d14/d18/d85/dcc/d99/dde 0 2026-03-10T06:22:34.776 INFO:tasks.workunit.client.1.vm06.stdout:5/711: rename d8/db/d54/d67/d46/dc4 to d8/db/d54/d67/d46/d68/dc1/dda 0 2026-03-10T06:22:34.776 INFO:tasks.workunit.client.1.vm06.stdout:2/781: fdatasync da/d13/d1c/d7d/fe8 0 2026-03-10T06:22:34.777 INFO:tasks.workunit.client.1.vm06.stdout:7/988: unlink d19/d3b/d41/da9/dbd/dd2/c10f 0 2026-03-10T06:22:34.777 INFO:tasks.workunit.client.1.vm06.stdout:9/936: creat d21/d32/d4d/d51/f12e x:0 0 0 2026-03-10T06:22:34.778 INFO:tasks.workunit.client.1.vm06.stdout:3/919: fsync d6/dc/f69 0 2026-03-10T06:22:34.779 INFO:tasks.workunit.client.1.vm06.stdout:0/980: unlink d0/dd/d14/d18/f122 0 2026-03-10T06:22:34.783 INFO:tasks.workunit.client.1.vm06.stdout:5/712: truncate d8/db/d54/d8a/d39/f44 5226844 0 2026-03-10T06:22:34.783 INFO:tasks.workunit.client.1.vm06.stdout:9/937: rename d21/d46/ded to d21/da2/d12f 0 2026-03-10T06:22:34.788 INFO:tasks.workunit.client.1.vm06.stdout:2/782: fdatasync da/d13/d1c/d7d/ddf/f78 0 2026-03-10T06:22:34.794 INFO:tasks.workunit.client.1.vm06.stdout:7/989: dread d19/d3b/dde/fe5 [0,4194304] 0 2026-03-10T06:22:34.795 INFO:tasks.workunit.client.1.vm06.stdout:8/797: link d1/d7/c2e d1/d2c/d99/dc0/cfc 0 2026-03-10T06:22:34.802 INFO:tasks.workunit.client.1.vm06.stdout:0/981: chown d0/dd/d14/d18/d85/dcc/l56 60779019 1 2026-03-10T06:22:34.803 INFO:tasks.workunit.client.1.vm06.stdout:0/982: write d0/dd/d14/d18/d85/dcc/d88/fae [262086,11286] 0 2026-03-10T06:22:34.806 INFO:tasks.workunit.client.1.vm06.stdout:4/939: truncate dd/d33/da6/fef 190601 0 2026-03-10T06:22:34.814 
INFO:tasks.workunit.client.1.vm06.stdout:4/940: sync 2026-03-10T06:22:34.815 INFO:tasks.workunit.client.1.vm06.stdout:8/798: mknod d1/df/d58/cfd 0 2026-03-10T06:22:34.816 INFO:tasks.workunit.client.1.vm06.stdout:5/713: mknod d8/db/d54/d67/d46/cdb 0 2026-03-10T06:22:34.817 INFO:tasks.workunit.client.1.vm06.stdout:5/714: readlink d8/db/d54/d67/d46/d6e/l70 0 2026-03-10T06:22:34.826 INFO:tasks.workunit.client.1.vm06.stdout:0/983: truncate d0/dd/d14/d18/d85/dcc/d88/d47/f111 3197162 0 2026-03-10T06:22:34.827 INFO:tasks.workunit.client.1.vm06.stdout:0/984: sync 2026-03-10T06:22:34.828 INFO:tasks.workunit.client.1.vm06.stdout:0/985: dread d0/dd/d14/d18/d66/fcb [0,4194304] 0 2026-03-10T06:22:34.828 INFO:tasks.workunit.client.1.vm06.stdout:0/986: chown d0/dd 35 1 2026-03-10T06:22:34.829 INFO:tasks.workunit.client.1.vm06.stdout:3/920: link d6/d8/d7f/da1/caa d6/d12d/c13f 0 2026-03-10T06:22:34.832 INFO:tasks.workunit.client.1.vm06.stdout:8/799: creat d1/d3b/ffe x:0 0 0 2026-03-10T06:22:34.832 INFO:tasks.workunit.client.1.vm06.stdout:4/941: rmdir dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107/dc6/dca 39 2026-03-10T06:22:34.833 INFO:tasks.workunit.client.1.vm06.stdout:5/715: write d8/db/d54/d55/d80/fbb [473947,123488] 0 2026-03-10T06:22:34.840 INFO:tasks.workunit.client.1.vm06.stdout:0/987: mknod d0/da3/dd5/c146 0 2026-03-10T06:22:34.840 INFO:tasks.workunit.client.1.vm06.stdout:2/783: rmdir da/d13/d1a/dc7/d86/dfa 0 2026-03-10T06:22:34.841 INFO:tasks.workunit.client.1.vm06.stdout:7/990: rmdir d19/d3b/d41/da9/d132 0 2026-03-10T06:22:34.843 INFO:tasks.workunit.client.1.vm06.stdout:8/800: rmdir d1/df/d11 39 2026-03-10T06:22:34.846 INFO:tasks.workunit.client.1.vm06.stdout:9/938: getdents d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95 0 2026-03-10T06:22:34.848 INFO:tasks.workunit.client.1.vm06.stdout:9/939: read d21/da2/da7/d93/dda/df4/d106/d129/fae [415099,103190] 0 2026-03-10T06:22:34.848 INFO:tasks.workunit.client.1.vm06.stdout:5/716: creat d8/db/d54/d55/d80/fdc x:0 0 0 
2026-03-10T06:22:34.854 INFO:tasks.workunit.client.1.vm06.stdout:2/784: symlink da/d13/d1a/dc7/daf/d56/db9/d9b/l100 0 2026-03-10T06:22:34.855 INFO:tasks.workunit.client.1.vm06.stdout:2/785: dread - da/d13/d1a/dc7/daf/d56/db9/fdc zero size 2026-03-10T06:22:34.860 INFO:tasks.workunit.client.1.vm06.stdout:2/786: chown da/d13/d1a/dc7/daf/fee 6195905 1 2026-03-10T06:22:34.864 INFO:tasks.workunit.client.1.vm06.stdout:0/988: dread d0/dd/d14/f70 [0,4194304] 0 2026-03-10T06:22:34.866 INFO:tasks.workunit.client.1.vm06.stdout:2/787: read da/d13/fe4 [2095695,17726] 0 2026-03-10T06:22:34.868 INFO:tasks.workunit.client.1.vm06.stdout:4/942: link dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/f6a dd/d33/da6/f110 0 2026-03-10T06:22:34.871 INFO:tasks.workunit.client.1.vm06.stdout:5/717: dread d8/db/fbc [0,4194304] 0 2026-03-10T06:22:34.875 INFO:tasks.workunit.client.1.vm06.stdout:2/788: truncate da/d13/d1a/dc7/daf/d56/db9/d9b/fff 205687 0 2026-03-10T06:22:34.875 INFO:tasks.workunit.client.1.vm06.stdout:9/940: dread d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/fc4 [0,4194304] 0 2026-03-10T06:22:34.876 INFO:tasks.workunit.client.1.vm06.stdout:4/943: dwrite dd/d41/f52 [0,4194304] 0 2026-03-10T06:22:34.884 INFO:tasks.workunit.client.1.vm06.stdout:0/989: mkdir d0/dd/d14/d18/d85/dcc/d99/dde/d147 0 2026-03-10T06:22:34.886 INFO:tasks.workunit.client.1.vm06.stdout:0/990: chown d0/dd/f67 792455 1 2026-03-10T06:22:34.890 INFO:tasks.workunit.client.1.vm06.stdout:8/801: dread d1/df/d11/f81 [0,4194304] 0 2026-03-10T06:22:34.899 INFO:tasks.workunit.client.1.vm06.stdout:5/718: dread d8/fa5 [0,4194304] 0 2026-03-10T06:22:34.900 INFO:tasks.workunit.client.1.vm06.stdout:5/719: chown d8/db/d54 44890 1 2026-03-10T06:22:34.900 INFO:tasks.workunit.client.1.vm06.stdout:5/720: dread - d8/f3f zero size 2026-03-10T06:22:34.903 INFO:tasks.workunit.client.1.vm06.stdout:5/721: dread d8/db/d54/d8a/d39/d6c/f91 [0,4194304] 0 2026-03-10T06:22:34.906 INFO:tasks.workunit.client.1.vm06.stdout:3/921: write d6/dc/de5/f124 
[699133,118449] 0 2026-03-10T06:22:34.907 INFO:tasks.workunit.client.1.vm06.stdout:3/922: dread - d6/dc/d13/d35/d101/d88/dae/dec/d117/f130 zero size 2026-03-10T06:22:34.907 INFO:tasks.workunit.client.1.vm06.stdout:5/722: dread d8/db/d54/d8a/d39/fae [0,4194304] 0 2026-03-10T06:22:34.911 INFO:tasks.workunit.client.1.vm06.stdout:7/991: dwrite d19/d3b/d5b/f69 [0,4194304] 0 2026-03-10T06:22:34.923 INFO:tasks.workunit.client.1.vm06.stdout:8/802: unlink d1/df/d11/f74 0 2026-03-10T06:22:34.924 INFO:tasks.workunit.client.1.vm06.stdout:4/944: link dd/d24/d2d/d2f/d39/f61 dd/d24/f111 0 2026-03-10T06:22:34.924 INFO:tasks.workunit.client.1.vm06.stdout:2/789: write da/d13/d1c/d1d/d44/d46/fb4 [885351,47937] 0 2026-03-10T06:22:34.926 INFO:tasks.workunit.client.1.vm06.stdout:4/945: fsync dd/d24/d5d/f9d 0 2026-03-10T06:22:34.928 INFO:tasks.workunit.client.1.vm06.stdout:5/723: dread d8/db/d57/f75 [0,4194304] 0 2026-03-10T06:22:34.928 INFO:tasks.workunit.client.1.vm06.stdout:4/946: chown dd/d24/d2d/d2f/d34/d40/d10d/lb5 0 1 2026-03-10T06:22:34.930 INFO:tasks.workunit.client.1.vm06.stdout:7/992: dread d19/d3b/d41/da9/f105 [0,4194304] 0 2026-03-10T06:22:34.931 INFO:tasks.workunit.client.1.vm06.stdout:7/993: fsync d19/d3b/d41/d42/d62/d80/d82/f136 0 2026-03-10T06:22:34.932 INFO:tasks.workunit.client.1.vm06.stdout:7/994: chown d19/c59 0 1 2026-03-10T06:22:34.933 INFO:tasks.workunit.client.1.vm06.stdout:9/941: dwrite d21/d27/f9a [4194304,4194304] 0 2026-03-10T06:22:34.933 INFO:tasks.workunit.client.1.vm06.stdout:7/995: dread - d19/d3b/d41/da9/dbd/f131 zero size 2026-03-10T06:22:34.934 INFO:tasks.workunit.client.1.vm06.stdout:7/996: write d19/f33 [243996,44986] 0 2026-03-10T06:22:34.940 INFO:tasks.workunit.client.1.vm06.stdout:9/942: sync 2026-03-10T06:22:34.941 INFO:tasks.workunit.client.1.vm06.stdout:7/997: write d19/d3b/d41/d42/d52/d83/d9d/da8/d129/d11c/d104/f113 [437825,2783] 0 2026-03-10T06:22:34.947 INFO:tasks.workunit.client.1.vm06.stdout:8/803: chown d1/df/d11/cae 2667 1 
2026-03-10T06:22:34.948 INFO:tasks.workunit.client.1.vm06.stdout:8/804: fsync d1/df/d58/db5/fdf 0 2026-03-10T06:22:34.949 INFO:tasks.workunit.client.1.vm06.stdout:5/724: mkdir d8/d9/ddd 0 2026-03-10T06:22:34.952 INFO:tasks.workunit.client.1.vm06.stdout:4/947: rmdir dd/d24/d5d 39 2026-03-10T06:22:34.953 INFO:tasks.workunit.client.1.vm06.stdout:5/725: dwrite d8/db/d54/d8a/d74/f66 [0,4194304] 0 2026-03-10T06:22:34.964 INFO:tasks.workunit.client.1.vm06.stdout:9/943: mkdir d21/da2/da7/d93/dda/df4/d106/d129/dcd/de4/d104/d130 0 2026-03-10T06:22:34.964 INFO:tasks.workunit.client.1.vm06.stdout:7/998: creat d19/d3b/d41/da9/dbd/dd2/f13c x:0 0 0 2026-03-10T06:22:34.969 INFO:tasks.workunit.client.1.vm06.stdout:3/923: rename d6/dc/d41/d6d/l125 to d6/dc/d41/l140 0 2026-03-10T06:22:34.970 INFO:tasks.workunit.client.1.vm06.stdout:8/805: mknod d1/df/d58/db5/cff 0 2026-03-10T06:22:34.973 INFO:tasks.workunit.client.1.vm06.stdout:4/948: unlink dd/d24/d2d/d2f/d34/d83/fcc 0 2026-03-10T06:22:34.973 INFO:tasks.workunit.client.1.vm06.stdout:4/949: chown dd/d33/l73 41380642 1 2026-03-10T06:22:34.982 INFO:tasks.workunit.client.1.vm06.stdout:7/999: fsync d19/d3b/d41/d4c/f55 0 2026-03-10T06:22:34.988 INFO:tasks.workunit.client.1.vm06.stdout:2/790: creat da/d13/d1a/f101 x:0 0 0 2026-03-10T06:22:34.992 INFO:tasks.workunit.client.1.vm06.stdout:9/944: dwrite d21/d32/d4d/f6b [0,4194304] 0 2026-03-10T06:22:34.992 INFO:tasks.workunit.client.1.vm06.stdout:3/924: truncate d6/dc/d13/d35/d101/dd0/dd1/d90/dc7/fcb 104223 0 2026-03-10T06:22:34.996 INFO:tasks.workunit.client.1.vm06.stdout:4/950: creat dd/d33/f112 x:0 0 0 2026-03-10T06:22:34.997 INFO:tasks.workunit.client.1.vm06.stdout:3/925: fdatasync d6/dc/d13/d9d/f11a 0 2026-03-10T06:22:34.998 INFO:tasks.workunit.client.1.vm06.stdout:2/791: write da/d13/f1f [3715736,130831] 0 2026-03-10T06:22:35.004 INFO:tasks.workunit.client.1.vm06.stdout:9/945: dread d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/d9b/dd0/ffe [0,4194304] 0 2026-03-10T06:22:35.004 
INFO:tasks.workunit.client.1.vm06.stdout:2/792: write da/d13/d1c/d7d/ddf/ff2 [728158,128603] 0 2026-03-10T06:22:35.008 INFO:tasks.workunit.client.1.vm06.stdout:8/806: dread d1/df/f56 [0,4194304] 0 2026-03-10T06:22:35.016 INFO:tasks.workunit.client.1.vm06.stdout:3/926: creat d6/dc/d13/d35/d101/d88/dae/dec/f141 x:0 0 0 2026-03-10T06:22:35.016 INFO:tasks.workunit.client.1.vm06.stdout:3/927: fsync d6/d8/f52 0 2026-03-10T06:22:35.017 INFO:tasks.workunit.client.1.vm06.stdout:5/726: link d8/db/d54/d67/d46/d6e/fbf d8/db/fde 0 2026-03-10T06:22:35.019 INFO:tasks.workunit.client.1.vm06.stdout:5/727: dread d8/db/d54/d55/d80/fbb [0,4194304] 0 2026-03-10T06:22:35.021 INFO:tasks.workunit.client.1.vm06.stdout:0/991: link d0/dd/d14/d18/c29 d0/dd/d14/d18/d85/dcc/d88/d47/d4d/c148 0 2026-03-10T06:22:35.021 INFO:tasks.workunit.client.1.vm06.stdout:5/728: fsync d8/db/d54/d55/d80/f96 0 2026-03-10T06:22:35.023 INFO:tasks.workunit.client.1.vm06.stdout:4/951: creat dd/d24/d9c/d10a/f113 x:0 0 0 2026-03-10T06:22:35.029 INFO:tasks.workunit.client.1.vm06.stdout:4/952: dwrite dd/d24/d2d/d2f/d34/d40/fb8 [0,4194304] 0 2026-03-10T06:22:35.035 INFO:tasks.workunit.client.1.vm06.stdout:5/729: dread d8/db/d54/d8a/d39/fa1 [0,4194304] 0 2026-03-10T06:22:35.041 INFO:tasks.workunit.client.1.vm06.stdout:9/946: mkdir d21/d32/d4d/d51/d131 0 2026-03-10T06:22:35.046 INFO:tasks.workunit.client.1.vm06.stdout:4/953: dwrite dd/d33/d36/fba [0,4194304] 0 2026-03-10T06:22:35.048 INFO:tasks.workunit.client.1.vm06.stdout:3/928: mkdir d6/dc/d13/d142 0 2026-03-10T06:22:35.049 INFO:tasks.workunit.client.1.vm06.stdout:9/947: symlink d21/d32/d4d/d51/dcb/l132 0 2026-03-10T06:22:35.050 INFO:tasks.workunit.client.1.vm06.stdout:9/948: readlink d21/d27/d56/lab 0 2026-03-10T06:22:35.058 INFO:tasks.workunit.client.1.vm06.stdout:5/730: dread d8/db/d54/d8a/d39/fae [0,4194304] 0 2026-03-10T06:22:35.070 INFO:tasks.workunit.client.1.vm06.stdout:2/793: creat da/d13/d1c/d1d/d44/f102 x:0 0 0 2026-03-10T06:22:35.070 
INFO:tasks.workunit.client.1.vm06.stdout:8/807: creat d1/df/d11/f100 x:0 0 0 2026-03-10T06:22:35.071 INFO:tasks.workunit.client.1.vm06.stdout:4/954: creat dd/d24/d2d/d2f/d34/d40/d10d/d75/dfd/f114 x:0 0 0 2026-03-10T06:22:35.075 INFO:tasks.workunit.client.1.vm06.stdout:5/731: creat d8/db/d54/d8a/d39/d6c/fdf x:0 0 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:2/794: creat da/d13/d1a/d39/d35/f103 x:0 0 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:8/808: mkdir d1/d2c/d99/d101 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:5/732: readlink d8/db/d54/d8a/l32 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:2/795: creat da/d13/d1a/dc7/daf/d56/db7/f104 x:0 0 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:8/809: creat d1/df/d20/d21/d7e/f102 x:0 0 0 2026-03-10T06:22:35.083 INFO:tasks.workunit.client.1.vm06.stdout:5/733: unlink d8/db/d54/d55/d80/fbb 0 2026-03-10T06:22:35.085 INFO:tasks.workunit.client.1.vm06.stdout:8/810: mkdir d1/d7/df8/d103 0 2026-03-10T06:22:35.085 INFO:tasks.workunit.client.1.vm06.stdout:8/811: chown d1/d7/l1a 26869 1 2026-03-10T06:22:35.086 INFO:tasks.workunit.client.1.vm06.stdout:5/734: creat d8/db/d54/d67/d46/d6e/da2/fe0 x:0 0 0 2026-03-10T06:22:35.087 INFO:tasks.workunit.client.1.vm06.stdout:8/812: mknod d1/d3b/da9/dab/c104 0 2026-03-10T06:22:35.089 INFO:tasks.workunit.client.1.vm06.stdout:8/813: read - d1/df/d20/d21/fc5 zero size 2026-03-10T06:22:35.092 INFO:tasks.workunit.client.1.vm06.stdout:8/814: fsync d1/df/f6b 0 2026-03-10T06:22:35.152 INFO:tasks.workunit.client.1.vm06.stdout:9/949: sync 2026-03-10T06:22:35.152 INFO:tasks.workunit.client.1.vm06.stdout:2/796: sync 2026-03-10T06:22:35.154 INFO:tasks.workunit.client.1.vm06.stdout:2/797: stat da/d13/d1a/dc7/daf/d56/de7 0 2026-03-10T06:22:35.161 INFO:tasks.workunit.client.1.vm06.stdout:0/992: dwrite d0/da3/dd5/f101 [0,4194304] 0 2026-03-10T06:22:35.169 INFO:tasks.workunit.client.1.vm06.stdout:9/950: 
creat d21/d27/f133 x:0 0 0 2026-03-10T06:22:35.171 INFO:tasks.workunit.client.1.vm06.stdout:3/929: write d6/d8/f62 [1259226,52301] 0 2026-03-10T06:22:35.174 INFO:tasks.workunit.client.1.vm06.stdout:0/993: creat d0/dd/d14/d18/d85/dcc/d99/dde/d147/f149 x:0 0 0 2026-03-10T06:22:35.177 INFO:tasks.workunit.client.1.vm06.stdout:4/955: dwrite dd/d24/d2d/fbe [0,4194304] 0 2026-03-10T06:22:35.186 INFO:tasks.workunit.client.1.vm06.stdout:0/994: sync 2026-03-10T06:22:35.307 INFO:tasks.workunit.client.1.vm06.stdout:5/735: truncate d8/d9/f4b 3341799 0 2026-03-10T06:22:35.308 INFO:tasks.workunit.client.1.vm06.stdout:5/736: truncate d8/db/d54/fc9 515746 0 2026-03-10T06:22:35.314 INFO:tasks.workunit.client.1.vm06.stdout:8/815: write d1/df/f6d [1768635,69371] 0 2026-03-10T06:22:35.318 INFO:tasks.workunit.client.1.vm06.stdout:2/798: write da/d13/d1c/d7d/ddf/d61/d68/f6b [1860279,94432] 0 2026-03-10T06:22:35.323 INFO:tasks.workunit.client.1.vm06.stdout:8/816: dread d1/d7/f4f [0,4194304] 0 2026-03-10T06:22:35.337 INFO:tasks.workunit.client.1.vm06.stdout:3/930: mkdir d6/dc/d13/d142/d143 0 2026-03-10T06:22:35.338 INFO:tasks.workunit.client.1.vm06.stdout:3/931: chown d6/dc/d13/d35/d101/dd0/dd1/d90/cb5 78567428 1 2026-03-10T06:22:35.338 INFO:tasks.workunit.client.1.vm06.stdout:3/932: chown d6/d4f/l6a 4 1 2026-03-10T06:22:35.345 INFO:tasks.workunit.client.1.vm06.stdout:3/933: dread d6/d8/d7f/fe6 [0,4194304] 0 2026-03-10T06:22:35.348 INFO:tasks.workunit.client.1.vm06.stdout:3/934: dwrite d6/dc/d13/d35/f13c [0,4194304] 0 2026-03-10T06:22:35.349 INFO:tasks.workunit.client.1.vm06.stdout:2/799: mknod da/d13/d5e/c105 0 2026-03-10T06:22:35.350 INFO:tasks.workunit.client.1.vm06.stdout:8/817: symlink d1/d7/dee/l105 0 2026-03-10T06:22:35.364 INFO:tasks.workunit.client.1.vm06.stdout:3/935: rename d6/dc/d13/d35/d101/dd0/dd1/d90/l113 to d6/dc/de5/d13d/l144 0 2026-03-10T06:22:35.364 INFO:tasks.workunit.client.1.vm06.stdout:3/936: chown d6/dc/d13/d35/d101/d11b 48 1 2026-03-10T06:22:35.378 
INFO:tasks.workunit.client.1.vm06.stdout:2/800: mkdir da/d13/d1a/dc7/daf/d56/de7/d106 0 2026-03-10T06:22:35.394 INFO:tasks.workunit.client.1.vm06.stdout:3/937: getdents d6/def 0 2026-03-10T06:22:35.405 INFO:tasks.workunit.client.1.vm06.stdout:8/818: rename d1/d2c/d5b/lc6 to d1/df/d20/d35/dac/l106 0 2026-03-10T06:22:35.407 INFO:tasks.workunit.client.1.vm06.stdout:8/819: mknod d1/df/d58/db5/c107 0 2026-03-10T06:22:35.501 INFO:tasks.workunit.client.1.vm06.stdout:5/737: unlink d8/db/d54/d67/d46/d68/f81 0 2026-03-10T06:22:35.527 INFO:tasks.workunit.client.1.vm06.stdout:9/951: write d21/d32/d4d/fbd [1448160,95567] 0 2026-03-10T06:22:35.528 INFO:tasks.workunit.client.1.vm06.stdout:9/952: write d21/d32/f12c [958506,89957] 0 2026-03-10T06:22:35.528 INFO:tasks.workunit.client.1.vm06.stdout:9/953: fdatasync d21/f3e 0 2026-03-10T06:22:35.529 INFO:tasks.workunit.client.1.vm06.stdout:9/954: chown d21/d46 73306360 1 2026-03-10T06:22:35.534 INFO:tasks.workunit.client.1.vm06.stdout:9/955: creat d21/d27/d56/df6/f134 x:0 0 0 2026-03-10T06:22:35.535 INFO:tasks.workunit.client.1.vm06.stdout:9/956: chown d21/d32/fd8 18041 1 2026-03-10T06:22:35.551 INFO:tasks.workunit.client.1.vm06.stdout:2/801: mkdir da/d13/d107 0 2026-03-10T06:22:35.552 INFO:tasks.workunit.client.1.vm06.stdout:2/802: write da/d13/d5e/fe3 [837926,100626] 0 2026-03-10T06:22:35.552 INFO:tasks.workunit.client.1.vm06.stdout:2/803: stat da/d13/d1a/lac 0 2026-03-10T06:22:35.566 INFO:tasks.workunit.client.1.vm06.stdout:4/956: creat dd/d24/d2d/d2f/d34/f115 x:0 0 0 2026-03-10T06:22:35.570 INFO:tasks.workunit.client.1.vm06.stdout:4/957: dwrite dd/d24/f45 [4194304,4194304] 0 2026-03-10T06:22:35.574 INFO:tasks.workunit.client.1.vm06.stdout:0/995: mknod d0/da3/c14a 0 2026-03-10T06:22:35.583 INFO:tasks.workunit.client.1.vm06.stdout:4/958: symlink dd/d33/d36/l116 0 2026-03-10T06:22:35.589 INFO:tasks.workunit.client.1.vm06.stdout:4/959: mknod dd/d24/d2d/d2f/d34/d40/d10d/d75/c117 0 2026-03-10T06:22:35.590 
INFO:tasks.workunit.client.1.vm06.stdout:4/960: chown dd/d24/d2d/d2f/d34/d40/d10d/ff7 169 1 2026-03-10T06:22:35.592 INFO:tasks.workunit.client.1.vm06.stdout:4/961: creat dd/d24/d9c/d10a/f118 x:0 0 0 2026-03-10T06:22:35.595 INFO:tasks.workunit.client.1.vm06.stdout:4/962: truncate dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107/fd3 349617 0 2026-03-10T06:22:35.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:35 vm06.local ceph-mon[58974]: pgmap v16: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 45 MiB/s rd, 125 MiB/s wr, 275 op/s 2026-03-10T06:22:35.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:35 vm06.local ceph-mon[58974]: mgrmap e32: vm04.exdvdb(active, since 26s), standbys: vm06.wwotdr 2026-03-10T06:22:35.652 INFO:tasks.workunit.client.1.vm06.stdout:8/820: write d1/d3b/d5c/f7a [2866393,16660] 0 2026-03-10T06:22:35.656 INFO:tasks.workunit.client.1.vm06.stdout:8/821: write d1/df/d20/d21/d5e/f73 [4165007,32981] 0 2026-03-10T06:22:35.661 INFO:tasks.workunit.client.1.vm06.stdout:8/822: sync 2026-03-10T06:22:35.664 INFO:tasks.workunit.client.1.vm06.stdout:8/823: dwrite d1/fb9 [0,4194304] 0 2026-03-10T06:22:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:35 vm04.local ceph-mon[51058]: pgmap v16: 65 pgs: 65 active+clean; 1.7 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 45 MiB/s rd, 125 MiB/s wr, 275 op/s 2026-03-10T06:22:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:35 vm04.local ceph-mon[51058]: mgrmap e32: vm04.exdvdb(active, since 26s), standbys: vm06.wwotdr 2026-03-10T06:22:35.732 INFO:tasks.workunit.client.1.vm06.stdout:2/804: unlink da/d13/d1c/d7d/ddf/ca5 0 2026-03-10T06:22:35.737 INFO:tasks.workunit.client.1.vm06.stdout:2/805: sync 2026-03-10T06:22:35.737 INFO:tasks.workunit.client.1.vm06.stdout:2/806: read - da/d13/d1a/dc7/daf/fee zero size 2026-03-10T06:22:35.742 INFO:tasks.workunit.client.1.vm06.stdout:2/807: dwrite da/d13/d1c/d1d/d44/d46/fd7 [0,4194304] 0 
2026-03-10T06:22:35.744 INFO:tasks.workunit.client.1.vm06.stdout:2/808: stat da/d13/d1a/fc9 0 2026-03-10T06:22:35.744 INFO:tasks.workunit.client.1.vm06.stdout:2/809: fdatasync da/d13/d1a/f21 0 2026-03-10T06:22:35.745 INFO:tasks.workunit.client.1.vm06.stdout:2/810: chown da/d13/d1a/f21 65017 1 2026-03-10T06:22:35.748 INFO:tasks.workunit.client.1.vm06.stdout:3/938: rename d6/dc/d13/d9d/fe1 to d6/dc/d13/d9d/d54/f145 0 2026-03-10T06:22:35.786 INFO:tasks.workunit.client.1.vm06.stdout:5/738: dwrite d8/db/f1f [0,4194304] 0 2026-03-10T06:22:35.788 INFO:tasks.workunit.client.1.vm06.stdout:5/739: write d8/db/f1f [2124754,57708] 0 2026-03-10T06:22:35.864 INFO:tasks.workunit.client.1.vm06.stdout:9/957: symlink d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/l135 0 2026-03-10T06:22:35.872 INFO:tasks.workunit.client.1.vm06.stdout:0/996: dwrite d0/dd/d14/d18/d85/dcc/f60 [0,4194304] 0 2026-03-10T06:22:35.877 INFO:tasks.workunit.client.1.vm06.stdout:4/963: dwrite dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/fdb [0,4194304] 0 2026-03-10T06:22:35.912 INFO:tasks.workunit.client.1.vm06.stdout:3/939: creat d6/d8/f146 x:0 0 0 2026-03-10T06:22:35.915 INFO:tasks.workunit.client.1.vm06.stdout:8/824: rename d1/df/d20/f88 to d1/d3b/d5c/de3/f108 0 2026-03-10T06:22:35.929 INFO:tasks.workunit.client.1.vm06.stdout:0/997: unlink d0/dd/d14/d18/d85/dcc/d88/d35/f7f 0 2026-03-10T06:22:35.937 INFO:tasks.workunit.client.1.vm06.stdout:4/964: unlink dd/d33/f70 0 2026-03-10T06:22:35.939 INFO:tasks.workunit.client.1.vm06.stdout:2/811: symlink da/d13/d1c/d1d/l108 0 2026-03-10T06:22:35.950 INFO:tasks.workunit.client.1.vm06.stdout:3/940: dread d6/f1c [0,4194304] 0 2026-03-10T06:22:35.951 INFO:tasks.workunit.client.1.vm06.stdout:3/941: write d6/d8/d7f/da1/dfe/f137 [792775,126840] 0 2026-03-10T06:22:35.961 INFO:tasks.workunit.client.1.vm06.stdout:5/740: mkdir d8/db/d7e/de1 0 2026-03-10T06:22:35.962 INFO:tasks.workunit.client.1.vm06.stdout:9/958: fdatasync d21/da2/da7/d93/dda/df4/d106/d129/f97 0 2026-03-10T06:22:35.968 
INFO:tasks.workunit.client.1.vm06.stdout:4/965: rmdir dd/d24/d2d 39 2026-03-10T06:22:35.969 INFO:tasks.workunit.client.1.vm06.stdout:3/942: fsync d6/d8/d7f/f12b 0 2026-03-10T06:22:35.970 INFO:tasks.workunit.client.1.vm06.stdout:5/741: creat d8/db/d54/d8a/d74/d90/fe2 x:0 0 0 2026-03-10T06:22:35.973 INFO:tasks.workunit.client.1.vm06.stdout:0/998: creat d0/dd/d14/d18/d7e/d137/f14b x:0 0 0 2026-03-10T06:22:35.975 INFO:tasks.workunit.client.1.vm06.stdout:9/959: dwrite fe [0,4194304] 0 2026-03-10T06:22:35.976 INFO:tasks.workunit.client.1.vm06.stdout:9/960: write fe [350099,114999] 0 2026-03-10T06:22:35.977 INFO:tasks.workunit.client.1.vm06.stdout:9/961: write d21/da2/da7/d93/dda/df4/d106/d129/db2/f8d [1972056,99723] 0 2026-03-10T06:22:35.985 INFO:tasks.workunit.client.1.vm06.stdout:5/742: symlink d8/db/d54/d67/d46/d68/dc1/dda/le3 0 2026-03-10T06:22:35.986 INFO:tasks.workunit.client.1.vm06.stdout:0/999: mknod d0/d3c/dc1/d3d/d50/c14c 0 2026-03-10T06:22:35.997 INFO:tasks.workunit.client.1.vm06.stdout:4/966: symlink dd/d24/d2d/d2f/d34/d40/d10d/d8e/daa/l119 0 2026-03-10T06:22:36.000 INFO:tasks.workunit.client.1.vm06.stdout:4/967: write dd/d24/d2d/d2f/d34/f115 [96262,6374] 0 2026-03-10T06:22:36.018 INFO:tasks.workunit.client.1.vm06.stdout:4/968: symlink dd/d33/d47/d97/db6/dbb/l11a 0 2026-03-10T06:22:36.023 INFO:tasks.workunit.client.1.vm06.stdout:4/969: dwrite dd/d33/d47/d97/db6/dbb/de2/f10f [0,4194304] 0 2026-03-10T06:22:36.025 INFO:tasks.workunit.client.1.vm06.stdout:4/970: stat dd/d24/d2d/d2f/d34/d40/df6 0 2026-03-10T06:22:36.033 INFO:tasks.workunit.client.1.vm06.stdout:4/971: symlink dd/d24/d9c/l11b 0 2026-03-10T06:22:36.033 INFO:tasks.workunit.client.1.vm06.stdout:4/972: chown dd/d72 2344 1 2026-03-10T06:22:36.033 INFO:tasks.workunit.client.1.vm06.stdout:4/973: write dd/d24/d2d/d2f/d34/f115 [1119960,5109] 0 2026-03-10T06:22:36.039 INFO:tasks.workunit.client.1.vm06.stdout:4/974: symlink dd/d24/d2d/d2f/d34/d40/d10d/d75/l11c 0 2026-03-10T06:22:36.040 
INFO:tasks.workunit.client.1.vm06.stdout:4/975: fsync dd/d33/d47/d97/db6/dbb/de2/f100 0 2026-03-10T06:22:36.040 INFO:tasks.workunit.client.1.vm06.stdout:4/976: stat dd/d33/f3f 0 2026-03-10T06:22:36.062 INFO:tasks.workunit.client.1.vm06.stdout:8/825: write d1/df/f56 [3820300,92864] 0 2026-03-10T06:22:36.067 INFO:tasks.workunit.client.1.vm06.stdout:8/826: rename d1/df/d20/d21/d7e/l8f to d1/d2c/l109 0 2026-03-10T06:22:36.072 INFO:tasks.workunit.client.1.vm06.stdout:8/827: symlink d1/l10a 0 2026-03-10T06:22:36.074 INFO:tasks.workunit.client.1.vm06.stdout:3/943: dwrite d6/f5c [0,4194304] 0 2026-03-10T06:22:36.074 INFO:tasks.workunit.client.1.vm06.stdout:2/812: dwrite da/ff [0,4194304] 0 2026-03-10T06:22:36.077 INFO:tasks.workunit.client.1.vm06.stdout:2/813: write da/d13/d5e/f9e [3059964,103978] 0 2026-03-10T06:22:36.088 INFO:tasks.workunit.client.1.vm06.stdout:8/828: rename d1/d7/fa7 to d1/df/d11/da1/f10b 0 2026-03-10T06:22:36.099 INFO:tasks.workunit.client.1.vm06.stdout:9/962: write f11 [3066671,62995] 0 2026-03-10T06:22:36.101 INFO:tasks.workunit.client.1.vm06.stdout:5/743: dwrite d8/db/d57/f97 [0,4194304] 0 2026-03-10T06:22:36.106 INFO:tasks.workunit.client.1.vm06.stdout:3/944: creat d6/d1a/d5b/dbd/f147 x:0 0 0 2026-03-10T06:22:36.110 INFO:tasks.workunit.client.1.vm06.stdout:2/814: dread - da/d13/d1c/d1d/fca zero size 2026-03-10T06:22:36.111 INFO:tasks.workunit.client.1.vm06.stdout:8/829: dread - d1/df/d20/d21/d7e/fd6 zero size 2026-03-10T06:22:36.111 INFO:tasks.workunit.client.1.vm06.stdout:2/815: write da/d13/d1a/dc7/f9d [1541859,54559] 0 2026-03-10T06:22:36.111 INFO:tasks.workunit.client.1.vm06.stdout:8/830: chown d1/d7/l1a 0 1 2026-03-10T06:22:36.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.111+0000 7fc8edd31700 1 -- 192.168.123.104:0/3589342388 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8071a60 msgr2=0x7fc8e8071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.113 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.111+0000 7fc8edd31700 1 --2- 192.168.123.104:0/3589342388 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8071a60 0x7fc8e8071e70 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7fc8dc009b00 tx=0x7fc8dc009e10 comp rx=0 tx=0).stop 2026-03-10T06:22:36.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 -- 192.168.123.104:0/3589342388 shutdown_connections 2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 --2- 192.168.123.104:0/3589342388 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8e8072440 0x7fc8e810be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 --2- 192.168.123.104:0/3589342388 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8071a60 0x7fc8e8071e70 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 -- 192.168.123.104:0/3589342388 >> 192.168.123.104:0/3589342388 conn(0x7fc8e806d1a0 msgr2=0x7fc8e806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 -- 192.168.123.104:0/3589342388 shutdown_connections 2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.113+0000 7fc8edd31700 1 -- 192.168.123.104:0/3589342388 wait complete. 
2026-03-10T06:22:36.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 Processor -- start 2026-03-10T06:22:36.114 INFO:tasks.workunit.client.1.vm06.stdout:9/963: write d21/f33 [3414193,1612] 0 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 -- start start 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8e8071a60 0x7fc8e81a4a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8e81a55b0 con 0x7fc8e8072440 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.114+0000 7fc8edd31700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8e81a56f0 con 0x7fc8e8071a60 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.115+0000 7fc8e7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.115+0000 7fc8e7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:51016/0 (socket says 192.168.123.104:51016) 2026-03-10T06:22:36.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.115+0000 7fc8e7fff700 1 -- 192.168.123.104:0/1139473458 learned_addr learned my addr 192.168.123.104:0/1139473458 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:36.117 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.116+0000 7fc8e7fff700 1 -- 192.168.123.104:0/1139473458 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8e8071a60 msgr2=0x7fc8e81a4a50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.117 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.116+0000 7fc8e7fff700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8e8071a60 0x7fc8e81a4a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.117 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.116+0000 7fc8e7fff700 1 -- 192.168.123.104:0/1139473458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc8dc0097e0 con 0x7fc8e8072440 2026-03-10T06:22:36.117 INFO:tasks.workunit.client.1.vm06.stdout:3/945: creat d6/dc/d13/d9d/f148 x:0 0 0 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.117+0000 7fc8e7fff700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7fc8d800b700 tx=0x7fc8d800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.118+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 
1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8d8010820 con 0x7fc8e8072440 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.118+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc8d8010e60 con 0x7fc8e8072440 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.118+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8d8017570 con 0x7fc8e8072440 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.118+0000 7fc8edd31700 1 -- 192.168.123.104:0/1139473458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc8e810f620 con 0x7fc8e8072440 2026-03-10T06:22:36.118 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.118+0000 7fc8edd31700 1 -- 192.168.123.104:0/1139473458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc8e810fb70 con 0x7fc8e8072440 2026-03-10T06:22:36.121 INFO:tasks.workunit.client.1.vm06.stdout:3/946: dwrite d6/dc/d13/d51/f138 [0,4194304] 0 2026-03-10T06:22:36.121 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.120+0000 7fc8edd31700 1 -- 192.168.123.104:0/1139473458 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc8d4005320 con 0x7fc8e8072440 2026-03-10T06:22:36.121 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.120+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc8d8010980 con 0x7fc8e8072440 2026-03-10T06:22:36.121 INFO:tasks.workunit.client.1.vm06.stdout:3/947: write d6/d8/f96 [320587,124409] 0 2026-03-10T06:22:36.121 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.120+0000 7fc8e5ffb700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 0x7fc8d0079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.121 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.120+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fc8d8098ef0 con 0x7fc8e8072440 2026-03-10T06:22:36.122 INFO:tasks.workunit.client.1.vm06.stdout:3/948: fdatasync d6/d1a/d5b/dbd/f108 0 2026-03-10T06:22:36.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.121+0000 7fc8ecd2f700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 0x7fc8d0079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.124+0000 7fc8ecd2f700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 0x7fc8d0079c40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc8dc0052d0 tx=0x7fc8dc009f90 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.127 INFO:tasks.workunit.client.1.vm06.stdout:8/831: creat d1/d7/dee/f10c x:0 0 0 2026-03-10T06:22:36.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.132+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fc8d8061a00 con 0x7fc8e8072440 2026-03-10T06:22:36.143 
INFO:tasks.workunit.client.1.vm06.stdout:9/964: creat d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/d9b/dd0/f136 x:0 0 0 2026-03-10T06:22:36.146 INFO:tasks.workunit.client.1.vm06.stdout:4/977: write dd/d24/d2d/d2f/d34/d83/f87 [2154942,112272] 0 2026-03-10T06:22:36.150 INFO:tasks.workunit.client.1.vm06.stdout:8/832: sync 2026-03-10T06:22:36.166 INFO:tasks.workunit.client.1.vm06.stdout:3/949: truncate d6/dc/d13/d51/fd2 94771 0 2026-03-10T06:22:36.168 INFO:tasks.workunit.client.1.vm06.stdout:2/816: mknod da/c109 0 2026-03-10T06:22:36.170 INFO:tasks.workunit.client.1.vm06.stdout:3/950: dwrite d6/d21/f7b [8388608,4194304] 0 2026-03-10T06:22:36.181 INFO:tasks.workunit.client.1.vm06.stdout:5/744: write d8/db/d54/d67/d46/d68/dc1/fce [740692,120041] 0 2026-03-10T06:22:36.187 INFO:tasks.workunit.client.1.vm06.stdout:3/951: dread d6/dc/d13/d9d/d54/fcc [0,4194304] 0 2026-03-10T06:22:36.187 INFO:tasks.workunit.client.1.vm06.stdout:5/745: dwrite d8/db/d54/d8a/fc7 [0,4194304] 0 2026-03-10T06:22:36.201 INFO:tasks.workunit.client.1.vm06.stdout:9/965: dread d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/d9b/fa5 [0,4194304] 0 2026-03-10T06:22:36.212 INFO:tasks.workunit.client.1.vm06.stdout:5/746: unlink d8/db/d54/d8a/d74/f5a 0 2026-03-10T06:22:36.227 INFO:tasks.workunit.client.1.vm06.stdout:4/978: creat dd/d24/d2d/d2f/d39/d71/dc3/f11d x:0 0 0 2026-03-10T06:22:36.238 INFO:tasks.workunit.client.1.vm06.stdout:8/833: write d1/f1c [2857555,106169] 0 2026-03-10T06:22:36.240 INFO:tasks.workunit.client.1.vm06.stdout:9/966: write d21/d32/f52 [4475880,33365] 0 2026-03-10T06:22:36.243 INFO:tasks.workunit.client.1.vm06.stdout:3/952: creat d6/d1a/d5b/f149 x:0 0 0 2026-03-10T06:22:36.246 INFO:tasks.workunit.client.1.vm06.stdout:5/747: mkdir d8/db/d54/d8a/de4 0 2026-03-10T06:22:36.247 INFO:tasks.workunit.client.1.vm06.stdout:2/817: getdents da/d13/d5e 0 2026-03-10T06:22:36.248 INFO:tasks.workunit.client.1.vm06.stdout:3/953: dwrite d6/dc/d13/d9d/f11a [0,4194304] 0 2026-03-10T06:22:36.248 
INFO:tasks.workunit.client.1.vm06.stdout:3/954: chown d6/dc/d13/fc1 675621 1 2026-03-10T06:22:36.255 INFO:tasks.workunit.client.1.vm06.stdout:8/834: unlink d1/df/d20/d21/ff0 0 2026-03-10T06:22:36.256 INFO:tasks.workunit.client.1.vm06.stdout:8/835: dread - d1/df/fc7 zero size 2026-03-10T06:22:36.257 INFO:tasks.workunit.client.1.vm06.stdout:8/836: read d1/d3b/da9/dab/fd4 [237625,43445] 0 2026-03-10T06:22:36.261 INFO:tasks.workunit.client.1.vm06.stdout:4/979: mkdir dd/d33/d47/d97/db6/d11e 0 2026-03-10T06:22:36.274 INFO:tasks.workunit.client.1.vm06.stdout:2/818: stat da/c3d 0 2026-03-10T06:22:36.279 INFO:tasks.workunit.client.1.vm06.stdout:5/748: write d8/db/d57/d83/fc3 [170914,73520] 0 2026-03-10T06:22:36.297 INFO:tasks.workunit.client.1.vm06.stdout:2/819: symlink da/d13/d1a/dc7/d86/l10a 0 2026-03-10T06:22:36.298 INFO:tasks.workunit.client.1.vm06.stdout:4/980: dread dd/d33/d47/d97/db6/dbb/fce [0,4194304] 0 2026-03-10T06:22:36.302 INFO:tasks.workunit.client.1.vm06.stdout:9/967: link d21/da2/da7/d93/dda/f81 d21/d32/d4d/d126/f137 0 2026-03-10T06:22:36.319 INFO:tasks.workunit.client.1.vm06.stdout:5/749: creat d8/db/d54/d8a/d39/d9f/fe5 x:0 0 0 2026-03-10T06:22:36.326 INFO:tasks.workunit.client.1.vm06.stdout:8/837: rename d1/d7/dee/ff5 to d1/df/d20/d21/d5e/f10d 0 2026-03-10T06:22:36.329 INFO:tasks.workunit.client.1.vm06.stdout:2/820: write da/d13/d1c/d7d/ddf/f78 [731362,100489] 0 2026-03-10T06:22:36.330 INFO:tasks.workunit.client.1.vm06.stdout:2/821: chown da/d13/f52 107 1 2026-03-10T06:22:36.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.329+0000 7fc8edd31700 1 -- 192.168.123.104:0/1139473458 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc8d4000bf0 con 0x7fc8d0077790 2026-03-10T06:22:36.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.332+0000 7fc8e5ffb700 1 -- 192.168.123.104:0/1139473458 <== mgr.14632 
v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7fc8d4000bf0 con 0x7fc8d0077790 2026-03-10T06:22:36.333 INFO:tasks.workunit.client.1.vm06.stdout:3/955: getdents d6/dc/d13/d35/d101/d88 0 2026-03-10T06:22:36.334 INFO:tasks.workunit.client.1.vm06.stdout:5/750: mkdir d8/de6 0 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.336+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 msgr2=0x7fc8d0079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.336+0000 7fc8cf7fe700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 0x7fc8d0079c40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc8dc0052d0 tx=0x7fc8dc009f90 comp rx=0 tx=0).stop 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.336+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 msgr2=0x7fc8e81a4f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.336+0000 7fc8cf7fe700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7fc8d800b700 tx=0x7fc8d800bac0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 shutdown_connections 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 --2- 192.168.123.104:0/1139473458 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8e8071a60 0x7fc8e81a4a50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc8d0077790 0x7fc8d0079c40 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 --2- 192.168.123.104:0/1139473458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc8e8072440 0x7fc8e81a4f90 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 >> 192.168.123.104:0/1139473458 conn(0x7fc8e806d1a0 msgr2=0x7fc8e810b550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 shutdown_connections 2026-03-10T06:22:36.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.337+0000 7fc8cf7fe700 1 -- 192.168.123.104:0/1139473458 wait complete. 
2026-03-10T06:22:36.342 INFO:tasks.workunit.client.1.vm06.stdout:4/981: rename dd/c10 to dd/d24/d2d/d2f/d34/d40/d10d/d8e/c11f 0 2026-03-10T06:22:36.346 INFO:tasks.workunit.client.1.vm06.stdout:4/982: dwrite dd/d24/d2d/f3b [0,4194304] 0 2026-03-10T06:22:36.352 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:22:36.355 INFO:tasks.workunit.client.1.vm06.stdout:4/983: dread dd/f14 [0,4194304] 0 2026-03-10T06:22:36.363 INFO:tasks.workunit.client.1.vm06.stdout:8/838: dwrite d1/df/d11/da1/fa5 [0,4194304] 0 2026-03-10T06:22:36.379 INFO:tasks.workunit.client.1.vm06.stdout:5/751: truncate d8/d9/f14 5522665 0 2026-03-10T06:22:36.384 INFO:tasks.workunit.client.1.vm06.stdout:9/968: rename d21/d32/la1 to d21/d32/d4d/d51/dcb/l138 0 2026-03-10T06:22:36.390 INFO:tasks.workunit.client.1.vm06.stdout:2/822: creat da/dea/f10b x:0 0 0 2026-03-10T06:22:36.397 INFO:tasks.workunit.client.1.vm06.stdout:9/969: sync 2026-03-10T06:22:36.412 INFO:tasks.workunit.client.1.vm06.stdout:8/839: chown d1/df/d11/f12 973285 1 2026-03-10T06:22:36.415 INFO:tasks.workunit.client.1.vm06.stdout:8/840: dwrite d1/d2c/f32 [0,4194304] 0 2026-03-10T06:22:36.432 INFO:tasks.workunit.client.1.vm06.stdout:5/752: dread d8/db/d54/d67/d46/d68/f73 [4194304,4194304] 0 2026-03-10T06:22:36.433 INFO:tasks.workunit.client.1.vm06.stdout:5/753: dread d8/db/d54/d8a/d39/d6c/f91 [0,4194304] 0 2026-03-10T06:22:36.438 INFO:tasks.workunit.client.1.vm06.stdout:4/984: mkdir dd/d24/d2d/d2f/d34/d40/d10d/df8/d120 0 2026-03-10T06:22:36.444 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.443+0000 7f25abafe700 1 -- 192.168.123.104:0/4003749955 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 msgr2=0x7f25a40721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.444 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.443+0000 7f25abafe700 1 --2- 192.168.123.104:0/4003749955 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 
0x7f25a40721c0 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f259c00b3a0 tx=0x7f259c00b6b0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 -- 192.168.123.104:0/4003749955 shutdown_connections 2026-03-10T06:22:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 --2- 192.168.123.104:0/4003749955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 0x7f25a41081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 --2- 192.168.123.104:0/4003749955 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a40721c0 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 -- 192.168.123.104:0/4003749955 >> 192.168.123.104:0/4003749955 conn(0x7f25a406d3e0 msgr2=0x7f25a406f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 -- 192.168.123.104:0/4003749955 shutdown_connections 2026-03-10T06:22:36.446 INFO:tasks.workunit.client.1.vm06.stdout:9/970: unlink c1d 0 2026-03-10T06:22:36.446 INFO:tasks.workunit.client.1.vm06.stdout:3/956: link d6/dc/d13/d9d/d54/l66 d6/dc/d13/d9d/l14a 0 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 -- 192.168.123.104:0/4003749955 wait complete. 
2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.445+0000 7f25abafe700 1 Processor -- start 2026-03-10T06:22:36.446 INFO:tasks.workunit.client.1.vm06.stdout:5/754: mkdir d8/db/d54/d8a/d39/d6c/de7 0 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25abafe700 1 -- start start 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25abafe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25abafe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 0x7f25a41a0d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25abafe700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a41a13b0 con 0x7f25a4071db0 2026-03-10T06:22:36.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25abafe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a41a14f0 con 0x7f25a4107d50 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25a989a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25a989a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:51034/0 (socket says 192.168.123.104:51034) 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25a989a700 1 -- 192.168.123.104:0/1007833026 learned_addr learned my addr 192.168.123.104:0/1007833026 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.446+0000 7f25a9099700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 0x7f25a41a0d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f25a989a700 1 -- 192.168.123.104:0/1007833026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 msgr2=0x7f25a41a0d90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f25a989a700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 0x7f25a41a0d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f25a989a700 1 -- 192.168.123.104:0/1007833026 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f259c00b050 con 0x7f25a4071db0 2026-03-10T06:22:36.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f25a989a700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f259c008f00 
tx=0x7f259c008fe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f259c00e050 con 0x7f25a4071db0 2026-03-10T06:22:36.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.447+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f259c004790 con 0x7f25a4071db0 2026-03-10T06:22:36.448 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.448+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f259c01dc50 con 0x7f25a4071db0 2026-03-10T06:22:36.449 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.448+0000 7f25abafe700 1 -- 192.168.123.104:0/1007833026 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25a41a5f40 con 0x7f25a4071db0 2026-03-10T06:22:36.449 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.448+0000 7f25abafe700 1 -- 192.168.123.104:0/1007833026 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25a41a63b0 con 0x7f25a4071db0 2026-03-10T06:22:36.450 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.449+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f25a4066e40 con 0x7f25a4071db0 2026-03-10T06:22:36.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.455+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f259c019040 con 
0x7f25a4071db0 2026-03-10T06:22:36.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.455+0000 7f259affd700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 0x7f25900799a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.456+0000 7f25a9099700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 0x7f25900799a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.456+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f259c09b830 con 0x7f25a4071db0 2026-03-10T06:22:36.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.456+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f259c0cace0 con 0x7f25a4071db0 2026-03-10T06:22:36.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.459+0000 7f25a9099700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 0x7f25900799a0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f25a0009d20 tx=0x7f25a0009450 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.462 INFO:tasks.workunit.client.1.vm06.stdout:9/971: dread d21/d32/d4d/dd2/ff1 [0,4194304] 0 2026-03-10T06:22:36.463 INFO:tasks.workunit.client.1.vm06.stdout:3/957: creat d6/dc/de5/f14b x:0 0 0 
2026-03-10T06:22:36.464 INFO:tasks.workunit.client.1.vm06.stdout:3/958: chown d6/d21/f58 59034059 1 2026-03-10T06:22:36.474 INFO:tasks.workunit.client.1.vm06.stdout:8/841: creat d1/d2c/d99/f10e x:0 0 0 2026-03-10T06:22:36.475 INFO:tasks.workunit.client.1.vm06.stdout:2/823: getdents da/d13/d1c/d7d/ddf/d61 0 2026-03-10T06:22:36.476 INFO:tasks.workunit.client.1.vm06.stdout:2/824: fsync da/dea/f10b 0 2026-03-10T06:22:36.479 INFO:tasks.workunit.client.1.vm06.stdout:4/985: write dd/d24/d5d/fe6 [685802,8873] 0 2026-03-10T06:22:36.479 INFO:tasks.workunit.client.1.vm06.stdout:9/972: fsync d21/da2/da7/d93/dda/df4/d106/fff 0 2026-03-10T06:22:36.481 INFO:tasks.workunit.client.1.vm06.stdout:9/973: truncate d21/da2/da7/d93/dda/df4/d106/d129/f121 398842 0 2026-03-10T06:22:36.482 INFO:tasks.workunit.client.1.vm06.stdout:9/974: write d21/da2/de6/f2e [6537596,54481] 0 2026-03-10T06:22:36.485 INFO:tasks.workunit.client.1.vm06.stdout:3/959: creat d6/d4f/f14c x:0 0 0 2026-03-10T06:22:36.486 INFO:tasks.workunit.client.1.vm06.stdout:3/960: fsync d6/dc/d13/d35/d101/dd0/f13b 0 2026-03-10T06:22:36.487 INFO:tasks.workunit.client.1.vm06.stdout:5/755: getdents d8/de6 0 2026-03-10T06:22:36.493 INFO:tasks.workunit.client.1.vm06.stdout:8/842: mkdir d1/df/d20/d35/dac/d10f 0 2026-03-10T06:22:36.495 INFO:tasks.workunit.client.1.vm06.stdout:4/986: rmdir dd/d72 39 2026-03-10T06:22:36.496 INFO:tasks.workunit.client.1.vm06.stdout:8/843: dwrite d1/df/f56 [0,4194304] 0 2026-03-10T06:22:36.513 INFO:tasks.workunit.client.1.vm06.stdout:3/961: unlink d6/dc/d13/d35/d101/d88/f7d 0 2026-03-10T06:22:36.532 INFO:tasks.workunit.client.1.vm06.stdout:9/975: write d21/f10d [396518,11299] 0 2026-03-10T06:22:36.535 INFO:tasks.workunit.client.1.vm06.stdout:5/756: dwrite d8/db/d54/d8a/d39/fae [0,4194304] 0 2026-03-10T06:22:36.561 INFO:tasks.workunit.client.1.vm06.stdout:2/825: rename da/d13/d1c/d43/f79 to da/f10c 0 2026-03-10T06:22:36.570 INFO:tasks.workunit.client.1.vm06.stdout:9/976: truncate d21/d46/fe0 528685 0 
2026-03-10T06:22:36.587 INFO:tasks.workunit.client.1.vm06.stdout:4/987: rename dd/d33/d47/d97/db6/dd7/fd9 to dd/d24/d2d/d2f/d34/d40/d10d/d75/dfd/f121 0 2026-03-10T06:22:36.588 INFO:tasks.workunit.client.1.vm06.stdout:4/988: chown dd/d33/l7b 12904195 1 2026-03-10T06:22:36.593 INFO:tasks.workunit.client.1.vm06.stdout:4/989: sync 2026-03-10T06:22:36.615 INFO:tasks.workunit.client.1.vm06.stdout:9/977: write d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/d9b/dd0/ff9 [242437,9601] 0 2026-03-10T06:22:36.619 INFO:tasks.workunit.client.1.vm06.stdout:8/844: link d1/df/d58/f6e d1/df/d20/f110 0 2026-03-10T06:22:36.628 INFO:tasks.workunit.client.1.vm06.stdout:3/962: mknod d6/dc/d13/d35/d101/d88/c14d 0 2026-03-10T06:22:36.630 INFO:tasks.workunit.client.1.vm06.stdout:2/826: mkdir da/d13/d1a/dc7/daf/d10d 0 2026-03-10T06:22:36.634 INFO:tasks.workunit.client.1.vm06.stdout:5/757: creat d8/db/d54/fe8 x:0 0 0 2026-03-10T06:22:36.638 INFO:tasks.workunit.client.1.vm06.stdout:9/978: dread d21/d32/d4d/d51/f5c [0,4194304] 0 2026-03-10T06:22:36.639 INFO:tasks.workunit.client.1.vm06.stdout:9/979: dread - d21/d32/d4d/dd2/fd5 zero size 2026-03-10T06:22:36.643 INFO:tasks.workunit.client.1.vm06.stdout:8/845: rmdir d1/d2c 39 2026-03-10T06:22:36.646 INFO:tasks.workunit.client.1.vm06.stdout:3/963: symlink d6/d4f/l14e 0 2026-03-10T06:22:36.648 INFO:tasks.workunit.client.1.vm06.stdout:9/980: dread d21/d27/d3a/fbb [0,4194304] 0 2026-03-10T06:22:36.649 INFO:tasks.workunit.client.1.vm06.stdout:9/981: read d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/f102 [575257,120853] 0 2026-03-10T06:22:36.658 INFO:tasks.workunit.client.1.vm06.stdout:2/827: rename da/d13/d1c/fef to da/d13/d1c/d7d/f10e 0 2026-03-10T06:22:36.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.668+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f25a4061190 con 0x7f25900774f0 2026-03-10T06:22:36.670 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.669+0000 7f259affd700 1 -- 192.168.123.104:0/1007833026 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f25a4061190 con 0x7f25900774f0 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.672+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 msgr2=0x7f25900799a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.672+0000 7f2598ff9700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 0x7f25900799a0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f25a0009d20 tx=0x7f25a0009450 comp rx=0 tx=0).stop 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.672+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 msgr2=0x7f25a41a0850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.672+0000 7f2598ff9700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f259c008f00 tx=0x7f259c008fe0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 shutdown_connections 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 --2- 192.168.123.104:0/1007833026 >> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f25900774f0 0x7f25900799a0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25a4071db0 0x7f25a41a0850 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 --2- 192.168.123.104:0/1007833026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25a4107d50 0x7f25a41a0d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 >> 192.168.123.104:0/1007833026 conn(0x7f25a406d3e0 msgr2=0x7f25a410a060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 shutdown_connections 2026-03-10T06:22:36.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.673+0000 7f2598ff9700 1 -- 192.168.123.104:0/1007833026 wait complete. 
2026-03-10T06:22:36.677 INFO:tasks.workunit.client.1.vm06.stdout:8/846: creat d1/d3b/db3/f111 x:0 0 0 2026-03-10T06:22:36.711 INFO:tasks.workunit.client.1.vm06.stdout:9/982: write d21/da2/de6/f2f [3925173,40202] 0 2026-03-10T06:22:36.717 INFO:tasks.workunit.client.1.vm06.stdout:2/828: unlink da/d13/d1a/dc7/lec 0 2026-03-10T06:22:36.717 INFO:tasks.workunit.client.1.vm06.stdout:2/829: stat da/d13/fe4 0 2026-03-10T06:22:36.719 INFO:tasks.workunit.client.1.vm06.stdout:5/758: mkdir d8/d9/ddd/de9 0 2026-03-10T06:22:36.727 INFO:tasks.workunit.client.1.vm06.stdout:2/830: dread da/d13/d1a/dc7/daf/ff4 [0,4194304] 0 2026-03-10T06:22:36.733 INFO:tasks.workunit.client.1.vm06.stdout:3/964: creat d6/dc/de5/d11d/f14f x:0 0 0 2026-03-10T06:22:36.737 INFO:tasks.workunit.client.1.vm06.stdout:8/847: write d1/df/d20/d21/f82 [960967,67043] 0 2026-03-10T06:22:36.741 INFO:tasks.workunit.client.1.vm06.stdout:9/983: symlink d21/da2/da7/d93/dda/l139 0 2026-03-10T06:22:36.753 INFO:tasks.workunit.client.1.vm06.stdout:4/990: link dd/d33/d47/d97/db6/fea dd/d24/d2d/d2f/f122 0 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.778+0000 7f3c5f59e700 1 -- 192.168.123.104:0/2010690062 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 msgr2=0x7f3c60103b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.778+0000 7f3c5f59e700 1 --2- 192.168.123.104:0/2010690062 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 0x7f3c60103b20 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7f3c50009b00 tx=0x7f3c50009e10 comp rx=0 tx=0).stop 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 -- 192.168.123.104:0/2010690062 shutdown_connections 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 --2- 
192.168.123.104:0/2010690062 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 0x7f3c60103b20 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 --2- 192.168.123.104:0/2010690062 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c601028e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 -- 192.168.123.104:0/2010690062 >> 192.168.123.104:0/2010690062 conn(0x7f3c600fda60 msgr2=0x7f3c600ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 -- 192.168.123.104:0/2010690062 shutdown_connections 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 -- 192.168.123.104:0/2010690062 wait complete. 
2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 Processor -- start 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.779+0000 7f3c5f59e700 1 -- start start 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5f59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 0x7f3c60198250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c60198870 con 0x7f3c601036d0 2026-03-10T06:22:36.780 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5f59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c6019d280 con 0x7f3c601024d0 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:35774/0 (socket says 192.168.123.104:35774) 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.780+0000 7f3c5e59c700 1 -- 192.168.123.104:0/3346442627 learned_addr learned my addr 192.168.123.104:0/3346442627 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.781+0000 7f3c5e59c700 1 -- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 msgr2=0x7f3c60198250 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.781+0000 7f3c5e59c700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 0x7f3c60198250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.781+0000 7f3c5e59c700 1 -- 192.168.123.104:0/3346442627 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c500097e0 con 0x7f3c601024d0 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.781+0000 7f3c5e59c700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f3c5400d900 tx=0x7f3c5400dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.781+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c540041d0 con 0x7f3c601024d0 2026-03-10T06:22:36.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.782+0000 7f3c5f59e700 1 -- 
192.168.123.104:0/3346442627 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3c6019d480 con 0x7f3c601024d0 2026-03-10T06:22:36.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.782+0000 7f3c5f59e700 1 -- 192.168.123.104:0/3346442627 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3c6019d9a0 con 0x7f3c601024d0 2026-03-10T06:22:36.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.783+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3c54004330 con 0x7f3c601024d0 2026-03-10T06:22:36.783 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.783+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c54003de0 con 0x7f3c601024d0 2026-03-10T06:22:36.784 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.784+0000 7f3c5f59e700 1 -- 192.168.123.104:0/3346442627 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3c40005320 con 0x7f3c601024d0 2026-03-10T06:22:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.784+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f3c5403ca60 con 0x7f3c601024d0 2026-03-10T06:22:36.785 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.785+0000 7f3c4f7fe700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 0x7f3c48079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:36.787 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.785+0000 7f3c4f7fe700 1 -- 
192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f3c54021030 con 0x7f3c601024d0 2026-03-10T06:22:36.787 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.786+0000 7f3c5dd9b700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 0x7f3c48079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:36.787 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.787+0000 7f3c5dd9b700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 0x7f3c48079b80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3c50005fd0 tx=0x7f3c50009f90 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:36.791 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.789+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f3c54062280 con 0x7f3c601024d0 2026-03-10T06:22:36.871 INFO:tasks.workunit.client.1.vm06.stdout:8/848: creat d1/d3b/f112 x:0 0 0 2026-03-10T06:22:36.872 INFO:tasks.workunit.client.1.vm06.stdout:8/849: stat d1/df/d20/d21/d5e/f73 0 2026-03-10T06:22:36.872 INFO:tasks.workunit.client.1.vm06.stdout:9/984: rmdir d21/da2/da7/d93/dda/df4/d106/d129/dcd 39 2026-03-10T06:22:36.876 INFO:tasks.workunit.client.1.vm06.stdout:4/991: mknod dd/d41/da9/c123 0 2026-03-10T06:22:36.877 INFO:tasks.workunit.client.1.vm06.stdout:5/759: truncate d8/db/d54/d8a/d39/f44 348684 0 2026-03-10T06:22:36.881 INFO:tasks.workunit.client.1.vm06.stdout:2/831: mkdir da/d13/d1a/dc7/daf/d56/ddd/d10f 0 2026-03-10T06:22:36.883 
INFO:tasks.workunit.client.1.vm06.stdout:8/850: creat d1/df/d20/f113 x:0 0 0 2026-03-10T06:22:36.883 INFO:tasks.workunit.client.1.vm06.stdout:9/985: chown d21/d27/d3a/l4c 51313 1 2026-03-10T06:22:36.891 INFO:tasks.workunit.client.1.vm06.stdout:4/992: fsync dd/d33/d36/fc5 0 2026-03-10T06:22:36.897 INFO:tasks.workunit.client.1.vm06.stdout:8/851: rmdir d1/df 39 2026-03-10T06:22:36.915 INFO:tasks.workunit.client.1.vm06.stdout:4/993: rmdir dd/d24/d2d/d2f/d34/d40/d10d 39 2026-03-10T06:22:36.918 INFO:tasks.workunit.client.1.vm06.stdout:3/965: dwrite d6/dc/d13/d9d/fbb [0,4194304] 0 2026-03-10T06:22:36.935 INFO:tasks.workunit.client.1.vm06.stdout:5/760: fsync d8/db/d54/d8a/d74/f85 0 2026-03-10T06:22:36.941 INFO:tasks.workunit.client.1.vm06.stdout:8/852: dread d1/df/d20/d35/f42 [4194304,4194304] 0 2026-03-10T06:22:36.941 INFO:tasks.workunit.client.1.vm06.stdout:8/853: chown d1/d3b/d5c/f6f 942 1 2026-03-10T06:22:36.942 INFO:tasks.workunit.client.1.vm06.stdout:8/854: dread - d1/df/d11/da1/feb zero size 2026-03-10T06:22:36.958 INFO:tasks.workunit.client.1.vm06.stdout:3/966: write d6/dc/d13/d9d/d54/fea [1559205,51454] 0 2026-03-10T06:22:36.959 INFO:tasks.workunit.client.1.vm06.stdout:5/761: dread - d8/db/fa0 zero size 2026-03-10T06:22:36.959 INFO:tasks.workunit.client.1.vm06.stdout:5/762: chown d8/d9/ddd/de9 14242489 1 2026-03-10T06:22:36.964 INFO:tasks.workunit.client.1.vm06.stdout:9/986: rename d21/d32/d4d/cea to d21/da2/da7/d93/dda/df4/d106/d129/db2/d80/d95/d9b/c13a 0 2026-03-10T06:22:36.968 INFO:tasks.workunit.client.1.vm06.stdout:8/855: dwrite d1/d2c/d99/ddc/ff9 [0,4194304] 0 2026-03-10T06:22:36.973 INFO:tasks.workunit.client.1.vm06.stdout:2/832: mkdir da/d13/d1c/d1d/d110 0 2026-03-10T06:22:36.973 INFO:tasks.workunit.client.1.vm06.stdout:3/967: rename d6/d8/f45 to d6/dc/d13/d133/f150 0 2026-03-10T06:22:36.974 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.973+0000 7f3c5f59e700 1 -- 192.168.123.104:0/3346442627 --> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3c40000bf0 con 0x7f3c480776d0 2026-03-10T06:22:36.974 INFO:tasks.workunit.client.1.vm06.stdout:9/987: dread - d21/d32/d4d/d51/db0/fe3 zero size 2026-03-10T06:22:36.979 INFO:tasks.workunit.client.1.vm06.stdout:4/994: symlink dd/d24/d2d/d2f/d34/d40/d10d/df8/d120/l124 0 2026-03-10T06:22:36.981 INFO:tasks.workunit.client.1.vm06.stdout:4/995: write dd/d24/d2d/d2f/d39/d71/dc3/dd0/de0/db0/deb/d107/fff [747321,58213] 0 2026-03-10T06:22:36.985 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (4m) 10s ago 5m 25.2M - 0.25.0 c8568f914cd2 3d98d9c97afc 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (5m) 10s ago 5m 8334k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (4m) 10s ago 4m 8614k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (5m) 10s ago 5m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (4m) 10s ago 4m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (4m) 10s ago 4m 90.2M - 9.4.7 954c08fa6188 888c399470c8 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (2m) 10s ago 2m 224M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (2m) 10s ago 2m 16.1M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:22:36.986 
INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (2m) 10s ago 2m 16.2M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (2m) 10s ago 2m 232M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (36s) 10s ago 5m 597M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (13s) 10s ago 4m 46.5M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (5m) 10s ago 5m 54.1M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (4m) 10s ago 4m 41.5M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (5m) 10s ago 5m 12.6M - 1.5.0 0da6a335fe13 f563a35e96ab 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 10s ago 4m 15.2M - 1.5.0 0da6a335fe13 3304cc389738 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (4m) 10s ago 4m 183M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (3m) 10s ago 3m 190M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (3m) 10s ago 3m 180M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (3m) 10s ago 3m 262M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (3m) 10s ago 3m 231M 4096M 18.2.0 dc2bc1663786 
dcd395dfe220 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (3m) 10s ago 3m 210M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:22:36.986 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (17s) 10s ago 4m 45.2M - 2.43.0 a07b618ecd1d 225d15e25b4d 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.982+0000 7f3c4f7fe700 1 -- 192.168.123.104:0/3346442627 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f3c40000bf0 con 0x7f3c480776d0 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.985+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 msgr2=0x7f3c48079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.985+0000 7f3c4d7fa700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 0x7f3c48079b80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3c50005fd0 tx=0x7f3c50009f90 comp rx=0 tx=0).stop 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.985+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 msgr2=0x7f3c60197d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.985+0000 7f3c4d7fa700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f3c5400d900 tx=0x7f3c5400dcc0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.987 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 shutdown_connections 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3c601024d0 0x7f3c60197d10 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f3c480776d0 0x7f3c48079b80 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 --2- 192.168.123.104:0/3346442627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3c601036d0 0x7f3c60198250 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 >> 192.168.123.104:0/3346442627 conn(0x7f3c600fda60 msgr2=0x7f3c600ffe30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 shutdown_connections 2026-03-10T06:22:36.987 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:36.986+0000 7f3c4d7fa700 1 -- 192.168.123.104:0/3346442627 wait complete. 
2026-03-10T06:22:37.010 INFO:tasks.workunit.client.1.vm06.stdout:2/833: creat da/d13/d1c/d7d/ddf/d61/f111 x:0 0 0 2026-03-10T06:22:37.010 INFO:tasks.workunit.client.1.vm06.stdout:3/968: dwrite d6/d1a/fb9 [0,4194304] 0 2026-03-10T06:22:37.011 INFO:tasks.workunit.client.1.vm06.stdout:8/856: dwrite d1/f13 [0,4194304] 0 2026-03-10T06:22:37.024 INFO:tasks.workunit.client.1.vm06.stdout:9/988: truncate d21/da2/de6/f2f 2775428 0 2026-03-10T06:22:37.024 INFO:tasks.workunit.client.1.vm06.stdout:4/996: symlink dd/d24/d2d/d2f/d39/d71/dc3/l125 0 2026-03-10T06:22:37.024 INFO:tasks.workunit.client.1.vm06.stdout:5/763: link d8/db/d54/d8a/d74/d90/lb0 d8/db/d54/lea 0 2026-03-10T06:22:37.029 INFO:tasks.workunit.client.1.vm06.stdout:8/857: stat d1/df/d20/d21/d7e/d8d/f9c 0 2026-03-10T06:22:37.029 INFO:tasks.workunit.client.1.vm06.stdout:8/858: stat d1/df/d11/ff1 0 2026-03-10T06:22:37.030 INFO:tasks.workunit.client.1.vm06.stdout:3/969: truncate d6/dc/d13/d35/f110 1796 0 2026-03-10T06:22:37.034 INFO:tasks.workunit.client.1.vm06.stdout:2/834: rename da/d13/d1a/dc7/dc5 to da/d13/d1c/d1d/d44/d46/de2/d112 0 2026-03-10T06:22:37.036 INFO:tasks.workunit.client.1.vm06.stdout:5/764: stat d8/db/d54/d55/c7b 0 2026-03-10T06:22:37.045 INFO:tasks.workunit.client.1.vm06.stdout:8/859: mkdir d1/d3b/da9/d114 0 2026-03-10T06:22:37.046 INFO:tasks.workunit.client.1.vm06.stdout:3/970: rmdir d6/dc/de5/d13d 39 2026-03-10T06:22:37.047 INFO:tasks.workunit.client.1.vm06.stdout:2/835: write da/d13/d1a/dc7/daf/d56/db9/feb [668461,31038] 0 2026-03-10T06:22:37.048 INFO:tasks.workunit.client.1.vm06.stdout:2/836: fsync da/d13/d5e/f9e 0 2026-03-10T06:22:37.050 INFO:tasks.workunit.client.1.vm06.stdout:8/860: dread d1/d2c/d99/ddc/ff9 [0,4194304] 0 2026-03-10T06:22:37.064 INFO:tasks.workunit.client.1.vm06.stdout:9/989: write d21/da2/da7/d93/dda/df4/d106/d129/fb7 [4103771,45977] 0 2026-03-10T06:22:37.080 INFO:tasks.workunit.client.1.vm06.stdout:9/990: sync 2026-03-10T06:22:37.082 
INFO:tasks.workunit.client.1.vm06.stdout:5/765: creat d8/db/d54/d8a/d39/d6c/de7/feb x:0 0 0 2026-03-10T06:22:37.084 INFO:tasks.workunit.client.1.vm06.stdout:3/971: dwrite d6/dc/d13/d35/d101/d88/dae/dec/d117/f130 [0,4194304] 0 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 -- 192.168.123.104:0/1982882763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc072360 msgr2=0x7f0bfc0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 --2- 192.168.123.104:0/1982882763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc072360 0x7f0bfc0770e0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7f0bf400d3f0 tx=0x7f0bf400d700 comp rx=0 tx=0).stop 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 -- 192.168.123.104:0/1982882763 shutdown_connections 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 --2- 192.168.123.104:0/1982882763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc072360 0x7f0bfc0770e0 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 --2- 192.168.123.104:0/1982882763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.103 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.102+0000 7f0c00a05700 1 -- 192.168.123.104:0/1982882763 >> 192.168.123.104:0/1982882763 conn(0x7f0bfc06d1a0 msgr2=0x7f0bfc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:37.105 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.103+0000 7f0c00a05700 1 -- 192.168.123.104:0/1982882763 shutdown_connections 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.103+0000 7f0c00a05700 1 -- 192.168.123.104:0/1982882763 wait complete. 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.103+0000 7f0c00a05700 1 Processor -- start 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.103+0000 7f0c00a05700 1 -- start start 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.104+0000 7f0c00a05700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.104+0000 7f0c00a05700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc131900 0x7f0bfc07f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.104+0000 7f0c00a05700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bfc131e00 con 0x7f0bfc131900 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.104+0000 7f0c00a05700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bfc131f70 con 0x7f0bfc071980 2026-03-10T06:22:37.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.105+0000 7f0bfa59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:37.105 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.105+0000 7f0bfa59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:35798/0 (socket says 192.168.123.104:35798) 2026-03-10T06:22:37.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.105+0000 7f0bfa59c700 1 -- 192.168.123.104:0/2785238524 learned_addr learned my addr 192.168.123.104:0/2785238524 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:37.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.106+0000 7f0bfa59c700 1 -- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc131900 msgr2=0x7f0bfc07f590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.106+0000 7f0bfa59c700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc131900 0x7f0bfc07f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.106+0000 7f0bfa59c700 1 -- 192.168.123.104:0/2785238524 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0bf4007ed0 con 0x7f0bfc071980 2026-03-10T06:22:37.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.107+0000 7f0bfa59c700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f0bec00d8d0 tx=0x7f0bec00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:37.109 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.108+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bec009940 con 0x7f0bfc071980 2026-03-10T06:22:37.109 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.108+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0bfc07fb30 con 0x7f0bfc071980 2026-03-10T06:22:37.109 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.108+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0bfc080050 con 0x7f0bfc071980 2026-03-10T06:22:37.109 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.109+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0bec010460 con 0x7f0bfc071980 2026-03-10T06:22:37.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.109+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bec00f5d0 con 0x7f0bfc071980 2026-03-10T06:22:37.111 INFO:tasks.workunit.client.1.vm06.stdout:4/997: getdents dd 0 2026-03-10T06:22:37.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.110+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bfc12b500 con 0x7f0bfc071980 2026-03-10T06:22:37.113 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.112+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f0bec0105d0 con 0x7f0bfc071980 2026-03-10T06:22:37.113 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.113+0000 7f0beb7fe700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 0x7f0be4079c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.114 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.113+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f0bec099b20 con 0x7f0bfc071980 2026-03-10T06:22:37.115 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.115+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f0bec061fe0 con 0x7f0bfc071980 2026-03-10T06:22:37.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.116+0000 7f0bf9d9b700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 0x7f0be4079c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:37.123 INFO:tasks.workunit.client.1.vm06.stdout:9/991: rename d21/da2/da7/d93/dda/df4/d106/fff to d21/da2/da7/d93/dda/df4/d106/d129/db2/d7f/f13b 0 2026-03-10T06:22:37.132 INFO:tasks.workunit.client.1.vm06.stdout:3/972: fdatasync d6/f53 0 2026-03-10T06:22:37.139 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.139+0000 7f0bf9d9b700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 0x7f0be4079c30 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f0bfc072ff0 tx=0x7f0bf400db00 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:22:37.144 INFO:tasks.workunit.client.1.vm06.stdout:8/861: getdents d1/d3b/da9/d114 0 2026-03-10T06:22:37.145 INFO:tasks.workunit.client.1.vm06.stdout:9/992: mknod d21/da2/da7/d93/dda/c13c 0 2026-03-10T06:22:37.146 INFO:tasks.workunit.client.1.vm06.stdout:4/998: dread dd/d24/d2d/f3b [0,4194304] 0 2026-03-10T06:22:37.164 INFO:tasks.workunit.client.1.vm06.stdout:5/766: truncate d8/d9/f11 2203504 0 2026-03-10T06:22:37.176 INFO:tasks.workunit.client.1.vm06.stdout:8/862: dread d1/df/d11/da1/f10b [0,4194304] 0 2026-03-10T06:22:37.182 INFO:tasks.workunit.client.1.vm06.stdout:3/973: dread d6/dc/d41/d6d/fce [0,4194304] 0 2026-03-10T06:22:37.194 INFO:tasks.workunit.client.1.vm06.stdout:4/999: write dd/d33/f37 [2207809,44573] 0 2026-03-10T06:22:37.196 INFO:tasks.workunit.client.1.vm06.stdout:2/837: getdents da/d13/d1a/dc7/d86 0 2026-03-10T06:22:37.196 INFO:tasks.workunit.client.1.vm06.stdout:2/838: chown da/d13/d1a/l92 7803 1 2026-03-10T06:22:37.196 INFO:tasks.workunit.client.1.vm06.stdout:2/839: stat da/d13/d1c/d7d/ddf/f67 0 2026-03-10T06:22:37.197 INFO:tasks.workunit.client.1.vm06.stdout:2/840: dread - da/d13/d1c/d7d/fe8 zero size 2026-03-10T06:22:37.201 INFO:tasks.workunit.client.1.vm06.stdout:3/974: dread d6/dc/d13/d35/f110 [0,4194304] 0 2026-03-10T06:22:37.207 INFO:tasks.workunit.client.1.vm06.stdout:2/841: fsync da/d13/d1a/dc7/daf/d56/f58 0 2026-03-10T06:22:37.216 INFO:tasks.workunit.client.1.vm06.stdout:8/863: symlink d1/d7/dfb/l115 0 2026-03-10T06:22:37.224 INFO:tasks.workunit.client.1.vm06.stdout:9/993: link d21/da2/da7/d93/dda/c7e d21/d32/d4d/dd2/c13d 0 2026-03-10T06:22:37.232 INFO:tasks.workunit.client.1.vm06.stdout:3/975: dwrite d6/dc/d13/f8b [0,4194304] 0 2026-03-10T06:22:37.236 INFO:tasks.workunit.client.1.vm06.stdout:3/976: dread d6/d8/d7f/fe6 [0,4194304] 0 2026-03-10T06:22:37.266 INFO:tasks.workunit.client.1.vm06.stdout:9/994: rmdir d21/d32 39 2026-03-10T06:22:37.266 INFO:tasks.workunit.client.1.vm06.stdout:9/995: stat d21/da2/da7/f96 0 
2026-03-10T06:22:37.277 INFO:tasks.workunit.client.1.vm06.stdout:2/842: symlink da/d13/d1c/d43/l113 0 2026-03-10T06:22:37.283 INFO:tasks.workunit.client.1.vm06.stdout:5/767: getdents d8/db/d54/d67/d46/d6e/da2 0 2026-03-10T06:22:37.283 INFO:tasks.workunit.client.1.vm06.stdout:5/768: stat d8/d9 0 2026-03-10T06:22:37.286 INFO:tasks.workunit.client.1.vm06.stdout:8/864: creat d1/d7/df8/d103/f116 x:0 0 0 2026-03-10T06:22:37.289 INFO:tasks.workunit.client.1.vm06.stdout:9/996: creat d21/da2/f13e x:0 0 0 2026-03-10T06:22:37.292 INFO:tasks.workunit.client.1.vm06.stdout:3/977: symlink d6/d1a/d5b/l151 0 2026-03-10T06:22:37.294 INFO:tasks.workunit.client.1.vm06.stdout:5/769: rename d8/db/fa0 to d8/de6/fec 0 2026-03-10T06:22:37.297 INFO:tasks.workunit.client.1.vm06.stdout:3/978: read d6/d1a/f1f [926138,113554] 0 2026-03-10T06:22:37.319 INFO:tasks.workunit.client.1.vm06.stdout:2/843: dwrite da/f10c [0,4194304] 0 2026-03-10T06:22:37.330 INFO:tasks.workunit.client.1.vm06.stdout:2/844: dwrite da/d13/d1a/f101 [0,4194304] 0 2026-03-10T06:22:37.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.359+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0bfc04ea50 con 0x7f0bfc071980 2026-03-10T06:22:37.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.360+0000 7f0beb7fe700 1 -- 192.168.123.104:0/2785238524 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f0bec061e00 con 0x7f0bfc071980 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:22:37.362 
INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:22:37.362 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 msgr2=0x7f0be4079c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 0x7f0be4079c30 secure :-1 s=READY 
pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f0bfc072ff0 tx=0x7f0bf400db00 comp rx=0 tx=0).stop 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 msgr2=0x7f0bfc1313c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f0bec00d8d0 tx=0x7f0bec00dc90 comp rx=0 tx=0).stop 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 shutdown_connections 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bfc071980 0x7f0bfc1313c0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.365 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0be4077780 0x7f0be4079c30 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 --2- 192.168.123.104:0/2785238524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0bfc131900 0x7f0bfc07f590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 
192.168.123.104:0/2785238524 >> 192.168.123.104:0/2785238524 conn(0x7f0bfc06d1a0 msgr2=0x7f0bfc0764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:37.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 shutdown_connections 2026-03-10T06:22:37.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.365+0000 7f0c00a05700 1 -- 192.168.123.104:0/2785238524 wait complete. 2026-03-10T06:22:37.374 INFO:tasks.workunit.client.1.vm06.stdout:8/865: creat d1/d2c/d99/f117 x:0 0 0 2026-03-10T06:22:37.381 INFO:tasks.workunit.client.1.vm06.stdout:5/770: write d8/db/d54/d8a/d74/f2f [1818728,3196] 0 2026-03-10T06:22:37.381 INFO:tasks.workunit.client.1.vm06.stdout:9/997: write d21/da2/da7/d93/dda/df4/d106/d129/db2/d7f/f13b [696766,122417] 0 2026-03-10T06:22:37.383 INFO:tasks.workunit.client.1.vm06.stdout:3/979: write d6/dc/d13/d9d/f57 [1582301,42103] 0 2026-03-10T06:22:37.383 INFO:tasks.workunit.client.1.vm06.stdout:2/845: rename da/d13/d1a/dc7/daf/d56/db9/d9b/laa to da/d13/d1a/dc7/daf/d56/dd4/l114 0 2026-03-10T06:22:37.394 INFO:tasks.workunit.client.1.vm06.stdout:8/866: mkdir d1/df/d20/d21/d5e/d79/d118 0 2026-03-10T06:22:37.398 INFO:tasks.workunit.client.1.vm06.stdout:5/771: truncate d8/db/fbc 1440345 0 2026-03-10T06:22:37.403 INFO:tasks.workunit.client.1.vm06.stdout:5/772: sync 2026-03-10T06:22:37.407 INFO:tasks.workunit.client.1.vm06.stdout:3/980: dwrite d6/dc/d13/d9d/d54/fdf [0,4194304] 0 2026-03-10T06:22:37.408 INFO:tasks.workunit.client.1.vm06.stdout:9/998: symlink d21/d32/d4d/d51/dcb/l13f 0 2026-03-10T06:22:37.409 INFO:tasks.workunit.client.1.vm06.stdout:9/999: sync 2026-03-10T06:22:37.428 INFO:tasks.workunit.client.1.vm06.stdout:8/867: symlink d1/df/d20/d35/dac/dbf/l119 0 2026-03-10T06:22:37.448 INFO:tasks.workunit.client.1.vm06.stdout:3/981: truncate d6/dc/f69 2292971 0 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.454+0000 7f74df8c4700 1 -- 
192.168.123.104:0/585211801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8072360 msgr2=0x7f74d80770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.454+0000 7f74df8c4700 1 --2- 192.168.123.104:0/585211801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8072360 0x7f74d80770e0 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f74d000d3f0 tx=0x7f74d000d700 comp rx=0 tx=0).stop 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 -- 192.168.123.104:0/585211801 shutdown_connections 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 --2- 192.168.123.104:0/585211801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8072360 0x7f74d80770e0 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 --2- 192.168.123.104:0/585211801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 0x7f74d8071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 -- 192.168.123.104:0/585211801 >> 192.168.123.104:0/585211801 conn(0x7f74d806d1a0 msgr2=0x7f74d806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:37.456 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 -- 192.168.123.104:0/585211801 shutdown_connections 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.455+0000 7f74df8c4700 1 -- 192.168.123.104:0/585211801 wait complete. 
2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 Processor -- start 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 -- start start 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 0x7f74d8131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74d8131d90 con 0x7f74d8131890 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74df8c4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74d8131ed0 con 0x7f74d8071980 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74dce5f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74dce5f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:51080/0 (socket says 192.168.123.104:51080) 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74dce5f700 1 -- 192.168.123.104:0/4108799228 learned_addr learned my addr 192.168.123.104:0/4108799228 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:22:37.457 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.456+0000 7f74dd660700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 0x7f74d8131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:37.458 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.457+0000 7f74dce5f700 1 -- 192.168.123.104:0/4108799228 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 msgr2=0x7f74d8131350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.458 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.457+0000 7f74dce5f700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 0x7f74d8131350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.458 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.457+0000 7f74dce5f700 1 -- 192.168.123.104:0/4108799228 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74d0007ed0 con 0x7f74d8131890 2026-03-10T06:22:37.458 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.457+0000 7f74dce5f700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7f74d0003c30 tx=0x7f74d0003c60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:22:37.458 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.458+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74d001c070 con 0x7f74d8131890 2026-03-10T06:22:37.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.458+0000 7f74df8c4700 1 -- 192.168.123.104:0/4108799228 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74d807fa60 con 0x7f74d8131890 2026-03-10T06:22:37.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.458+0000 7f74df8c4700 1 -- 192.168.123.104:0/4108799228 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74d807ff00 con 0x7f74d8131890 2026-03-10T06:22:37.459 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.458+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f74d000fb40 con 0x7f74d8131890 2026-03-10T06:22:37.460 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.460+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74d0017c60 con 0x7f74d8131890 2026-03-10T06:22:37.461 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.460+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f74d002a430 con 0x7f74d8131890 2026-03-10T06:22:37.461 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.461+0000 7f74ce7fc700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 0x7f74c4079d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:22:37.461 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.461+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f74d0013070 con 0x7f74d8131890 2026-03-10T06:22:37.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.461+0000 7f74df8c4700 1 -- 192.168.123.104:0/4108799228 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f74bc005320 con 0x7f74d8131890 2026-03-10T06:22:37.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.464+0000 7f74dd660700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 0x7f74c4079d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:22:37.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.464+0000 7f74dd660700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 0x7f74c4079d10 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f74d4009c80 tx=0x7f74d4009400 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:22:37.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.466+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f74d0064e50 con 0x7f74d8131890 2026-03-10T06:22:37.478 INFO:tasks.workunit.client.1.vm06.stdout:8/868: fsync d1/df/d20/d21/f69 0 2026-03-10T06:22:37.489 INFO:tasks.workunit.client.1.vm06.stdout:3/982: dwrite d6/dc/d13/d35/d101/d88/dae/f103 [0,4194304] 0 2026-03-10T06:22:37.505 INFO:tasks.workunit.client.1.vm06.stdout:2/846: link 
da/d13/d1c/d1d/cae da/da8/c115 0 2026-03-10T06:22:37.522 INFO:tasks.workunit.client.1.vm06.stdout:3/983: symlink d6/d8/d7f/da1/l152 0 2026-03-10T06:22:37.522 INFO:tasks.workunit.client.1.vm06.stdout:3/984: fsync d6/dc/f1d 0 2026-03-10T06:22:37.528 INFO:tasks.workunit.client.1.vm06.stdout:2/847: readlink da/da8/le5 0 2026-03-10T06:22:37.530 INFO:tasks.workunit.client.1.vm06.stdout:5/773: getdents d8/db/d54/d8a/d39/d6c 0 2026-03-10T06:22:37.532 INFO:tasks.workunit.client.1.vm06.stdout:5/774: write d8/db/d54/d55/d80/fdc [41884,55490] 0 2026-03-10T06:22:37.535 INFO:tasks.workunit.client.1.vm06.stdout:2/848: dwrite da/d13/d1a/d39/d35/f103 [0,4194304] 0 2026-03-10T06:22:37.548 INFO:tasks.workunit.client.1.vm06.stdout:8/869: getdents d1/d2c/d99/d101 0 2026-03-10T06:22:37.559 INFO:tasks.workunit.client.1.vm06.stdout:3/985: mkdir d6/dc/d13/d133/d153 0 2026-03-10T06:22:37.559 INFO:tasks.workunit.client.1.vm06.stdout:3/986: fsync d6/dc/d13/d9d/fbb 0 2026-03-10T06:22:37.565 INFO:tasks.workunit.client.1.vm06.stdout:5/775: creat d8/d9/ddd/fed x:0 0 0 2026-03-10T06:22:37.578 INFO:tasks.workunit.client.1.vm06.stdout:2/849: creat da/d13/d1c/d1d/d44/dc4/f116 x:0 0 0 2026-03-10T06:22:37.579 INFO:tasks.workunit.client.1.vm06.stdout:8/870: truncate d1/df/d20/d21/d5e/d79/fda 756774 0 2026-03-10T06:22:37.592 INFO:tasks.workunit.client.1.vm06.stdout:3/987: truncate d6/dc/d13/d35/d101/dd0/dd1/d90/fa5 865351 0 2026-03-10T06:22:37.592 INFO:tasks.workunit.client.1.vm06.stdout:3/988: readlink d6/dc/d13/d35/d101/dd0/dd1/l75 0 2026-03-10T06:22:37.594 INFO:tasks.workunit.client.1.vm06.stdout:5/776: creat d8/d9/fee x:0 0 0 2026-03-10T06:22:37.594 INFO:tasks.workunit.client.1.vm06.stdout:5/777: chown d8/db/d54/d55/f61 937 1 2026-03-10T06:22:37.605 INFO:tasks.workunit.client.1.vm06.stdout:5/778: link d8/d9/fee d8/d9/ddd/fef 0 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.606+0000 7f74df8c4700 1 -- 192.168.123.104:0/4108799228 --> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f74bc000bf0 con 0x7f74c4077860 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.609+0000 7f74ce7fc700 1 -- 192.168.123.104:0/4108799228 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f74bc000bf0 con 0x7f74c4077860 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T06:22:37.619 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout: "mgr" 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "2/2 daemons upgraded", 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading node-exporter daemons", 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:22:37.620 INFO:tasks.workunit.client.1.vm06.stdout:5/779: write d8/db/d54/d67/d46/fb9 [2971253,17553] 0 2026-03-10T06:22:37.620 INFO:tasks.workunit.client.1.vm06.stdout:5/780: fsync d8/db/d54/d8a/d74/f37 0 2026-03-10T06:22:37.620 INFO:tasks.workunit.client.1.vm06.stdout:5/781: dread d8/db/d54/d8a/fc7 [0,4194304] 0 2026-03-10T06:22:37.620 INFO:tasks.workunit.client.1.vm06.stdout:8/871: dread d1/d2c/d90/fcb [0,4194304] 0 
2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.615+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 msgr2=0x7f74c4079d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.615+0000 7f74c3fff700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 0x7f74c4079d10 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f74d4009c80 tx=0x7f74d4009400 comp rx=0 tx=0).stop 2026-03-10T06:22:37.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:37 vm04.local ceph-mon[51058]: pgmap v17: 65 pgs: 65 active+clean; 2.0 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 66 MiB/s rd, 176 MiB/s wr, 403 op/s 2026-03-10T06:22:37.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:37 vm04.local ceph-mon[51058]: from='client.14676 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:37.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:37 vm04.local ceph-mon[51058]: from='client.14680 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.615+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 msgr2=0x7f74d807f520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.615+0000 7f74c3fff700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7f74d0003c30 tx=0x7f74d0003c60 comp 
rx=0 tx=0).stop 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 shutdown_connections 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74d8071980 0x7f74d8131350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f74c4077860 0x7f74c4079d10 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 --2- 192.168.123.104:0/4108799228 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74d8131890 0x7f74d807f520 unknown :-1 s=CLOSED pgs=335 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 >> 192.168.123.104:0/4108799228 conn(0x7f74d806d1a0 msgr2=0x7f74d8076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 shutdown_connections 2026-03-10T06:22:37.620 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:22:37.618+0000 7f74c3fff700 1 -- 192.168.123.104:0/4108799228 wait complete. 
2026-03-10T06:22:37.621 INFO:tasks.workunit.client.1.vm06.stdout:8/872: write d1/df/d11/da1/fb6 [774779,67289] 0 2026-03-10T06:22:37.624 INFO:tasks.workunit.client.1.vm06.stdout:2/850: dread f8 [0,4194304] 0 2026-03-10T06:22:37.626 INFO:tasks.workunit.client.1.vm06.stdout:5/782: getdents d8/db/d57/d83 0 2026-03-10T06:22:37.633 INFO:tasks.workunit.client.1.vm06.stdout:5/783: mknod d8/db/d7e/de1/cf0 0 2026-03-10T06:22:37.661 INFO:tasks.workunit.client.1.vm06.stdout:2/851: creat da/d13/d1a/f117 x:0 0 0 2026-03-10T06:22:37.661 INFO:tasks.workunit.client.1.vm06.stdout:2/852: mkdir da/d13/d107/d118 0 2026-03-10T06:22:37.702 INFO:tasks.workunit.client.1.vm06.stdout:8/873: write d1/d3b/db3/fcc [922361,108812] 0 2026-03-10T06:22:37.702 INFO:tasks.workunit.client.1.vm06.stdout:5/784: write d8/db/d54/d8a/d74/f85 [3666831,102234] 0 2026-03-10T06:22:37.705 INFO:tasks.workunit.client.1.vm06.stdout:8/874: write d1/d7/df8/d103/f116 [200808,105867] 0 2026-03-10T06:22:37.710 INFO:tasks.workunit.client.1.vm06.stdout:3/989: dwrite d6/d1a/d5b/dbd/fc2 [0,4194304] 0 2026-03-10T06:22:37.710 INFO:tasks.workunit.client.1.vm06.stdout:8/875: mkdir d1/d7/dee/d11a 0 2026-03-10T06:22:37.710 INFO:tasks.workunit.client.1.vm06.stdout:2/853: dwrite da/d13/d1a/f27 [0,4194304] 0 2026-03-10T06:22:37.728 INFO:tasks.workunit.client.1.vm06.stdout:8/876: rename d1/f3a to d1/d3b/d5c/de3/f11b 0 2026-03-10T06:22:37.738 INFO:tasks.workunit.client.1.vm06.stdout:3/990: dwrite d6/dc/d13/d35/d101/d88/dde/f134 [0,4194304] 0 2026-03-10T06:22:37.745 INFO:tasks.workunit.client.1.vm06.stdout:3/991: dread d6/d1a/fb9 [0,4194304] 0 2026-03-10T06:22:37.755 INFO:tasks.workunit.client.1.vm06.stdout:5/785: write d8/db/f45 [2827337,125522] 0 2026-03-10T06:22:37.757 INFO:tasks.workunit.client.1.vm06.stdout:2/854: dwrite da/d13/f5b [0,4194304] 0 2026-03-10T06:22:37.791 INFO:tasks.workunit.client.1.vm06.stdout:8/877: creat d1/d7/df8/d103/f11c x:0 0 0 2026-03-10T06:22:37.795 INFO:tasks.workunit.client.1.vm06.stdout:5/786: mkdir 
d8/db/d54/d8a/d74/d90/df1 0 2026-03-10T06:22:37.799 INFO:tasks.workunit.client.1.vm06.stdout:2/855: rename da/d13/d5e/f8f to da/d13/d1a/dc7/d86/f119 0 2026-03-10T06:22:37.811 INFO:tasks.workunit.client.1.vm06.stdout:3/992: mkdir d6/dc/de5/d154 0 2026-03-10T06:22:37.815 INFO:tasks.workunit.client.1.vm06.stdout:8/878: creat d1/df/d11/da1/dd2/f11d x:0 0 0 2026-03-10T06:22:37.819 INFO:tasks.workunit.client.1.vm06.stdout:5/787: unlink d8/db/d54/d8a/d39/d6c/fdf 0 2026-03-10T06:22:37.822 INFO:tasks.workunit.client.1.vm06.stdout:2/856: dwrite da/d13/d1c/d7d/ddf/f67 [0,4194304] 0 2026-03-10T06:22:37.826 INFO:tasks.workunit.client.1.vm06.stdout:3/993: mknod d6/d1a/c155 0 2026-03-10T06:22:37.827 INFO:tasks.workunit.client.1.vm06.stdout:3/994: chown d6/dc/f1d 2240 1 2026-03-10T06:22:37.841 INFO:tasks.workunit.client.1.vm06.stdout:2/857: creat da/d13/d1c/d1d/d44/d46/de2/d112/f11a x:0 0 0 2026-03-10T06:22:37.841 INFO:tasks.workunit.client.1.vm06.stdout:2/858: chown da/d13/d1c/d7d/ddf/f78 2 1 2026-03-10T06:22:37.842 INFO:tasks.workunit.client.1.vm06.stdout:2/859: chown da/ff 15374 1 2026-03-10T06:22:37.842 INFO:tasks.workunit.client.1.vm06.stdout:2/860: write da/d13/f1f [2414984,41324] 0 2026-03-10T06:22:37.844 INFO:tasks.workunit.client.1.vm06.stdout:2/861: fsync da/d13/d1a/dc7/daf/d56/db7/f104 0 2026-03-10T06:22:37.848 INFO:tasks.workunit.client.1.vm06.stdout:2/862: dread da/d13/d1a/dc7/daf/d56/db9/fd1 [0,4194304] 0 2026-03-10T06:22:37.851 INFO:tasks.workunit.client.1.vm06.stdout:8/879: dwrite d1/df/f6b [0,4194304] 0 2026-03-10T06:22:37.857 INFO:tasks.workunit.client.1.vm06.stdout:3/995: symlink d6/dc/de5/d13d/l156 0 2026-03-10T06:22:37.860 INFO:tasks.workunit.client.1.vm06.stdout:3/996: truncate d6/dc/d13/fca 4815304 0 2026-03-10T06:22:37.864 INFO:tasks.workunit.client.1.vm06.stdout:5/788: creat d8/db/d54/d8a/d74/ff2 x:0 0 0 2026-03-10T06:22:37.868 INFO:tasks.workunit.client.1.vm06.stdout:8/880: dread d1/df/d20/d21/d5e/f73 [0,4194304] 0 2026-03-10T06:22:37.869 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:37 vm06.local ceph-mon[58974]: pgmap v17: 65 pgs: 65 active+clean; 2.0 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 66 MiB/s rd, 176 MiB/s wr, 403 op/s 2026-03-10T06:22:37.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:37 vm06.local ceph-mon[58974]: from='client.14676 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:37.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:37 vm06.local ceph-mon[58974]: from='client.14680 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:37.869 INFO:tasks.workunit.client.1.vm06.stdout:8/881: write d1/df/d58/db5/fdf [885927,22861] 0 2026-03-10T06:22:37.869 INFO:tasks.workunit.client.1.vm06.stdout:2/863: symlink da/d13/d5e/df7/l11b 0 2026-03-10T06:22:37.881 INFO:tasks.workunit.client.1.vm06.stdout:8/882: mkdir d1/d2c/d5b/d11e 0 2026-03-10T06:22:37.881 INFO:tasks.workunit.client.1.vm06.stdout:8/883: chown d1/df/d20/f63 202514622 1 2026-03-10T06:22:37.893 INFO:tasks.workunit.client.1.vm06.stdout:2/864: write da/d13/d1c/d7d/ddf/d61/fcf [191105,23557] 0 2026-03-10T06:22:37.894 INFO:tasks.workunit.client.1.vm06.stdout:2/865: readlink da/d13/d1c/d7d/ddf/lc1 0 2026-03-10T06:22:37.902 INFO:tasks.workunit.client.1.vm06.stdout:3/997: write d6/dc/d13/f5d [89588,108673] 0 2026-03-10T06:22:37.920 INFO:tasks.workunit.client.1.vm06.stdout:2/866: mknod da/d13/d5e/c11c 0 2026-03-10T06:22:37.921 INFO:tasks.workunit.client.1.vm06.stdout:5/789: dwrite d8/db/f1f [0,4194304] 0 2026-03-10T06:22:37.927 INFO:tasks.workunit.client.1.vm06.stdout:2/867: mknod da/d13/d1c/d7d/c11d 0 2026-03-10T06:22:37.930 INFO:tasks.workunit.client.1.vm06.stdout:3/998: link d6/dc/d13/d35/d101/dd0/f13b d6/d8/d7f/da1/dfe/f157 0 2026-03-10T06:22:37.931 INFO:tasks.workunit.client.1.vm06.stdout:3/999: truncate d6/dc/d13/d35/d101/dd0/dd1/d90/f10b 646370 0 
2026-03-10T06:22:37.934 INFO:tasks.workunit.client.1.vm06.stdout:5/790: creat d8/db/d54/d8a/d39/d72/ff3 x:0 0 0 2026-03-10T06:22:37.934 INFO:tasks.workunit.client.1.vm06.stdout:8/884: dwrite d1/d3b/da9/dab/fb2 [0,4194304] 0 2026-03-10T06:22:37.942 INFO:tasks.workunit.client.1.vm06.stdout:8/885: chown d1/df 46902 1 2026-03-10T06:22:37.944 INFO:tasks.workunit.client.1.vm06.stdout:2/868: rmdir da/d13/d1c/d1d/d44/dc4 39 2026-03-10T06:22:37.944 INFO:tasks.workunit.client.1.vm06.stdout:8/886: rename d1/d2c/d5b/l68 to d1/df/d20/l11f 0 2026-03-10T06:22:37.950 INFO:tasks.workunit.client.1.vm06.stdout:8/887: fdatasync d1/d3b/da9/fd8 0 2026-03-10T06:22:37.952 INFO:tasks.workunit.client.1.vm06.stdout:2/869: creat da/d13/d1a/dc7/daf/d56/de7/d106/f11e x:0 0 0 2026-03-10T06:22:37.959 INFO:tasks.workunit.client.1.vm06.stdout:2/870: mknod da/d13/d1c/d7d/ddf/c11f 0 2026-03-10T06:22:37.960 INFO:tasks.workunit.client.1.vm06.stdout:8/888: creat d1/df/d20/d21/d5e/d79/d118/f120 x:0 0 0 2026-03-10T06:22:37.962 INFO:tasks.workunit.client.1.vm06.stdout:2/871: dread da/d13/d1c/d1d/d44/d46/fd7 [0,4194304] 0 2026-03-10T06:22:37.966 INFO:tasks.workunit.client.1.vm06.stdout:2/872: dwrite da/d13/f5b [4194304,4194304] 0 2026-03-10T06:22:37.969 INFO:tasks.workunit.client.1.vm06.stdout:8/889: rename d1/df/d20/d21/d7e/fd6 to d1/d2c/d99/dc0/f121 0 2026-03-10T06:22:37.991 INFO:tasks.workunit.client.1.vm06.stdout:5/791: dwrite d8/db/d54/d8a/d39/d72/f8b [0,4194304] 0 2026-03-10T06:22:37.993 INFO:tasks.workunit.client.1.vm06.stdout:2/873: dread - da/d13/d1a/fc9 zero size 2026-03-10T06:22:37.998 INFO:tasks.workunit.client.1.vm06.stdout:5/792: write d8/db/d54/d67/d46/d68/dc1/fce [4249949,109700] 0 2026-03-10T06:22:37.998 INFO:tasks.workunit.client.1.vm06.stdout:5/793: chown d8/db/d54/d55/f60 0 1 2026-03-10T06:22:37.998 INFO:tasks.workunit.client.1.vm06.stdout:5/794: fsync d8/db/d54/d8a/d74/f2f 0 2026-03-10T06:22:38.002 INFO:tasks.workunit.client.1.vm06.stdout:5/795: dread d8/db/d57/d83/fc3 [0,4194304] 0 
2026-03-10T06:22:38.003 INFO:tasks.workunit.client.1.vm06.stdout:8/890: unlink d1/df/d11/da1/dd2/fe5 0 2026-03-10T06:22:38.005 INFO:tasks.workunit.client.1.vm06.stdout:8/891: chown d1/d2c/l109 2 1 2026-03-10T06:22:38.005 INFO:tasks.workunit.client.1.vm06.stdout:8/892: read d1/f1c [2003214,87659] 0 2026-03-10T06:22:38.007 INFO:tasks.workunit.client.1.vm06.stdout:8/893: creat d1/df/d20/d35/dac/dbf/f122 x:0 0 0 2026-03-10T06:22:38.007 INFO:tasks.workunit.client.1.vm06.stdout:8/894: chown d1/f4 1320 1 2026-03-10T06:22:38.018 INFO:tasks.workunit.client.1.vm06.stdout:8/895: dread d1/d2c/d5b/f7c [0,4194304] 0 2026-03-10T06:22:38.018 INFO:tasks.workunit.client.1.vm06.stdout:8/896: write d1/d3b/da9/dab/fb2 [22937,113569] 0 2026-03-10T06:22:38.019 INFO:tasks.workunit.client.1.vm06.stdout:8/897: readlink d1/df/d20/laf 0 2026-03-10T06:22:38.021 INFO:tasks.workunit.client.1.vm06.stdout:5/796: dread d8/db/d54/d67/d46/f98 [0,4194304] 0 2026-03-10T06:22:38.021 INFO:tasks.workunit.client.1.vm06.stdout:8/898: mknod d1/d3b/d5c/de3/c123 0 2026-03-10T06:22:38.021 INFO:tasks.workunit.client.1.vm06.stdout:2/874: sync 2026-03-10T06:22:38.022 INFO:tasks.workunit.client.1.vm06.stdout:5/797: readlink d8/db/d54/d8a/l32 0 2026-03-10T06:22:38.022 INFO:tasks.workunit.client.1.vm06.stdout:8/899: readlink d1/df/d20/d35/l46 0 2026-03-10T06:22:38.024 INFO:tasks.workunit.client.1.vm06.stdout:5/798: fsync d8/db/d54/d55/f61 0 2026-03-10T06:22:38.024 INFO:tasks.workunit.client.1.vm06.stdout:8/900: readlink d1/df/d20/d21/d7e/d8d/l8e 0 2026-03-10T06:22:38.037 INFO:tasks.workunit.client.1.vm06.stdout:5/799: fsync d8/db/d54/d8a/d74/f17 0 2026-03-10T06:22:38.037 INFO:tasks.workunit.client.1.vm06.stdout:5/800: readlink d8/db/d54/d8a/d74/d90/lac 0 2026-03-10T06:22:38.041 INFO:tasks.workunit.client.1.vm06.stdout:5/801: fsync d8/db/f48 0 2026-03-10T06:22:38.041 INFO:tasks.workunit.client.1.vm06.stdout:5/802: readlink d8/d9/l13 0 2026-03-10T06:22:38.042 INFO:tasks.workunit.client.1.vm06.stdout:5/803: write 
d8/db/f45 [1135918,37205] 0 2026-03-10T06:22:38.050 INFO:tasks.workunit.client.1.vm06.stdout:5/804: unlink d8/db/d54/d67/d46/fa4 0 2026-03-10T06:22:38.068 INFO:tasks.workunit.client.1.vm06.stdout:8/901: rmdir d1/d3b 39 2026-03-10T06:22:38.069 INFO:tasks.workunit.client.1.vm06.stdout:2/875: write da/d13/d1c/d1d/f26 [368865,107998] 0 2026-03-10T06:22:38.070 INFO:tasks.workunit.client.1.vm06.stdout:8/902: chown d1/df/d20/d21/d5e/fa4 2 1 2026-03-10T06:22:38.081 INFO:tasks.workunit.client.1.vm06.stdout:2/876: mkdir da/d13/d1a/dc7/d120 0 2026-03-10T06:22:38.081 INFO:tasks.workunit.client.1.vm06.stdout:2/877: dread - da/dea/f10b zero size 2026-03-10T06:22:38.082 INFO:tasks.workunit.client.1.vm06.stdout:5/805: dwrite d8/db/d54/d8a/d39/fa1 [0,4194304] 0 2026-03-10T06:22:38.084 INFO:tasks.workunit.client.1.vm06.stdout:2/878: truncate da/d13/d1c/d7d/ddf/d61/f111 89476 0 2026-03-10T06:22:38.091 INFO:tasks.workunit.client.1.vm06.stdout:8/903: rename d1/df/d20/d21/cbe to d1/d7/dee/c124 0 2026-03-10T06:22:38.093 INFO:tasks.workunit.client.1.vm06.stdout:2/879: mkdir da/d13/d1c/d7d/d121 0 2026-03-10T06:22:38.094 INFO:tasks.workunit.client.1.vm06.stdout:2/880: stat da/d13/d1a/dc7/daf 0 2026-03-10T06:22:38.094 INFO:tasks.workunit.client.1.vm06.stdout:8/904: fdatasync d1/f1b 0 2026-03-10T06:22:38.095 INFO:tasks.workunit.client.1.vm06.stdout:5/806: mknod d8/db/d54/d8a/d5e/cf4 0 2026-03-10T06:22:38.096 INFO:tasks.workunit.client.1.vm06.stdout:2/881: creat da/d13/d107/f122 x:0 0 0 2026-03-10T06:22:38.098 INFO:tasks.workunit.client.1.vm06.stdout:8/905: creat d1/df/d20/f125 x:0 0 0 2026-03-10T06:22:38.100 INFO:tasks.workunit.client.1.vm06.stdout:2/882: mknod da/d13/d1a/dc7/daf/d56/db7/dde/c123 0 2026-03-10T06:22:38.101 INFO:tasks.workunit.client.1.vm06.stdout:2/883: chown da/d13/d5e/c8c 28 1 2026-03-10T06:22:38.101 INFO:tasks.workunit.client.1.vm06.stdout:8/906: creat d1/df/d20/f126 x:0 0 0 2026-03-10T06:22:38.111 INFO:tasks.workunit.client.1.vm06.stdout:8/907: dread d1/df/d11/f4a 
[0,4194304] 0 2026-03-10T06:22:38.112 INFO:tasks.workunit.client.1.vm06.stdout:8/908: write d1/df/d11/da1/dd2/f11d [297106,124846] 0 2026-03-10T06:22:38.116 INFO:tasks.workunit.client.1.vm06.stdout:8/909: dwrite d1/df/d11/ff1 [0,4194304] 0 2026-03-10T06:22:38.126 INFO:tasks.workunit.client.1.vm06.stdout:8/910: stat d1/d3b/d5c/de6 0 2026-03-10T06:22:38.148 INFO:tasks.workunit.client.1.vm06.stdout:2/884: write da/d13/d1a/fc9 [839156,18181] 0 2026-03-10T06:22:38.152 INFO:tasks.workunit.client.1.vm06.stdout:5/807: dwrite d8/db/d57/d83/fc3 [0,4194304] 0 2026-03-10T06:22:38.170 INFO:tasks.workunit.client.1.vm06.stdout:2/885: dread da/d13/d1a/d39/f2f [0,4194304] 0 2026-03-10T06:22:38.172 INFO:tasks.workunit.client.1.vm06.stdout:5/808: creat d8/db/d54/d67/d46/ff5 x:0 0 0 2026-03-10T06:22:38.174 INFO:tasks.workunit.client.1.vm06.stdout:2/886: chown da/l10 63 1 2026-03-10T06:22:38.176 INFO:tasks.workunit.client.1.vm06.stdout:5/809: fdatasync d8/db/d54/d8a/d39/f41 0 2026-03-10T06:22:38.177 INFO:tasks.workunit.client.1.vm06.stdout:2/887: write da/d13/d1c/d7d/f10e [1605603,121283] 0 2026-03-10T06:22:38.181 INFO:tasks.workunit.client.1.vm06.stdout:8/911: write d1/d3b/f49 [954850,25624] 0 2026-03-10T06:22:38.184 INFO:tasks.workunit.client.1.vm06.stdout:5/810: mkdir d8/de6/df6 0 2026-03-10T06:22:38.192 INFO:tasks.workunit.client.1.vm06.stdout:5/811: write d8/db/d54/d8a/d39/f41 [5036635,23851] 0 2026-03-10T06:22:38.195 INFO:tasks.workunit.client.1.vm06.stdout:5/812: readlink d8/db/d54/d8a/d74/d90/lad 0 2026-03-10T06:22:38.195 INFO:tasks.workunit.client.1.vm06.stdout:5/813: chown d8/db/d7e/de1 23529 1 2026-03-10T06:22:38.198 INFO:tasks.workunit.client.1.vm06.stdout:2/888: write da/d13/d1a/dc7/daf/d56/f85 [116184,92180] 0 2026-03-10T06:22:38.199 INFO:tasks.workunit.client.1.vm06.stdout:5/814: creat d8/db/d7e/de1/ff7 x:0 0 0 2026-03-10T06:22:38.199 INFO:tasks.workunit.client.1.vm06.stdout:5/815: chown d8/f3f 505795214 1 2026-03-10T06:22:38.200 
INFO:tasks.workunit.client.1.vm06.stdout:8/912: dread d1/d3b/d5c/de3/f11b [0,4194304] 0 2026-03-10T06:22:38.203 INFO:tasks.workunit.client.1.vm06.stdout:8/913: rename d1/f5 to d1/d7/dfb/f127 0 2026-03-10T06:22:38.205 INFO:tasks.workunit.client.1.vm06.stdout:5/816: mkdir d8/db/df8 0 2026-03-10T06:22:38.208 INFO:tasks.workunit.client.1.vm06.stdout:2/889: getdents da/d13/d1c/d7d/ddf 0 2026-03-10T06:22:38.210 INFO:tasks.workunit.client.1.vm06.stdout:5/817: getdents d8/db/df8 0 2026-03-10T06:22:38.213 INFO:tasks.workunit.client.1.vm06.stdout:8/914: rmdir d1/d7/dee/d11a 0 2026-03-10T06:22:38.216 INFO:tasks.workunit.client.1.vm06.stdout:2/890: dread da/d13/d1c/d43/f7a [0,4194304] 0 2026-03-10T06:22:38.223 INFO:tasks.workunit.client.1.vm06.stdout:2/891: rename da/d13/d1a/dc7/daf/d56 to da/d13/d1a/dc7/d120/d124 0 2026-03-10T06:22:38.232 INFO:tasks.workunit.client.1.vm06.stdout:2/892: chown da/d13/d1c/d1d/d44/d46/cbf 5013 1 2026-03-10T06:22:38.233 INFO:tasks.workunit.client.1.vm06.stdout:2/893: chown da/ff 26351787 1 2026-03-10T06:22:38.235 INFO:tasks.workunit.client.1.vm06.stdout:2/894: mkdir da/d13/d1a/dc7/d120/d124/db7/dde/d125 0 2026-03-10T06:22:38.241 INFO:tasks.workunit.client.1.vm06.stdout:8/915: getdents d1/d7/dee 0 2026-03-10T06:22:38.244 INFO:tasks.workunit.client.1.vm06.stdout:2/895: sync 2026-03-10T06:22:38.247 INFO:tasks.workunit.client.1.vm06.stdout:8/916: read d1/df/d11/f47 [574071,4439] 0 2026-03-10T06:22:38.248 INFO:tasks.workunit.client.1.vm06.stdout:5/818: dwrite d8/db/d54/d8a/d39/f69 [4194304,4194304] 0 2026-03-10T06:22:38.249 INFO:tasks.workunit.client.1.vm06.stdout:8/917: dread - d1/d7/dee/ff7 zero size 2026-03-10T06:22:38.255 INFO:tasks.workunit.client.1.vm06.stdout:8/918: write d1/d3b/da9/dab/fb2 [2503468,99923] 0 2026-03-10T06:22:38.257 INFO:tasks.workunit.client.1.vm06.stdout:8/919: stat d1/df/d20/d21/d7e/d8d/f95 0 2026-03-10T06:22:38.259 INFO:tasks.workunit.client.1.vm06.stdout:8/920: mknod d1/d7/dfb/c128 0 2026-03-10T06:22:38.268 
INFO:tasks.workunit.client.1.vm06.stdout:5/819: dread d8/db/fbc [0,4194304] 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:8/921: rename d1/d2c/d90 to d1/df/d20/d21/d7e/d8d/d129 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:5/820: mkdir d8/db/d54/d67/d46/d68/dc1/df9 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:8/922: readlink d1/l6 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:8/923: dread - d1/df/d20/f113 zero size 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:8/924: readlink d1/d2c/d99/le0 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:5/821: creat d8/db/d54/d67/d46/d68/dc1/df9/ffa x:0 0 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:8/925: mkdir d1/d2c/d99/d101/d12a 0 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:5/822: dread - d8/db/d54/d8a/d74/d90/fe2 zero size 2026-03-10T06:22:38.268 INFO:tasks.workunit.client.1.vm06.stdout:5/823: mknod d8/db/d54/d67/cfb 0 2026-03-10T06:22:38.269 INFO:tasks.workunit.client.1.vm06.stdout:8/926: dwrite d1/df/d58/ff6 [0,4194304] 0 2026-03-10T06:22:38.272 INFO:tasks.workunit.client.1.vm06.stdout:5/824: sync 2026-03-10T06:22:38.280 INFO:tasks.workunit.client.1.vm06.stdout:2/896: write da/d13/d1a/d39/d35/f74 [332166,8886] 0 2026-03-10T06:22:38.285 INFO:tasks.workunit.client.1.vm06.stdout:5/825: unlink d8/db/d54/d8a/l56 0 2026-03-10T06:22:38.292 INFO:tasks.workunit.client.1.vm06.stdout:2/897: creat da/d13/d1c/d1d/d44/dc4/f126 x:0 0 0 2026-03-10T06:22:38.299 INFO:tasks.workunit.client.1.vm06.stdout:2/898: symlink da/da8/l127 0 2026-03-10T06:22:38.300 INFO:tasks.workunit.client.1.vm06.stdout:2/899: write da/d13/d5e/f9e [2305811,38092] 0 2026-03-10T06:22:38.301 INFO:tasks.workunit.client.1.vm06.stdout:8/927: write d1/df/d20/d21/f37 [2911584,19769] 0 2026-03-10T06:22:38.310 INFO:tasks.workunit.client.1.vm06.stdout:2/900: rmdir da/d13/d1c/d1d/d44/d46/de2/d112 39 
2026-03-10T06:22:38.311 INFO:tasks.workunit.client.1.vm06.stdout:8/928: mknod d1/df/d11/da1/c12b 0 2026-03-10T06:22:38.311 INFO:tasks.workunit.client.1.vm06.stdout:5/826: dread d8/db/d54/d8a/d74/d90/fb5 [0,4194304] 0 2026-03-10T06:22:38.311 INFO:tasks.workunit.client.1.vm06.stdout:2/901: write da/d13/d1a/d39/f3c [3240019,80408] 0 2026-03-10T06:22:38.316 INFO:tasks.workunit.client.1.vm06.stdout:8/929: creat d1/df/d20/f12c x:0 0 0 2026-03-10T06:22:38.317 INFO:tasks.workunit.client.1.vm06.stdout:2/902: creat da/d13/d1c/d43/f128 x:0 0 0 2026-03-10T06:22:38.318 INFO:tasks.workunit.client.1.vm06.stdout:5/827: mknod d8/db/d54/d8a/de4/cfc 0 2026-03-10T06:22:38.320 INFO:tasks.workunit.client.1.vm06.stdout:5/828: write d8/db/d54/d8a/d39/fae [4938797,96718] 0 2026-03-10T06:22:38.323 INFO:tasks.workunit.client.1.vm06.stdout:2/903: dread f8 [0,4194304] 0 2026-03-10T06:22:38.324 INFO:tasks.workunit.client.1.vm06.stdout:8/930: getdents d1/d3b/d5c/de3 0 2026-03-10T06:22:38.326 INFO:tasks.workunit.client.1.vm06.stdout:5/829: creat d8/db/ffd x:0 0 0 2026-03-10T06:22:38.329 INFO:tasks.workunit.client.1.vm06.stdout:2/904: creat da/d13/d1c/d7d/ddf/d61/d68/de0/f129 x:0 0 0 2026-03-10T06:22:38.331 INFO:tasks.workunit.client.1.vm06.stdout:8/931: dread d1/df/d20/d21/d5e/d79/f7f [0,4194304] 0 2026-03-10T06:22:38.335 INFO:tasks.workunit.client.1.vm06.stdout:5/830: dread d8/db/d57/d83/f99 [0,4194304] 0 2026-03-10T06:22:38.348 INFO:tasks.workunit.client.1.vm06.stdout:2/905: dwrite da/d13/d5e/f9a [4194304,4194304] 0 2026-03-10T06:22:38.374 INFO:tasks.workunit.client.1.vm06.stdout:8/932: write d1/df/d11/da1/fcd [676380,45491] 0 2026-03-10T06:22:38.374 INFO:tasks.workunit.client.1.vm06.stdout:5/831: write d8/db/d54/d8a/d74/f62 [2273499,91624] 0 2026-03-10T06:22:38.375 INFO:tasks.workunit.client.1.vm06.stdout:2/906: dwrite da/d13/d1a/dc7/daf/fd9 [0,4194304] 0 2026-03-10T06:22:38.381 INFO:tasks.workunit.client.1.vm06.stdout:2/907: write da/d13/d1a/dc7/d120/d124/de7/d106/f11e [612714,998] 0 
2026-03-10T06:22:38.391 INFO:tasks.workunit.client.1.vm06.stdout:8/933: write d1/d7/f4f [5040143,33002] 0 2026-03-10T06:22:38.392 INFO:tasks.workunit.client.1.vm06.stdout:8/934: rename d1/d2c/d99 to d1/d2c/d99/d101/d12d 22 2026-03-10T06:22:38.392 INFO:tasks.workunit.client.1.vm06.stdout:2/908: dwrite da/d13/d5e/ff8 [0,4194304] 0 2026-03-10T06:22:38.401 INFO:tasks.workunit.client.1.vm06.stdout:8/935: chown d1/d3b/da9/dab/c104 168627539 1 2026-03-10T06:22:38.401 INFO:tasks.workunit.client.1.vm06.stdout:2/909: write da/dea/f10b [189052,93452] 0 2026-03-10T06:22:38.402 INFO:tasks.workunit.client.1.vm06.stdout:5/832: truncate d8/db/d54/d8a/d39/fc5 709883 0 2026-03-10T06:22:38.410 INFO:tasks.workunit.client.1.vm06.stdout:2/910: mkdir da/da8/d12a 0 2026-03-10T06:22:38.410 INFO:tasks.workunit.client.1.vm06.stdout:2/911: stat da/d13/d1a/f117 0 2026-03-10T06:22:38.411 INFO:tasks.workunit.client.1.vm06.stdout:5/833: sync 2026-03-10T06:22:38.412 INFO:tasks.workunit.client.1.vm06.stdout:5/834: chown d8/db/d54/d55/d80/f96 166769 1 2026-03-10T06:22:38.416 INFO:tasks.workunit.client.1.vm06.stdout:2/912: mknod da/d13/d107/d118/c12b 0 2026-03-10T06:22:38.426 INFO:tasks.workunit.client.1.vm06.stdout:2/913: dwrite da/d13/d5e/fe3 [0,4194304] 0 2026-03-10T06:22:38.432 INFO:tasks.workunit.client.1.vm06.stdout:5/835: write d8/db/d54/d8a/f31 [2085945,32835] 0 2026-03-10T06:22:38.433 INFO:tasks.workunit.client.1.vm06.stdout:5/836: write d8/db/d54/fe8 [176401,1651] 0 2026-03-10T06:22:38.438 INFO:tasks.workunit.client.1.vm06.stdout:8/936: dwrite d1/df/d20/d21/d7e/d8d/d129/fcb [0,4194304] 0 2026-03-10T06:22:38.442 INFO:tasks.workunit.client.1.vm06.stdout:5/837: truncate d8/db/d54/d8a/f4d 2216301 0 2026-03-10T06:22:38.444 INFO:tasks.workunit.client.1.vm06.stdout:2/914: creat da/d13/d1a/dc7/d120/d124/db7/dde/f12c x:0 0 0 2026-03-10T06:22:38.446 INFO:tasks.workunit.client.1.vm06.stdout:8/937: mknod d1/d3b/db3/c12e 0 2026-03-10T06:22:38.446 INFO:tasks.workunit.client.1.vm06.stdout:5/838: write 
d8/db/d54/d8a/d74/d90/fe2 [224858,110708] 0 2026-03-10T06:22:38.447 INFO:tasks.workunit.client.1.vm06.stdout:5/839: fsync d8/db/d54/d8a/d39/f69 0 2026-03-10T06:22:38.459 INFO:tasks.workunit.client.1.vm06.stdout:5/840: mknod d8/db/d54/d8a/d39/d6c/cfe 0 2026-03-10T06:22:38.460 INFO:tasks.workunit.client.1.vm06.stdout:2/915: link da/d13/d1c/d1d/d44/c88 da/d13/d1c/d7d/ddf/d61/c12d 0 2026-03-10T06:22:38.464 INFO:tasks.workunit.client.1.vm06.stdout:2/916: creat da/d13/d1c/f12e x:0 0 0 2026-03-10T06:22:38.465 INFO:tasks.workunit.client.1.vm06.stdout:2/917: read da/ff [523767,74740] 0 2026-03-10T06:22:38.468 INFO:tasks.workunit.client.1.vm06.stdout:2/918: rename da/d13/d1a/d39/d35/f74 to da/d13/d1a/dc7/d120/d124/db9/f12f 0 2026-03-10T06:22:38.470 INFO:tasks.workunit.client.1.vm06.stdout:2/919: mknod da/d13/d1a/dc7/d120/d124/de7/c130 0 2026-03-10T06:22:38.488 INFO:tasks.workunit.client.1.vm06.stdout:2/920: dread da/d13/d1a/dc7/d120/d124/f85 [0,4194304] 0 2026-03-10T06:22:38.491 INFO:tasks.workunit.client.1.vm06.stdout:8/938: dwrite d1/d3b/d5c/f62 [0,4194304] 0 2026-03-10T06:22:38.493 INFO:tasks.workunit.client.1.vm06.stdout:5/841: dwrite d8/db/d54/d8a/fbd [0,4194304] 0 2026-03-10T06:22:38.494 INFO:tasks.workunit.client.1.vm06.stdout:2/921: write da/d13/d1a/dc7/d120/d124/db9/fdc [1004340,53355] 0 2026-03-10T06:22:38.501 INFO:tasks.workunit.client.1.vm06.stdout:5/842: write d8/db/d54/d8a/fbd [648503,113222] 0 2026-03-10T06:22:38.501 INFO:tasks.workunit.client.1.vm06.stdout:5/843: dwrite d8/db/d54/d8a/d39/f69 [4194304,4194304] 0 2026-03-10T06:22:38.503 INFO:tasks.workunit.client.1.vm06.stdout:8/939: dread - d1/df/d20/d21/d5e/fa4 zero size 2026-03-10T06:22:38.504 INFO:tasks.workunit.client.1.vm06.stdout:2/922: sync 2026-03-10T06:22:38.510 INFO:tasks.workunit.client.1.vm06.stdout:5/844: creat d8/db/d54/d8a/d39/d6c/de7/fff x:0 0 0 2026-03-10T06:22:38.520 INFO:tasks.workunit.client.1.vm06.stdout:8/940: creat d1/df/d11/f12f x:0 0 0 2026-03-10T06:22:38.521 
INFO:tasks.workunit.client.1.vm06.stdout:5/845: creat d8/db/d54/d8a/d39/d6c/f100 x:0 0 0 2026-03-10T06:22:38.522 INFO:tasks.workunit.client.1.vm06.stdout:2/923: getdents da/d13/d1c/d7d/ddf/d61/d68 0 2026-03-10T06:22:38.523 INFO:tasks.workunit.client.1.vm06.stdout:2/924: write da/d13/d1c/d1d/f26 [943247,105703] 0 2026-03-10T06:22:38.524 INFO:tasks.workunit.client.1.vm06.stdout:8/941: mkdir d1/df/d20/d21/d5e/d79/d118/d130 0 2026-03-10T06:22:38.526 INFO:tasks.workunit.client.1.vm06.stdout:2/925: creat da/d13/d1c/d1d/d44/dc4/f131 x:0 0 0 2026-03-10T06:22:38.526 INFO:tasks.workunit.client.1.vm06.stdout:2/926: chown da/d13/d1c/d1d/l108 87 1 2026-03-10T06:22:38.534 INFO:tasks.workunit.client.1.vm06.stdout:2/927: rename da/d13/d5e/c8c to da/da8/d12a/c132 0 2026-03-10T06:22:38.539 INFO:tasks.workunit.client.1.vm06.stdout:5/846: dread d8/db/d54/d8a/d74/f71 [0,4194304] 0 2026-03-10T06:22:38.544 INFO:tasks.workunit.client.1.vm06.stdout:5/847: readlink d8/db/d54/d8a/d74/l1c 0 2026-03-10T06:22:38.545 INFO:tasks.workunit.client.1.vm06.stdout:5/848: chown d8/c24 13539 1 2026-03-10T06:22:38.546 INFO:tasks.workunit.client.1.vm06.stdout:5/849: mkdir d8/db/d54/d67/d101 0 2026-03-10T06:22:38.549 INFO:tasks.workunit.client.1.vm06.stdout:5/850: dread - d8/db/fde zero size 2026-03-10T06:22:38.554 INFO:tasks.workunit.client.1.vm06.stdout:5/851: link d8/db/d54/d67/cfb d8/db/d54/d67/c102 0 2026-03-10T06:22:38.580 INFO:tasks.workunit.client.1.vm06.stdout:8/942: write d1/df/d20/d21/d7e/d8d/f9c [1143542,53211] 0 2026-03-10T06:22:38.584 INFO:tasks.workunit.client.1.vm06.stdout:2/928: getdents da/da8/d12a 0 2026-03-10T06:22:38.619 INFO:tasks.workunit.client.1.vm06.stdout:5/852: write d8/db/d54/d8a/d39/f44 [566030,25995] 0 2026-03-10T06:22:38.623 INFO:tasks.workunit.client.1.vm06.stdout:5/853: sync 2026-03-10T06:22:38.624 INFO:tasks.workunit.client.1.vm06.stdout:5/854: write d8/db/d54/d8a/d39/f69 [2188253,108018] 0 2026-03-10T06:22:38.630 INFO:tasks.workunit.client.1.vm06.stdout:8/943: write 
d1/df/d20/f51 [2363891,126894] 0 2026-03-10T06:22:38.630 INFO:tasks.workunit.client.1.vm06.stdout:5/855: creat d8/f103 x:0 0 0 2026-03-10T06:22:38.636 INFO:tasks.workunit.client.1.vm06.stdout:2/929: write da/d13/d1c/d7d/fc3 [3868080,9672] 0 2026-03-10T06:22:38.636 INFO:tasks.workunit.client.1.vm06.stdout:5/856: symlink d8/de6/df6/l104 0 2026-03-10T06:22:38.637 INFO:tasks.workunit.client.1.vm06.stdout:2/930: readlink da/d13/d1c/d7d/ddf/d61/d68/lce 0 2026-03-10T06:22:38.638 INFO:tasks.workunit.client.1.vm06.stdout:5/857: truncate d8/db/d54/d8a/d39/f44 756253 0 2026-03-10T06:22:38.644 INFO:tasks.workunit.client.1.vm06.stdout:8/944: link d1/df/l4d d1/d7/df8/l131 0 2026-03-10T06:22:38.645 INFO:tasks.workunit.client.1.vm06.stdout:5/858: mkdir d8/db/d54/d67/d46/d105 0 2026-03-10T06:22:38.645 INFO:tasks.workunit.client.1.vm06.stdout:5/859: dread - d8/db/d54/d55/d80/fd0 zero size 2026-03-10T06:22:38.646 INFO:tasks.workunit.client.1.vm06.stdout:5/860: chown d8/db/f45 810614979 1 2026-03-10T06:22:38.647 INFO:tasks.workunit.client.1.vm06.stdout:8/945: mkdir d1/df/d11/da1/d132 0 2026-03-10T06:22:38.650 INFO:tasks.workunit.client.1.vm06.stdout:8/946: symlink d1/d3b/da9/ddb/l133 0 2026-03-10T06:22:38.653 INFO:tasks.workunit.client.1.vm06.stdout:5/861: getdents d8/db/d54/d67/dd7 0 2026-03-10T06:22:38.657 INFO:tasks.workunit.client.1.vm06.stdout:2/931: write da/d13/d1c/d7d/ddf/f98 [4416484,51875] 0 2026-03-10T06:22:38.659 INFO:tasks.workunit.client.1.vm06.stdout:8/947: dwrite d1/d3b/f49 [0,4194304] 0 2026-03-10T06:22:38.661 INFO:tasks.workunit.client.1.vm06.stdout:5/862: getdents d8/db/d54/d8a/d39 0 2026-03-10T06:22:38.664 INFO:tasks.workunit.client.1.vm06.stdout:5/863: dread - d8/db/d54/d67/d46/d6e/fbf zero size 2026-03-10T06:22:38.669 INFO:tasks.workunit.client.1.vm06.stdout:8/948: mkdir d1/df/d58/db5/d134 0 2026-03-10T06:22:38.670 INFO:tasks.workunit.client.1.vm06.stdout:2/932: chown da/f84 7 1 2026-03-10T06:22:38.671 INFO:tasks.workunit.client.1.vm06.stdout:2/933: truncate 
da/f28 4938015 0 2026-03-10T06:22:38.675 INFO:tasks.workunit.client.1.vm06.stdout:2/934: symlink da/d13/d1a/l133 0 2026-03-10T06:22:38.675 INFO:tasks.workunit.client.1.vm06.stdout:5/864: truncate d8/db/d54/d8a/f4d 2654053 0 2026-03-10T06:22:38.676 INFO:tasks.workunit.client.1.vm06.stdout:2/935: fdatasync da/d13/d1c/d7d/ddf/ff2 0 2026-03-10T06:22:38.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:38 vm04.local ceph-mon[51058]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:38.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:38 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/2785238524' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:38.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:38 vm04.local ceph-mon[51058]: from='client.14688 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:38.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:38 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:38.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:38 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:38.678 INFO:tasks.workunit.client.1.vm06.stdout:5/865: symlink d8/db/d54/d8a/l106 0 2026-03-10T06:22:38.678 INFO:tasks.workunit.client.1.vm06.stdout:2/936: fdatasync da/d13/d1a/dc7/f9d 0 2026-03-10T06:22:38.680 INFO:tasks.workunit.client.1.vm06.stdout:2/937: write da/d13/d1c/d1d/d44/dc4/f116 [985149,126561] 0 2026-03-10T06:22:38.685 INFO:tasks.workunit.client.1.vm06.stdout:2/938: mknod da/da8/c134 0 2026-03-10T06:22:38.691 INFO:tasks.workunit.client.1.vm06.stdout:2/939: write da/f84 [1562742,96817] 0 2026-03-10T06:22:38.693 INFO:tasks.workunit.client.1.vm06.stdout:5/866: dwrite d8/db/d54/d8a/d39/d6c/de7/fff [0,4194304] 0 2026-03-10T06:22:38.694 
INFO:tasks.workunit.client.1.vm06.stdout:8/949: dwrite d1/f89 [0,4194304] 0 2026-03-10T06:22:38.697 INFO:tasks.workunit.client.1.vm06.stdout:8/950: readlink d1/df/d11/da1/lde 0 2026-03-10T06:22:38.698 INFO:tasks.workunit.client.1.vm06.stdout:8/951: fsync d1/d7/f4f 0 2026-03-10T06:22:38.698 INFO:tasks.workunit.client.1.vm06.stdout:2/940: chown da/da8/le5 3604 1 2026-03-10T06:22:38.702 INFO:tasks.workunit.client.1.vm06.stdout:5/867: symlink d8/db/df8/l107 0 2026-03-10T06:22:38.711 INFO:tasks.workunit.client.1.vm06.stdout:2/941: symlink da/d13/d1c/d1d/d44/d46/de2/l135 0 2026-03-10T06:22:38.711 INFO:tasks.workunit.client.1.vm06.stdout:5/868: rename d8/db/d54/d8a/d39/d9f to d8/db/d54/d67/d46/d105/d108 0 2026-03-10T06:22:38.714 INFO:tasks.workunit.client.1.vm06.stdout:5/869: read - d8/d9/ddd/fef zero size 2026-03-10T06:22:38.720 INFO:tasks.workunit.client.1.vm06.stdout:2/942: fsync da/d13/d1a/d39/f2f 0 2026-03-10T06:22:38.721 INFO:tasks.workunit.client.1.vm06.stdout:2/943: write da/d13/d1a/f117 [541180,46063] 0 2026-03-10T06:22:38.726 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:38 vm06.local ceph-mon[58974]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:38.726 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:38 vm06.local ceph-mon[58974]: from='client.? 
192.168.123.104:0/2785238524' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:38.726 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:38 vm06.local ceph-mon[58974]: from='client.14688 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:22:38.726 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:38 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:38.726 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:38 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:38.726 INFO:tasks.workunit.client.1.vm06.stdout:2/944: mknod da/d13/d1a/d39/c136 0 2026-03-10T06:22:38.731 INFO:tasks.workunit.client.1.vm06.stdout:8/952: dwrite d1/df/d58/f86 [0,4194304] 0 2026-03-10T06:22:38.753 INFO:tasks.workunit.client.1.vm06.stdout:2/945: dread da/f19 [0,4194304] 0 2026-03-10T06:22:38.759 INFO:tasks.workunit.client.1.vm06.stdout:8/953: chown d1/df/d20/l11f 29120 1 2026-03-10T06:22:38.765 INFO:tasks.workunit.client.1.vm06.stdout:5/870: rmdir d8/db/d54/d8a/d39 39 2026-03-10T06:22:38.765 INFO:tasks.workunit.client.1.vm06.stdout:8/954: chown d1/df/d20/d35/ce1 11 1 2026-03-10T06:22:38.765 INFO:tasks.workunit.client.1.vm06.stdout:8/955: write d1/d3b/f49 [2769765,85373] 0 2026-03-10T06:22:38.765 INFO:tasks.workunit.client.1.vm06.stdout:5/871: dread d8/db/d54/d67/d46/d6e/fa7 [0,4194304] 0 2026-03-10T06:22:38.780 INFO:tasks.workunit.client.1.vm06.stdout:5/872: write d8/db/d54/d8a/d39/d6c/de7/feb [803430,72675] 0 2026-03-10T06:22:38.781 INFO:tasks.workunit.client.1.vm06.stdout:2/946: getdents da/d13/d1a 0 2026-03-10T06:22:38.781 INFO:tasks.workunit.client.1.vm06.stdout:2/947: stat da/d13/d1a/d39/d35/c4e 0 2026-03-10T06:22:38.782 INFO:tasks.workunit.client.1.vm06.stdout:8/956: dread d1/df/d20/f51 [0,4194304] 0 2026-03-10T06:22:38.806 INFO:tasks.workunit.client.1.vm06.stdout:8/957: dwrite d1/d7/fd [0,4194304] 0 
2026-03-10T06:22:38.808 INFO:tasks.workunit.client.1.vm06.stdout:5/873: truncate d8/db/d54/d8a/d39/fae 2874032 0 2026-03-10T06:22:38.808 INFO:tasks.workunit.client.1.vm06.stdout:8/958: fsync d1/df/f6d 0 2026-03-10T06:22:38.808 INFO:tasks.workunit.client.1.vm06.stdout:5/874: chown d8/f3f 99 1 2026-03-10T06:22:38.809 INFO:tasks.workunit.client.1.vm06.stdout:2/948: dread da/d13/d1a/dc7/d120/d124/db9/fc2 [0,4194304] 0 2026-03-10T06:22:38.810 INFO:tasks.workunit.client.1.vm06.stdout:5/875: truncate d8/db/d54/d55/d80/fdc 427943 0 2026-03-10T06:22:38.813 INFO:tasks.workunit.client.1.vm06.stdout:8/959: unlink d1/df/d20/d21/d7e/d8d/d129/fcb 0 2026-03-10T06:22:38.813 INFO:tasks.workunit.client.1.vm06.stdout:5/876: creat d8/d9/ddd/f109 x:0 0 0 2026-03-10T06:22:38.817 INFO:tasks.workunit.client.1.vm06.stdout:5/877: dread d8/db/f1f [0,4194304] 0 2026-03-10T06:22:38.818 INFO:tasks.workunit.client.1.vm06.stdout:5/878: read - d8/db/d54/d67/d46/d105/d108/fe5 zero size 2026-03-10T06:22:38.819 INFO:tasks.workunit.client.1.vm06.stdout:5/879: fsync d8/db/d54/d8a/d74/f85 0 2026-03-10T06:22:38.820 INFO:tasks.workunit.client.1.vm06.stdout:8/960: dread d1/df/d58/ff6 [0,4194304] 0 2026-03-10T06:22:38.821 INFO:tasks.workunit.client.1.vm06.stdout:5/880: chown d8/db/c79 16416732 1 2026-03-10T06:22:38.821 INFO:tasks.workunit.client.1.vm06.stdout:8/961: dread - d1/df/d20/d21/d5e/d79/d118/f120 zero size 2026-03-10T06:22:38.826 INFO:tasks.workunit.client.1.vm06.stdout:8/962: rename d1/df/d20/fe to d1/d3b/da9/f135 0 2026-03-10T06:22:38.827 INFO:tasks.workunit.client.1.vm06.stdout:8/963: chown d1/df/fc7 468606929 1 2026-03-10T06:22:38.828 INFO:tasks.workunit.client.1.vm06.stdout:8/964: chown d1/df/d58/caa 773499 1 2026-03-10T06:22:38.831 INFO:tasks.workunit.client.1.vm06.stdout:8/965: rename d1/d2c/c34 to d1/df/d58/db5/d134/c136 0 2026-03-10T06:22:38.833 INFO:tasks.workunit.client.1.vm06.stdout:8/966: truncate d1/df/d11/da1/fb6 1670385 0 2026-03-10T06:22:38.833 
INFO:tasks.workunit.client.1.vm06.stdout:8/967: dread - d1/d2c/f8a zero size 2026-03-10T06:22:38.835 INFO:tasks.workunit.client.1.vm06.stdout:5/881: dread f7 [0,4194304] 0 2026-03-10T06:22:38.836 INFO:tasks.workunit.client.1.vm06.stdout:8/968: read d1/d7/dfb/f127 [3087060,106595] 0 2026-03-10T06:22:38.837 INFO:tasks.workunit.client.1.vm06.stdout:8/969: stat d1/df/d11/da1/f10b 0 2026-03-10T06:22:38.838 INFO:tasks.workunit.client.1.vm06.stdout:8/970: chown d1/f89 20 1 2026-03-10T06:22:38.838 INFO:tasks.workunit.client.1.vm06.stdout:8/971: readlink d1/df/d20/d21/d5e/d79/lad 0 2026-03-10T06:22:38.839 INFO:tasks.workunit.client.1.vm06.stdout:8/972: chown d1/f75 6515648 1 2026-03-10T06:22:38.840 INFO:tasks.workunit.client.1.vm06.stdout:5/882: getdents d8/d9 0 2026-03-10T06:22:38.847 INFO:tasks.workunit.client.1.vm06.stdout:5/883: dread d8/db/d54/d8a/d39/fc5 [0,4194304] 0 2026-03-10T06:22:38.848 INFO:tasks.workunit.client.1.vm06.stdout:5/884: read - d8/db/d54/d67/d46/d6e/fbf zero size 2026-03-10T06:22:38.852 INFO:tasks.workunit.client.1.vm06.stdout:5/885: dread d8/db/d54/d8a/d39/d6c/f91 [0,4194304] 0 2026-03-10T06:22:38.852 INFO:tasks.workunit.client.1.vm06.stdout:2/949: truncate da/d13/d5e/ff8 2024913 0 2026-03-10T06:22:38.864 INFO:tasks.workunit.client.1.vm06.stdout:2/950: chown da/d13/d1c/f7e 9 1 2026-03-10T06:22:38.865 INFO:tasks.workunit.client.1.vm06.stdout:2/951: stat da/d13/d1a/d39/d35 0 2026-03-10T06:22:38.865 INFO:tasks.workunit.client.1.vm06.stdout:2/952: write da/d13/d1c/d1d/d44/dc4/f116 [1569886,76647] 0 2026-03-10T06:22:38.867 INFO:tasks.workunit.client.1.vm06.stdout:5/886: fsync d8/de6/fec 0 2026-03-10T06:22:38.873 INFO:tasks.workunit.client.1.vm06.stdout:8/973: write d1/df/d11/f4a [43543,33723] 0 2026-03-10T06:22:38.881 INFO:tasks.workunit.client.1.vm06.stdout:8/974: unlink d1/df/d11/da1/f10b 0 2026-03-10T06:22:38.886 INFO:tasks.workunit.client.1.vm06.stdout:2/953: write da/d13/d5e/f64 [3212505,35204] 0 2026-03-10T06:22:38.887 
INFO:tasks.workunit.client.1.vm06.stdout:2/954: write da/d13/d5e/fbc [3937453,110730] 0 2026-03-10T06:22:38.890 INFO:tasks.workunit.client.1.vm06.stdout:5/887: write d8/db/d54/d8a/d39/fc5 [1688011,19520] 0 2026-03-10T06:22:38.892 INFO:tasks.workunit.client.1.vm06.stdout:8/975: chown d1/df/d11/fe4 1656916744 1 2026-03-10T06:22:38.892 INFO:tasks.workunit.client.1.vm06.stdout:5/888: fdatasync d8/db/d54/f88 0 2026-03-10T06:22:38.893 INFO:tasks.workunit.client.1.vm06.stdout:2/955: mknod da/d13/d1a/dc7/d120/d124/ddd/c137 0 2026-03-10T06:22:38.901 INFO:tasks.workunit.client.1.vm06.stdout:5/889: mkdir d8/db/d54/d67/d46/d105/d108/d10a 0 2026-03-10T06:22:38.902 INFO:tasks.workunit.client.1.vm06.stdout:2/956: creat da/d13/d1a/dc7/d120/d124/ddd/f138 x:0 0 0 2026-03-10T06:22:38.907 INFO:tasks.workunit.client.1.vm06.stdout:5/890: mknod d8/db/d7e/c10b 0 2026-03-10T06:22:38.908 INFO:tasks.workunit.client.1.vm06.stdout:2/957: creat da/d13/d1a/dc7/d120/d124/db9/d9b/f139 x:0 0 0 2026-03-10T06:22:38.908 INFO:tasks.workunit.client.1.vm06.stdout:8/976: write d1/d7/f92 [4211273,80507] 0 2026-03-10T06:22:38.910 INFO:tasks.workunit.client.1.vm06.stdout:2/958: fdatasync da/d13/f5b 0 2026-03-10T06:22:38.916 INFO:tasks.workunit.client.1.vm06.stdout:8/977: creat d1/d7/f137 x:0 0 0 2026-03-10T06:22:38.920 INFO:tasks.workunit.client.1.vm06.stdout:8/978: dwrite d1/df/d20/f113 [0,4194304] 0 2026-03-10T06:22:38.920 INFO:tasks.workunit.client.1.vm06.stdout:2/959: symlink da/d13/d1a/d39/l13a 0 2026-03-10T06:22:38.921 INFO:tasks.workunit.client.1.vm06.stdout:5/891: rmdir d8/db/d54/d8a/de4 39 2026-03-10T06:22:38.930 INFO:tasks.workunit.client.1.vm06.stdout:8/979: dwrite d1/f13 [0,4194304] 0 2026-03-10T06:22:38.930 INFO:tasks.workunit.client.1.vm06.stdout:5/892: mknod d8/db/d54/d67/d46/d68/dc1/c10c 0 2026-03-10T06:22:38.932 INFO:tasks.workunit.client.1.vm06.stdout:2/960: creat da/d13/d1c/d1d/d44/d46/de2/d112/f13b x:0 0 0 2026-03-10T06:22:38.938 INFO:tasks.workunit.client.1.vm06.stdout:2/961: fsync 
da/d13/d1c/d1d/d44/dc4/f131 0 2026-03-10T06:22:38.959 INFO:tasks.workunit.client.1.vm06.stdout:8/980: dread d1/df/d20/d21/d7e/d8d/f95 [0,4194304] 0 2026-03-10T06:22:38.965 INFO:tasks.workunit.client.1.vm06.stdout:2/962: sync 2026-03-10T06:22:38.972 INFO:tasks.workunit.client.1.vm06.stdout:5/893: write d8/db/fde [190194,90117] 0 2026-03-10T06:22:38.973 INFO:tasks.workunit.client.1.vm06.stdout:5/894: chown d8/db/d54/d67/d46/d68/dc1/fce 5216 1 2026-03-10T06:22:38.975 INFO:tasks.workunit.client.1.vm06.stdout:2/963: symlink da/d13/d1c/l13c 0 2026-03-10T06:22:38.979 INFO:tasks.workunit.client.1.vm06.stdout:2/964: unlink da/d13/d5e/fe3 0 2026-03-10T06:22:38.989 INFO:tasks.workunit.client.1.vm06.stdout:2/965: creat da/d13/d1a/dc7/daf/d10d/f13d x:0 0 0 2026-03-10T06:22:38.997 INFO:tasks.workunit.client.1.vm06.stdout:8/981: truncate d1/df/d58/f86 3699038 0 2026-03-10T06:22:38.998 INFO:tasks.workunit.client.1.vm06.stdout:8/982: chown d1/d3b/da9/cb8 38424 1 2026-03-10T06:22:38.999 INFO:tasks.workunit.client.1.vm06.stdout:5/895: truncate d8/db/d54/d8a/d39/d72/f8b 3523017 0 2026-03-10T06:22:39.002 INFO:tasks.workunit.client.1.vm06.stdout:2/966: dwrite da/d13/d1c/f76 [0,4194304] 0 2026-03-10T06:22:39.004 INFO:tasks.workunit.client.1.vm06.stdout:2/967: chown da/d13/d1a/l32 6 1 2026-03-10T06:22:39.010 INFO:tasks.workunit.client.1.vm06.stdout:5/896: mkdir d8/db/d54/d67/d46/d68/d10d 0 2026-03-10T06:22:39.010 INFO:tasks.workunit.client.1.vm06.stdout:2/968: unlink da/f10c 0 2026-03-10T06:22:39.011 INFO:tasks.workunit.client.1.vm06.stdout:5/897: readlink d8/d9/lb1 0 2026-03-10T06:22:39.012 INFO:tasks.workunit.client.1.vm06.stdout:2/969: fdatasync da/d13/d1c/d7d/ddf/f65 0 2026-03-10T06:22:39.012 INFO:tasks.workunit.client.1.vm06.stdout:5/898: readlink d8/db/lb4 0 2026-03-10T06:22:39.013 INFO:tasks.workunit.client.1.vm06.stdout:5/899: chown d8/db/df8 17888956 1 2026-03-10T06:22:39.021 INFO:tasks.workunit.client.1.vm06.stdout:5/900: unlink d8/db/d54/d8a/d39/f51 0 2026-03-10T06:22:39.033 
INFO:tasks.workunit.client.1.vm06.stdout:8/983: write d1/df/d58/db5/fea [495393,65982] 0 2026-03-10T06:22:39.033 INFO:tasks.workunit.client.1.vm06.stdout:5/901: write d8/db/d57/f75 [499973,40860] 0 2026-03-10T06:22:39.033 INFO:tasks.workunit.client.1.vm06.stdout:2/970: write da/d13/d1c/d1d/d44/d46/ffc [1175958,57858] 0 2026-03-10T06:22:39.036 INFO:tasks.workunit.client.1.vm06.stdout:5/902: readlink d8/db/d54/d8a/d74/d90/lad 0 2026-03-10T06:22:39.037 INFO:tasks.workunit.client.1.vm06.stdout:8/984: dread d1/df/d20/f113 [0,4194304] 0 2026-03-10T06:22:39.042 INFO:tasks.workunit.client.1.vm06.stdout:8/985: dwrite d1/df/f6b [0,4194304] 0 2026-03-10T06:22:39.051 INFO:tasks.workunit.client.1.vm06.stdout:5/903: creat d8/db/d54/d67/d46/d105/d108/f10e x:0 0 0 2026-03-10T06:22:39.052 INFO:tasks.workunit.client.1.vm06.stdout:5/904: fsync d8/db/d54/d67/d46/d105/d108/fe5 0 2026-03-10T06:22:39.056 INFO:tasks.workunit.client.1.vm06.stdout:2/971: dwrite da/d13/d1c/d7d/ddf/d61/f89 [0,4194304] 0 2026-03-10T06:22:39.057 INFO:tasks.workunit.client.1.vm06.stdout:5/905: write d8/db/d54/d8a/d39/d6c/de7/fff [1216169,50876] 0 2026-03-10T06:22:39.057 INFO:tasks.workunit.client.1.vm06.stdout:8/986: truncate d1/df/d11/da1/fb6 2000180 0 2026-03-10T06:22:39.061 INFO:tasks.workunit.client.1.vm06.stdout:8/987: dread d1/d3b/f49 [0,4194304] 0 2026-03-10T06:22:39.075 INFO:tasks.workunit.client.1.vm06.stdout:5/906: chown d8/db/d54/d8a/d39/cb2 2180 1 2026-03-10T06:22:39.076 INFO:tasks.workunit.client.1.vm06.stdout:5/907: readlink d8/db/d54/d67/d46/d6e/l70 0 2026-03-10T06:22:39.077 INFO:tasks.workunit.client.1.vm06.stdout:5/908: fdatasync d8/db/ffd 0 2026-03-10T06:22:39.079 INFO:tasks.workunit.client.1.vm06.stdout:5/909: dread d8/db/d54/d67/d46/f98 [0,4194304] 0 2026-03-10T06:22:39.083 INFO:tasks.workunit.client.1.vm06.stdout:8/988: dread d1/df/d20/d35/ff2 [4194304,4194304] 0 2026-03-10T06:22:39.083 INFO:tasks.workunit.client.1.vm06.stdout:5/910: dread d8/db/d54/d8a/d39/d72/f8b [0,4194304] 0 
2026-03-10T06:22:39.090 INFO:tasks.workunit.client.1.vm06.stdout:5/911: creat d8/db/d54/d8a/d39/d6c/de7/f10f x:0 0 0 2026-03-10T06:22:39.090 INFO:tasks.workunit.client.1.vm06.stdout:5/912: readlink d8/d9/l13 0 2026-03-10T06:22:39.102 INFO:tasks.workunit.client.1.vm06.stdout:2/972: dwrite da/d13/d1c/d7d/ddf/d61/d68/fcc [0,4194304] 0 2026-03-10T06:22:39.118 INFO:tasks.workunit.client.1.vm06.stdout:8/989: dwrite d1/f18 [0,4194304] 0 2026-03-10T06:22:39.123 INFO:tasks.workunit.client.1.vm06.stdout:2/973: dwrite da/d13/d1a/dc7/d120/d124/db7/f104 [0,4194304] 0 2026-03-10T06:22:39.128 INFO:tasks.workunit.client.1.vm06.stdout:5/913: dwrite d8/db/d54/d8a/f53 [0,4194304] 0 2026-03-10T06:22:39.128 INFO:tasks.workunit.client.1.vm06.stdout:5/914: chown d8/db/d54/d8a/d74/f42 17 1 2026-03-10T06:22:39.130 INFO:tasks.workunit.client.1.vm06.stdout:8/990: symlink d1/df/d20/d21/d5e/d79/d118/d130/l138 0 2026-03-10T06:22:39.140 INFO:tasks.workunit.client.1.vm06.stdout:2/974: dwrite da/d13/d1c/d7d/fe8 [0,4194304] 0 2026-03-10T06:22:39.144 INFO:tasks.workunit.client.1.vm06.stdout:8/991: fdatasync d1/d2c/d99/ddc/ff9 0 2026-03-10T06:22:39.144 INFO:tasks.workunit.client.1.vm06.stdout:5/915: fsync d8/d9/fee 0 2026-03-10T06:22:39.149 INFO:tasks.workunit.client.1.vm06.stdout:5/916: rename d8/de6/fec to d8/d9/f110 0 2026-03-10T06:22:39.159 INFO:tasks.workunit.client.1.vm06.stdout:2/975: truncate da/d13/d1c/d1d/d44/d46/de2/d112/fd0 1041173 0 2026-03-10T06:22:39.159 INFO:tasks.workunit.client.1.vm06.stdout:8/992: symlink d1/df/d11/l139 0 2026-03-10T06:22:39.159 INFO:tasks.workunit.client.1.vm06.stdout:2/976: symlink da/d13/d1c/d7d/l13e 0 2026-03-10T06:22:39.159 INFO:tasks.workunit.client.1.vm06.stdout:8/993: creat d1/df/d20/d35/dac/dbf/f13a x:0 0 0 2026-03-10T06:22:39.159 INFO:tasks.workunit.client.1.vm06.stdout:5/917: dwrite d8/db/d57/d83/fc3 [0,4194304] 0 2026-03-10T06:22:39.178 INFO:tasks.workunit.client.1.vm06.stdout:8/994: link d1/df/f71 d1/df/d11/da1/d132/f13b 0 2026-03-10T06:22:39.183 
INFO:tasks.workunit.client.1.vm06.stdout:2/977: link da/c87 da/d13/d1c/d1d/d44/d46/c13f 0 2026-03-10T06:22:39.190 INFO:tasks.workunit.client.1.vm06.stdout:5/918: rmdir d8/db/d54/d67/d46/d68/d10d 0 2026-03-10T06:22:39.193 INFO:tasks.workunit.client.1.vm06.stdout:2/978: mkdir da/d13/d1c/d1d/d44/d46/de2/d112/d140 0 2026-03-10T06:22:39.194 INFO:tasks.workunit.client.1.vm06.stdout:5/919: fdatasync d8/db/d54/d8a/d39/f41 0 2026-03-10T06:22:39.196 INFO:tasks.workunit.client.1.vm06.stdout:2/979: fdatasync da/d13/d1a/dc7/d120/d124/f85 0 2026-03-10T06:22:39.197 INFO:tasks.workunit.client.1.vm06.stdout:2/980: read da/d13/d1c/d43/f91 [411962,66016] 0 2026-03-10T06:22:39.198 INFO:tasks.workunit.client.1.vm06.stdout:2/981: write da/d13/d1c/d1d/d44/dc4/f126 [570214,53922] 0 2026-03-10T06:22:39.199 INFO:tasks.workunit.client.1.vm06.stdout:5/920: write d8/db/d54/d67/d46/d6e/fa7 [1809726,4888] 0 2026-03-10T06:22:39.205 INFO:tasks.workunit.client.1.vm06.stdout:2/982: creat da/d13/d1a/dc7/d120/d124/de7/d106/f141 x:0 0 0 2026-03-10T06:22:39.206 INFO:tasks.workunit.client.1.vm06.stdout:5/921: symlink d8/db/d54/d67/d46/d105/l111 0 2026-03-10T06:22:39.224 INFO:tasks.workunit.client.1.vm06.stdout:5/922: sync 2026-03-10T06:22:39.226 INFO:tasks.workunit.client.1.vm06.stdout:5/923: read d8/db/d54/d8a/f31 [2553040,34535] 0 2026-03-10T06:22:39.229 INFO:tasks.workunit.client.1.vm06.stdout:5/924: rename f7 to d8/db/d7e/de1/f112 0 2026-03-10T06:22:39.246 INFO:tasks.workunit.client.1.vm06.stdout:5/925: link d8/db/d54/c8c d8/db/c113 0 2026-03-10T06:22:39.247 INFO:tasks.workunit.client.1.vm06.stdout:8/995: dwrite d1/df/fc7 [0,4194304] 0 2026-03-10T06:22:39.262 INFO:tasks.workunit.client.1.vm06.stdout:5/926: read - d8/d9/f110 zero size 2026-03-10T06:22:39.262 INFO:tasks.workunit.client.1.vm06.stdout:8/996: fsync d1/df/d20/d21/d5e/fa4 0 2026-03-10T06:22:39.263 INFO:tasks.workunit.client.1.vm06.stdout:2/983: truncate da/d13/d1c/f76 2084885 0 2026-03-10T06:22:39.270 
INFO:tasks.workunit.client.1.vm06.stdout:2/984: rename da/d13/d1a/l5c to da/d13/d1a/dc7/d120/d124/db9/l142 0 2026-03-10T06:22:39.275 INFO:tasks.workunit.client.1.vm06.stdout:5/927: mkdir d8/db/d54/d67/d46/d105/d108/d10a/d114 0 2026-03-10T06:22:39.281 INFO:tasks.workunit.client.1.vm06.stdout:8/997: getdents d1/df/d20/d35/dac 0 2026-03-10T06:22:39.283 INFO:tasks.workunit.client.1.vm06.stdout:5/928: link d8/db/d54/d8a/f31 d8/db/d54/d55/d80/f115 0 2026-03-10T06:22:39.286 INFO:tasks.workunit.client.1.vm06.stdout:8/998: dwrite d1/df/f6b [0,4194304] 0 2026-03-10T06:22:39.296 INFO:tasks.workunit.client.1.vm06.stdout:5/929: getdents d8/db/d57 0 2026-03-10T06:22:39.299 INFO:tasks.workunit.client.1.vm06.stdout:8/999: rename d1/df/d20/d21/d7e/c83 to d1/d2c/d99/d101/c13c 0 2026-03-10T06:22:39.302 INFO:tasks.workunit.client.1.vm06.stdout:5/930: chown d8/db/d54/d8a/de4 676112345 1 2026-03-10T06:22:39.305 INFO:tasks.workunit.client.1.vm06.stdout:2/985: link da/d13/d1a/d39/c136 da/d13/d1c/d1d/d110/c143 0 2026-03-10T06:22:39.305 INFO:tasks.workunit.client.1.vm06.stdout:5/931: chown d8/db/d54/d67/c102 448 1 2026-03-10T06:22:39.305 INFO:tasks.workunit.client.1.vm06.stdout:5/932: write d8/db/d54/d55/fa3 [1119905,13444] 0 2026-03-10T06:22:39.308 INFO:tasks.workunit.client.1.vm06.stdout:2/986: chown da/d13/d1c/d1d/d44/c88 42004250 1 2026-03-10T06:22:39.309 INFO:tasks.workunit.client.1.vm06.stdout:5/933: unlink d8/db/d54/d67/d46/d6e/da2/laf 0 2026-03-10T06:22:39.312 INFO:tasks.workunit.client.1.vm06.stdout:2/987: rename da/d13/d1a/d39/d35/c4e to da/d13/d1c/d1d/d44/d46/c144 0 2026-03-10T06:22:39.314 INFO:tasks.workunit.client.1.vm06.stdout:5/934: rename d8/db/d54/d67/d46/d68/dc1/c10c to d8/db/d7e/c116 0 2026-03-10T06:22:39.316 INFO:tasks.workunit.client.1.vm06.stdout:2/988: chown da/d13/d1a/f27 0 1 2026-03-10T06:22:39.318 INFO:tasks.workunit.client.1.vm06.stdout:2/989: read da/d13/d5e/ff8 [159278,107224] 0 2026-03-10T06:22:39.318 INFO:tasks.workunit.client.1.vm06.stdout:5/935: creat 
d8/db/d54/d67/d46/d105/d108/d10a/d114/f117 x:0 0 0 2026-03-10T06:22:39.321 INFO:tasks.workunit.client.1.vm06.stdout:5/936: truncate d8/d9/f14 1751077 0 2026-03-10T06:22:39.322 INFO:tasks.workunit.client.1.vm06.stdout:2/990: truncate da/d13/d1a/dc7/d120/d124/db9/f12f 2268190 0 2026-03-10T06:22:39.329 INFO:tasks.workunit.client.1.vm06.stdout:2/991: dread da/d13/d1c/d1d/d44/dc4/f126 [0,4194304] 0 2026-03-10T06:22:39.333 INFO:tasks.workunit.client.1.vm06.stdout:5/937: dwrite d8/db/d54/d8a/d39/f69 [4194304,4194304] 0 2026-03-10T06:22:39.336 INFO:tasks.workunit.client.1.vm06.stdout:5/938: write d8/db/ffd [369222,123028] 0 2026-03-10T06:22:39.338 INFO:tasks.workunit.client.1.vm06.stdout:2/992: mkdir da/d13/d1c/d1d/d44/d145 0 2026-03-10T06:22:39.339 INFO:tasks.workunit.client.1.vm06.stdout:2/993: write da/d13/f5b [5952843,91253] 0 2026-03-10T06:22:39.344 INFO:tasks.workunit.client.1.vm06.stdout:5/939: rmdir d8/de6/df6 39 2026-03-10T06:22:39.346 INFO:tasks.workunit.client.1.vm06.stdout:2/994: rename da/da8/ca9 to da/d13/d1c/d1d/d44/d145/c146 0 2026-03-10T06:22:39.346 INFO:tasks.workunit.client.1.vm06.stdout:5/940: readlink d8/db/d54/d8a/d39/lb6 0 2026-03-10T06:22:39.351 INFO:tasks.workunit.client.1.vm06.stdout:5/941: dread d8/db/d54/d55/d80/fdc [0,4194304] 0 2026-03-10T06:22:39.353 INFO:tasks.workunit.client.1.vm06.stdout:2/995: dwrite da/d13/d1a/f27 [0,4194304] 0 2026-03-10T06:22:39.364 INFO:tasks.workunit.client.1.vm06.stdout:5/942: dwrite d8/db/d54/d67/d46/d105/d108/d10a/d114/f117 [0,4194304] 0 2026-03-10T06:22:39.369 INFO:tasks.workunit.client.1.vm06.stdout:5/943: write d8/db/d54/d8a/d74/f85 [524371,71575] 0 2026-03-10T06:22:39.371 INFO:tasks.workunit.client.1.vm06.stdout:2/996: dwrite da/d13/d1a/f101 [0,4194304] 0 2026-03-10T06:22:39.371 INFO:tasks.workunit.client.1.vm06.stdout:5/944: read d8/db/d54/fc9 [234066,63186] 0 2026-03-10T06:22:39.382 INFO:tasks.workunit.client.1.vm06.stdout:2/997: mkdir da/d13/d1a/d147 0 2026-03-10T06:22:39.383 
INFO:tasks.workunit.client.1.vm06.stdout:2/998: dread - da/d13/d1c/d7d/fa4 zero size 2026-03-10T06:22:39.385 INFO:tasks.workunit.client.1.vm06.stdout:2/999: readlink da/d13/d1a/l34 0 2026-03-10T06:22:39.404 INFO:tasks.workunit.client.1.vm06.stdout:5/945: dread d8/db/d54/d67/d46/d68/dc1/fce [0,4194304] 0 2026-03-10T06:22:39.405 INFO:tasks.workunit.client.1.vm06.stdout:5/946: readlink d8/db/lca 0 2026-03-10T06:22:39.406 INFO:tasks.workunit.client.1.vm06.stdout:5/947: readlink d8/db/d54/d8a/l32 0 2026-03-10T06:22:39.411 INFO:tasks.workunit.client.1.vm06.stdout:5/948: mknod d8/db/d54/d55/c118 0 2026-03-10T06:22:39.446 INFO:tasks.workunit.client.1.vm06.stdout:5/949: dwrite d8/db/d54/d8a/d74/f71 [0,4194304] 0 2026-03-10T06:22:39.471 INFO:tasks.workunit.client.1.vm06.stdout:5/950: dwrite d8/db/d54/d8a/d74/f3b [0,4194304] 0 2026-03-10T06:22:39.476 INFO:tasks.workunit.client.1.vm06.stdout:5/951: link d8/db/d57/l8f d8/de6/l119 0 2026-03-10T06:22:39.493 INFO:tasks.workunit.client.1.vm06.stdout:5/952: write d8/d9/f11 [1721575,116962] 0 2026-03-10T06:22:39.494 INFO:tasks.workunit.client.1.vm06.stdout:5/953: dread - d8/db/d54/d8a/d74/d90/fcd zero size 2026-03-10T06:22:39.497 INFO:tasks.workunit.client.1.vm06.stdout:5/954: dwrite d8/db/d54/d8a/d74/f85 [0,4194304] 0 2026-03-10T06:22:39.511 INFO:tasks.workunit.client.1.vm06.stdout:5/955: link d8/db/d54/d67/d46/d68/dc1/dda/le3 d8/db/df8/l11a 0 2026-03-10T06:22:39.512 INFO:tasks.workunit.client.1.vm06.stdout:5/956: mkdir d8/db/d57/d83/d11b 0 2026-03-10T06:22:39.515 INFO:tasks.workunit.client.1.vm06.stdout:5/957: getdents d8/db/d57 0 2026-03-10T06:22:39.523 INFO:tasks.workunit.client.1.vm06.stdout:5/958: dread d8/db/d54/d8a/f4d [0,4194304] 0 2026-03-10T06:22:39.524 INFO:tasks.workunit.client.1.vm06.stdout:5/959: read - d8/db/d54/d55/d80/fd0 zero size 2026-03-10T06:22:39.526 INFO:tasks.workunit.client.1.vm06.stdout:5/960: fdatasync d8/db/d54/d8a/fc7 0 2026-03-10T06:22:39.527 INFO:tasks.workunit.client.1.vm06.stdout:5/961: write 
d8/db/d54/d67/d46/ff5 [505589,110576] 0 2026-03-10T06:22:39.528 INFO:tasks.workunit.client.1.vm06.stdout:5/962: write d8/db/d54/d67/d46/ff5 [858182,79765] 0 2026-03-10T06:22:39.538 INFO:tasks.workunit.client.1.vm06.stdout:5/963: getdents d8/db/d54/d67/d46 0 2026-03-10T06:22:39.541 INFO:tasks.workunit.client.1.vm06.stdout:5/964: mknod d8/db/d54/d67/d46/d105/d108/d10a/c11c 0 2026-03-10T06:22:39.543 INFO:tasks.workunit.client.1.vm06.stdout:5/965: mknod d8/db/d54/d67/d46/d6e/c11d 0 2026-03-10T06:22:39.558 INFO:tasks.workunit.client.1.vm06.stdout:5/966: dread d8/db/d54/d8a/fc7 [0,4194304] 0 2026-03-10T06:22:39.560 INFO:tasks.workunit.client.1.vm06.stdout:5/967: symlink d8/d9/ddd/de9/l11e 0 2026-03-10T06:22:39.560 INFO:tasks.workunit.client.1.vm06.stdout:5/968: write d8/db/f45 [4770269,25453] 0 2026-03-10T06:22:39.564 INFO:tasks.workunit.client.1.vm06.stdout:5/969: mknod d8/db/d54/d8a/d39/c11f 0 2026-03-10T06:22:39.575 INFO:tasks.workunit.client.1.vm06.stdout:5/970: dread d8/db/fd2 [0,4194304] 0 2026-03-10T06:22:39.577 INFO:tasks.workunit.client.1.vm06.stdout:5/971: unlink d8/db/f1f 0 2026-03-10T06:22:39.579 INFO:tasks.workunit.client.1.vm06.stdout:5/972: mkdir d8/db/d54/d67/d46/d68/d120 0 2026-03-10T06:22:39.580 INFO:tasks.workunit.client.1.vm06.stdout:5/973: dread d8/db/d54/d8a/f4d [0,4194304] 0 2026-03-10T06:22:39.581 INFO:tasks.workunit.client.1.vm06.stdout:5/974: chown d8/db/d54/d8a/d74/f62 16918348 1 2026-03-10T06:22:39.583 INFO:tasks.workunit.client.1.vm06.stdout:5/975: mknod d8/db/d54/d55/d80/c121 0 2026-03-10T06:22:39.587 INFO:tasks.workunit.client.1.vm06.stdout:5/976: rename d8/d9/f8e to d8/d9/ddd/f122 0 2026-03-10T06:22:39.600 INFO:tasks.workunit.client.1.vm06.stdout:5/977: dread d8/db/d54/d8a/d74/f42 [0,4194304] 0 2026-03-10T06:22:39.600 INFO:tasks.workunit.client.1.vm06.stdout:5/978: readlink d8/db/d54/d67/d46/d6e/l7c 0 2026-03-10T06:22:39.603 INFO:tasks.workunit.client.1.vm06.stdout:5/979: dwrite d8/db/fde [0,4194304] 0 2026-03-10T06:22:39.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:39 vm06.local ceph-mon[58974]: pgmap v18: 65 pgs: 65 active+clean; 2.0 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 45 MiB/s rd, 109 MiB/s wr, 270 op/s 2026-03-10T06:22:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:39 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:39.654 INFO:tasks.workunit.client.1.vm06.stdout:5/980: dwrite d8/db/d54/d8a/d39/f6a [0,4194304] 0 2026-03-10T06:22:39.663 INFO:tasks.workunit.client.1.vm06.stdout:5/981: dread d8/db/d57/f75 [0,4194304] 0 2026-03-10T06:22:39.677 INFO:tasks.workunit.client.1.vm06.stdout:5/982: mkdir d8/db/d54/d8a/d39/d6c/de7/d123 0 2026-03-10T06:22:39.677 INFO:tasks.workunit.client.1.vm06.stdout:5/983: creat d8/db/d54/d67/d46/d68/dc1/df9/f124 x:0 0 0 2026-03-10T06:22:39.677 INFO:tasks.workunit.client.1.vm06.stdout:5/984: readlink d8/db/df8/l11a 0 2026-03-10T06:22:39.677 INFO:tasks.workunit.client.1.vm06.stdout:5/985: unlink d8/d9/ddd/fef 0 2026-03-10T06:22:39.677 INFO:tasks.workunit.client.1.vm06.stdout:5/986: dread d8/db/fbc [0,4194304] 0 2026-03-10T06:22:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:39 vm04.local ceph-mon[51058]: pgmap v18: 65 pgs: 65 active+clean; 2.0 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 45 MiB/s rd, 109 MiB/s wr, 270 op/s 2026-03-10T06:22:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:39 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:39.703 INFO:tasks.workunit.client.1.vm06.stdout:5/987: write d8/db/d57/d83/f99 [1786261,5653] 0 2026-03-10T06:22:39.704 INFO:tasks.workunit.client.1.vm06.stdout:5/988: chown d8/de6/l119 868 1 2026-03-10T06:22:39.706 INFO:tasks.workunit.client.1.vm06.stdout:5/989: dread d8/db/d54/d8a/d74/f66 
[0,4194304] 0 2026-03-10T06:22:39.707 INFO:tasks.workunit.client.1.vm06.stdout:5/990: dread - d8/db/d54/d8a/d39/d6c/f100 zero size 2026-03-10T06:22:39.708 INFO:tasks.workunit.client.1.vm06.stdout:5/991: mkdir d8/db/d125 0 2026-03-10T06:22:39.709 INFO:tasks.workunit.client.1.vm06.stdout:5/992: mknod d8/db/d54/d55/c126 0 2026-03-10T06:22:39.710 INFO:tasks.workunit.client.1.vm06.stdout:5/993: write d8/db/d54/d8a/d74/d90/fb5 [186368,81370] 0 2026-03-10T06:22:39.714 INFO:tasks.workunit.client.1.vm06.stdout:5/994: dwrite d8/db/d57/f75 [0,4194304] 0 2026-03-10T06:22:39.719 INFO:tasks.workunit.client.1.vm06.stdout:5/995: truncate d8/db/d54/d67/d46/d6e/faa 4776426 0 2026-03-10T06:22:39.723 INFO:tasks.workunit.client.1.vm06.stdout:5/996: readlink d8/db/lb4 0 2026-03-10T06:22:39.723 INFO:tasks.workunit.client.1.vm06.stdout:5/997: read - d8/db/d54/d8a/d74/fbe zero size 2026-03-10T06:22:39.723 INFO:tasks.workunit.client.1.vm06.stdout:5/998: write d8/db/d54/d8a/d39/d6c/f100 [30160,17931] 0 2026-03-10T06:22:39.723 INFO:tasks.workunit.client.1.vm06.stdout:5/999: chown d8/db/fde 5141 1 2026-03-10T06:22:39.725 INFO:tasks.workunit.client.1.vm06.stderr:+ rm -rf -- ./tmp.NoDCIoppvH 2026-03-10T06:22:40.697 INFO:tasks.workunit.client.0.vm04.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T06:22:40.701 INFO:tasks.workunit.client.0.vm04.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T06:22:40.701 INFO:tasks.workunit.client.0.vm04.stderr:+ make 2026-03-10T06:22:40.735 INFO:tasks.workunit.client.0.vm04.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T06:22:40.854 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:40 vm04.local ceph-mon[51058]: 
Upgrade: Updating node-exporter.vm06 (2/2) 2026-03-10T06:22:40.854 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:40 vm04.local ceph-mon[51058]: Deploying daemon node-exporter.vm06 on vm06 2026-03-10T06:22:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:40 vm06.local ceph-mon[58974]: Upgrade: Updating node-exporter.vm06 (2/2) 2026-03-10T06:22:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:40 vm06.local ceph-mon[58974]: Deploying daemon node-exporter.vm06 on vm06 2026-03-10T06:22:41.004 INFO:tasks.workunit.client.0.vm04.stderr:++ readlink -f fsstress 2026-03-10T06:22:41.005 INFO:tasks.workunit.client.0.vm04.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T06:22:41.005 INFO:tasks.workunit.client.0.vm04.stderr:+ popd 2026-03-10T06:22:41.006 INFO:tasks.workunit.client.0.vm04.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T06:22:41.006 INFO:tasks.workunit.client.0.vm04.stderr:+ popd 2026-03-10T06:22:41.007 INFO:tasks.workunit.client.0.vm04.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-10T06:22:41.007 INFO:tasks.workunit.client.0.vm04.stderr:++ mktemp -d -p . 2026-03-10T06:22:41.009 INFO:tasks.workunit.client.0.vm04.stderr:+ T=./tmp.2mByBn89Ip 2026-03-10T06:22:41.009 INFO:tasks.workunit.client.0.vm04.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.2mByBn89Ip -l 1 -n 1000 -p 10 -v 2026-03-10T06:22:41.013 INFO:tasks.workunit.client.0.vm04.stdout:seed = 1773112769 2026-03-10T06:22:41.022 INFO:tasks.workunit.client.0.vm04.stdout:7/0: dwrite - no filename 2026-03-10T06:22:41.027 INFO:tasks.workunit.client.0.vm04.stdout:8/0: chown . 
2593328 1 2026-03-10T06:22:41.027 INFO:tasks.workunit.client.0.vm04.stdout:8/1: write - no filename 2026-03-10T06:22:41.027 INFO:tasks.workunit.client.0.vm04.stdout:8/2: read - no filename 2026-03-10T06:22:41.027 INFO:tasks.workunit.client.0.vm04.stdout:8/3: chown . 7735 1 2026-03-10T06:22:41.029 INFO:tasks.workunit.client.0.vm04.stdout:7/1: creat f0 x:0 0 0 2026-03-10T06:22:41.035 INFO:tasks.workunit.client.0.vm04.stdout:8/4: symlink l0 0 2026-03-10T06:22:41.035 INFO:tasks.workunit.client.0.vm04.stdout:7/2: dwrite f0 [0,4194304] 0 2026-03-10T06:22:41.036 INFO:tasks.workunit.client.0.vm04.stdout:9/0: creat f0 x:0 0 0 2026-03-10T06:22:41.046 INFO:tasks.workunit.client.0.vm04.stdout:9/1: dwrite f0 [0,4194304] 0 2026-03-10T06:22:41.046 INFO:tasks.workunit.client.0.vm04.stdout:7/3: write f0 [4021411,76000] 0 2026-03-10T06:22:41.049 INFO:tasks.workunit.client.0.vm04.stdout:8/5: creat f1 x:0 0 0 2026-03-10T06:22:41.049 INFO:tasks.workunit.client.0.vm04.stdout:8/6: write f1 [100896,97983] 0 2026-03-10T06:22:41.050 INFO:tasks.workunit.client.0.vm04.stdout:5/0: unlink - no file 2026-03-10T06:22:41.063 INFO:tasks.workunit.client.0.vm04.stdout:7/4: mknod c1 0 2026-03-10T06:22:41.063 INFO:tasks.workunit.client.0.vm04.stdout:6/0: creat f0 x:0 0 0 2026-03-10T06:22:41.063 INFO:tasks.workunit.client.0.vm04.stdout:6/1: write f0 [614755,69511] 0 2026-03-10T06:22:41.064 INFO:tasks.workunit.client.0.vm04.stdout:6/2: write f0 [130109,98624] 0 2026-03-10T06:22:41.070 INFO:tasks.workunit.client.0.vm04.stdout:6/3: dwrite f0 [0,4194304] 0 2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:9/2: mknod c1 0 2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:7/5: rename c1 to c2 0 2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:4/0: rmdir - no directory 2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:4/1: dwrite - no filename 2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:7/6: read f0 [3997306,11687] 0 
2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:5/1: creat f0 x:0 0 0
2026-03-10T06:22:41.079 INFO:tasks.workunit.client.0.vm04.stdout:5/2: fdatasync f0 0
2026-03-10T06:22:41.085 INFO:tasks.workunit.client.0.vm04.stdout:3/0: dwrite - no filename
2026-03-10T06:22:41.085 INFO:tasks.workunit.client.0.vm04.stdout:9/3: dread f0 [0,4194304] 0
2026-03-10T06:22:41.087 INFO:tasks.workunit.client.0.vm04.stdout:4/2: symlink l0 0
2026-03-10T06:22:41.087 INFO:tasks.workunit.client.0.vm04.stdout:4/3: fdatasync - no filename
2026-03-10T06:22:41.088 INFO:tasks.workunit.client.0.vm04.stdout:4/4: creat f1 x:0 0 0
2026-03-10T06:22:41.089 INFO:tasks.workunit.client.0.vm04.stdout:3/1: creat f0 x:0 0 0
2026-03-10T06:22:41.091 INFO:tasks.workunit.client.0.vm04.stdout:2/0: write - no filename
2026-03-10T06:22:41.091 INFO:tasks.workunit.client.0.vm04.stdout:2/1: write - no filename
2026-03-10T06:22:41.092 INFO:tasks.workunit.client.0.vm04.stdout:4/5: mkdir d2 0
2026-03-10T06:22:41.094 INFO:tasks.workunit.client.0.vm04.stdout:4/6: write f1 [189900,85133] 0
2026-03-10T06:22:41.096 INFO:tasks.workunit.client.0.vm04.stdout:0/0: stat - no entries
2026-03-10T06:22:41.097 INFO:tasks.workunit.client.0.vm04.stdout:1/0: mkdir d0 0
2026-03-10T06:22:41.097 INFO:tasks.workunit.client.0.vm04.stdout:1/1: read - no filename
2026-03-10T06:22:41.097 INFO:tasks.workunit.client.0.vm04.stdout:1/2: link - no file
2026-03-10T06:22:41.097 INFO:tasks.workunit.client.0.vm04.stdout:1/3: write - no filename
2026-03-10T06:22:41.098 INFO:tasks.workunit.client.0.vm04.stdout:1/4: chown d0 142 1
2026-03-10T06:22:41.099 INFO:tasks.workunit.client.0.vm04.stdout:4/7: rename f1 to d2/f3 0
2026-03-10T06:22:41.101 INFO:tasks.workunit.client.0.vm04.stdout:2/2: getdents . 0
2026-03-10T06:22:41.103 INFO:tasks.workunit.client.0.vm04.stdout:3/2: dwrite f0 [0,4194304] 0
2026-03-10T06:22:41.104 INFO:tasks.workunit.client.0.vm04.stdout:9/4: fdatasync f0 0
2026-03-10T06:22:41.106 INFO:tasks.workunit.client.0.vm04.stdout:3/3: write f0 [1307842,125386] 0
2026-03-10T06:22:41.106 INFO:tasks.workunit.client.0.vm04.stdout:3/4: chown f0 14088317 1
2026-03-10T06:22:41.111 INFO:tasks.workunit.client.0.vm04.stdout:0/1: mkdir d0 0
2026-03-10T06:22:41.111 INFO:tasks.workunit.client.0.vm04.stdout:0/2: stat d0 0
2026-03-10T06:22:41.113 INFO:tasks.workunit.client.0.vm04.stdout:2/3: getdents . 0
2026-03-10T06:22:41.113 INFO:tasks.workunit.client.0.vm04.stdout:1/5: mkdir d0/d1 0
2026-03-10T06:22:41.114 INFO:tasks.workunit.client.0.vm04.stdout:3/5: link f0 f1 0
2026-03-10T06:22:41.116 INFO:tasks.workunit.client.0.vm04.stdout:2/4: symlink l0 0
2026-03-10T06:22:41.117 INFO:tasks.workunit.client.0.vm04.stdout:2/5: chown l0 11920647 1
2026-03-10T06:22:41.117 INFO:tasks.workunit.client.0.vm04.stdout:2/6: dread - no filename
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/7: mkdir d1 0
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/8: truncate - no filename
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/9: dread - no filename
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:3/6: creat f2 x:0 0 0
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/10: rename d1 to d1/d2 22
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/11: fdatasync - no filename
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/12: write - no filename
2026-03-10T06:22:41.118 INFO:tasks.workunit.client.0.vm04.stdout:2/13: dwrite - no filename
2026-03-10T06:22:41.121 INFO:tasks.workunit.client.0.vm04.stdout:1/6: rmdir d0/d1 0
2026-03-10T06:22:41.121 INFO:tasks.workunit.client.0.vm04.stdout:9/5: dwrite f0 [0,4194304] 0
2026-03-10T06:22:41.123 INFO:tasks.workunit.client.0.vm04.stdout:3/7: creat f3 x:0 0 0
2026-03-10T06:22:41.128 INFO:tasks.workunit.client.0.vm04.stdout:1/7: mknod d0/c2 0
2026-03-10T06:22:41.131 INFO:tasks.workunit.client.0.vm04.stdout:9/6: mkdir d2 0
2026-03-10T06:22:41.131 INFO:tasks.workunit.client.0.vm04.stdout:9/7: write f0 [1736763,94137] 0
2026-03-10T06:22:41.135 INFO:tasks.workunit.client.0.vm04.stdout:3/8: mkdir d4 0
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:3/9: stat f2 0
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:3/10: chown f0 644 1
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:2/14: creat d1/f3 x:0 0 0
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:2/15: truncate d1/f3 936170 0
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:2/16: readlink l0 0
2026-03-10T06:22:41.138 INFO:tasks.workunit.client.0.vm04.stdout:2/17: fdatasync d1/f3 0
2026-03-10T06:22:41.139 INFO:tasks.workunit.client.0.vm04.stdout:1/8: mkdir d0/d3 0
2026-03-10T06:22:41.139 INFO:tasks.workunit.client.0.vm04.stdout:1/9: dread - no filename
2026-03-10T06:22:41.144 INFO:tasks.workunit.client.0.vm04.stdout:2/18: symlink d1/l4 0
2026-03-10T06:22:41.148 INFO:tasks.workunit.client.0.vm04.stdout:1/10: creat d0/f4 x:0 0 0
2026-03-10T06:22:41.148 INFO:tasks.workunit.client.0.vm04.stdout:1/11: write d0/f4 [931695,83483] 0
2026-03-10T06:22:41.148 INFO:tasks.workunit.client.0.vm04.stdout:9/8: mkdir d2/d3 0
2026-03-10T06:22:41.148 INFO:tasks.workunit.client.0.vm04.stdout:3/11: mknod d4/c5 0
2026-03-10T06:22:41.150 INFO:tasks.workunit.client.0.vm04.stdout:3/12: mkdir d4/d6 0
2026-03-10T06:22:41.151 INFO:tasks.workunit.client.0.vm04.stdout:8/7: fsync f1 0
2026-03-10T06:22:41.155 INFO:tasks.workunit.client.0.vm04.stdout:9/9: link f0 d2/d3/f4 0
2026-03-10T06:22:41.156 INFO:tasks.workunit.client.0.vm04.stdout:8/8: dwrite f1 [0,4194304] 0
2026-03-10T06:22:41.156 INFO:tasks.workunit.client.0.vm04.stdout:8/9: rmdir - no directory
2026-03-10T06:22:41.157 INFO:tasks.workunit.client.0.vm04.stdout:9/10: mknod d2/c5 0
2026-03-10T06:22:41.158 INFO:tasks.workunit.client.0.vm04.stdout:9/11: write f0 [1413530,16534] 0
2026-03-10T06:22:41.159 INFO:tasks.workunit.client.0.vm04.stdout:8/10: rename f1 to f2 0
2026-03-10T06:22:41.160 INFO:tasks.workunit.client.0.vm04.stdout:9/12: write d2/d3/f4 [478732,27506] 0
2026-03-10T06:22:41.161 INFO:tasks.workunit.client.0.vm04.stdout:8/11: creat f3 x:0 0 0
2026-03-10T06:22:41.161 INFO:tasks.workunit.client.0.vm04.stdout:9/13: rmdir d2/d3 39
2026-03-10T06:22:41.197 INFO:tasks.workunit.client.0.vm04.stdout:4/8: dread d2/f3 [0,4194304] 0
2026-03-10T06:22:41.199 INFO:tasks.workunit.client.0.vm04.stdout:6/4: write f0 [4564013,33222] 0
2026-03-10T06:22:41.199 INFO:tasks.workunit.client.0.vm04.stdout:6/5: chown f0 782856785 1
2026-03-10T06:22:41.208 INFO:tasks.workunit.client.0.vm04.stdout:2/19: getdents d1 0
2026-03-10T06:22:41.217 INFO:tasks.workunit.client.0.vm04.stdout:1/12: rmdir d0 39
2026-03-10T06:22:41.217 INFO:tasks.workunit.client.0.vm04.stdout:8/12: fsync f3 0
2026-03-10T06:22:41.217 INFO:tasks.workunit.client.0.vm04.stdout:8/13: fsync f3 0
2026-03-10T06:22:41.217 INFO:tasks.workunit.client.0.vm04.stdout:8/14: dread - f3 zero size
2026-03-10T06:22:41.250 INFO:tasks.workunit.client.0.vm04.stdout:6/6: fsync f0 0
2026-03-10T06:22:41.341 INFO:tasks.workunit.client.0.vm04.stdout:9/14: creat d2/f6 x:0 0 0
2026-03-10T06:22:41.341 INFO:tasks.workunit.client.0.vm04.stdout:9/15: chown d2/c5 1433 1
2026-03-10T06:22:41.343 INFO:tasks.workunit.client.0.vm04.stdout:7/7: rename c2 to c3 0
2026-03-10T06:22:41.343 INFO:tasks.workunit.client.0.vm04.stdout:4/9: creat d2/f4 x:0 0 0
2026-03-10T06:22:41.344 INFO:tasks.workunit.client.0.vm04.stdout:4/10: write d2/f4 [796587,61053] 0
2026-03-10T06:22:41.344 INFO:tasks.workunit.client.0.vm04.stdout:4/11: chown d2 265936 1
2026-03-10T06:22:41.345 INFO:tasks.workunit.client.0.vm04.stdout:4/12: read d2/f3 [240832,18661] 0
2026-03-10T06:22:41.347 INFO:tasks.workunit.client.0.vm04.stdout:2/20: creat d1/f5 x:0 0 0
2026-03-10T06:22:41.350 INFO:tasks.workunit.client.0.vm04.stdout:1/13: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:41.350 INFO:tasks.workunit.client.0.vm04.stdout:6/7: rename f0 to f1 0
2026-03-10T06:22:41.351 INFO:tasks.workunit.client.0.vm04.stdout:1/14: truncate d0/f4 1323459 0
2026-03-10T06:22:41.353 INFO:tasks.workunit.client.0.vm04.stdout:9/16: rename d2/f6 to d2/d3/f7 0
2026-03-10T06:22:41.354 INFO:tasks.workunit.client.0.vm04.stdout:9/17: chown d2/d3/f4 196947407 1
2026-03-10T06:22:41.354 INFO:tasks.workunit.client.0.vm04.stdout:1/15: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:41.356 INFO:tasks.workunit.client.0.vm04.stdout:9/18: chown f0 976 1
2026-03-10T06:22:41.356 INFO:tasks.workunit.client.0.vm04.stdout:1/16: stat d0 0
2026-03-10T06:22:41.356 INFO:tasks.workunit.client.0.vm04.stdout:1/17: chown d0/d3 0 1
2026-03-10T06:22:41.366 INFO:tasks.workunit.client.0.vm04.stdout:7/8: dwrite f0 [4194304,4194304] 0
2026-03-10T06:22:41.374 INFO:tasks.workunit.client.0.vm04.stdout:4/13: symlink d2/l5 0
2026-03-10T06:22:41.376 INFO:tasks.workunit.client.0.vm04.stdout:4/14: dwrite d2/f4 [0,4194304] 0
2026-03-10T06:22:41.380 INFO:tasks.workunit.client.0.vm04.stdout:6/8: mkdir d2 0
2026-03-10T06:22:41.385 INFO:tasks.workunit.client.0.vm04.stdout:1/18: creat d0/f5 x:0 0 0
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:7/9: unlink f0 0
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:7/10: dwrite - no filename
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:7/11: readlink - no filename
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:7/12: dread - no filename
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:2/21: mknod d1/c6 0
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:6/9: write f1 [5436964,118348] 0
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:9/19: mkdir d2/d8 0
2026-03-10T06:22:41.413 INFO:tasks.workunit.client.0.vm04.stdout:1/19: write d0/f4 [2878465,1450] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:1/20: dread - d0/f5 zero size
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:1/21: write d0/f5 [318420,54534] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:6/10: dwrite f1 [4194304,4194304] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/20: dread d2/d3/f4 [0,4194304] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/21: dread - d2/d3/f7 zero size
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/22: dread - d2/d3/f7 zero size
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:7/13: mkdir d4 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:7/14: truncate - no filename
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:7/15: write - no filename
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:1/22: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:1/23: chown d0/d3 288651 1
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:6/11: creat d2/f3 x:0 0 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/23: mkdir d2/d9 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:2/22: creat d1/f7 x:0 0 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:2/23: chown d1/f7 404078 1
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:6/12: write d2/f3 [1030925,1132] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:2/24: read - d1/f7 zero size
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/24: mkdir d2/d8/da 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:6/13: mkdir d2/d4 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:6/14: write d2/f3 [1062160,40678] 0
2026-03-10T06:22:41.414 INFO:tasks.workunit.client.0.vm04.stdout:9/25: read f0 [2758281,46999] 0
2026-03-10T06:22:41.416 INFO:tasks.workunit.client.0.vm04.stdout:6/15: dwrite d2/f3 [0,4194304] 0
2026-03-10T06:22:41.416 INFO:tasks.workunit.client.0.vm04.stdout:2/25: creat d1/f8 x:0 0 0
2026-03-10T06:22:41.419 INFO:tasks.workunit.client.0.vm04.stdout:6/16: symlink d2/d4/l5 0
2026-03-10T06:22:41.419 INFO:tasks.workunit.client.0.vm04.stdout:6/17: readlink d2/d4/l5 0
2026-03-10T06:22:41.420 INFO:tasks.workunit.client.0.vm04.stdout:2/26: symlink d1/l9 0
2026-03-10T06:22:41.421 INFO:tasks.workunit.client.0.vm04.stdout:6/18: dread d2/f3 [0,4194304] 0
2026-03-10T06:22:41.421 INFO:tasks.workunit.client.0.vm04.stdout:6/19: truncate f1 9164808 0
2026-03-10T06:22:41.423 INFO:tasks.workunit.client.0.vm04.stdout:6/20: creat d2/d4/f6 x:0 0 0
2026-03-10T06:22:41.424 INFO:tasks.workunit.client.0.vm04.stdout:2/27: symlink d1/la 0
2026-03-10T06:22:41.424 INFO:tasks.workunit.client.0.vm04.stdout:2/28: chown d1/f5 29425220 1
2026-03-10T06:22:41.425 INFO:tasks.workunit.client.0.vm04.stdout:6/21: unlink d2/f3 0
2026-03-10T06:22:41.425 INFO:tasks.workunit.client.0.vm04.stdout:6/22: stat d2/d4/l5 0
2026-03-10T06:22:41.427 INFO:tasks.workunit.client.0.vm04.stdout:6/23: dread f1 [4194304,4194304] 0
2026-03-10T06:22:41.435 INFO:tasks.workunit.client.0.vm04.stdout:6/24: link f1 d2/f7 0
2026-03-10T06:22:41.435 INFO:tasks.workunit.client.0.vm04.stdout:6/25: stat d2/d4 0
2026-03-10T06:22:41.435 INFO:tasks.workunit.client.0.vm04.stdout:6/26: mkdir d2/d8 0
2026-03-10T06:22:41.435 INFO:tasks.workunit.client.0.vm04.stdout:6/27: write d2/f7 [9618848,98861] 0
2026-03-10T06:22:41.600 INFO:tasks.workunit.client.0.vm04.stdout:9/26: fsync d2/d3/f7 0
2026-03-10T06:22:41.600 INFO:tasks.workunit.client.0.vm04.stdout:9/27: chown d2/d3/f4 262724404 1
2026-03-10T06:22:41.605 INFO:tasks.workunit.client.0.vm04.stdout:9/28: rmdir d2/d8/da 0
2026-03-10T06:22:41.607 INFO:tasks.workunit.client.0.vm04.stdout:9/29: dread f0 [0,4194304] 0
2026-03-10T06:22:41.609 INFO:tasks.workunit.client.0.vm04.stdout:9/30: creat d2/fb x:0 0 0
2026-03-10T06:22:41.613 INFO:tasks.workunit.client.0.vm04.stdout:9/31: dwrite d2/fb [0,4194304] 0
2026-03-10T06:22:41.616 INFO:tasks.workunit.client.0.vm04.stdout:9/32: mknod d2/cc 0
2026-03-10T06:22:41.617 INFO:tasks.workunit.client.0.vm04.stdout:9/33: creat d2/d9/fd x:0 0 0
2026-03-10T06:22:41.625 INFO:tasks.workunit.client.0.vm04.stdout:9/34: dwrite d2/d3/f4 [0,4194304] 0
2026-03-10T06:22:41.646 INFO:tasks.workunit.client.0.vm04.stdout:9/35: symlink d2/d3/le 0
2026-03-10T06:22:41.646 INFO:tasks.workunit.client.0.vm04.stdout:9/36: write d2/d9/fd [959593,15239] 0
2026-03-10T06:22:41.646 INFO:tasks.workunit.client.0.vm04.stdout:9/37: dwrite d2/fb [0,4194304] 0
2026-03-10T06:22:41.646 INFO:tasks.workunit.client.0.vm04.stdout:9/38: mkdir d2/df 0
2026-03-10T06:22:41.646 INFO:tasks.workunit.client.0.vm04.stdout:9/39: link d2/d9/fd d2/d3/f10 0
2026-03-10T06:22:41.647 INFO:tasks.workunit.client.0.vm04.stdout:9/40: rename d2/df to d2/d9/d11 0
2026-03-10T06:22:41.647 INFO:tasks.workunit.client.0.vm04.stdout:9/41: dread d2/d3/f4 [0,4194304] 0
2026-03-10T06:22:41.647 INFO:tasks.workunit.client.0.vm04.stdout:9/42: creat d2/d3/f12 x:0 0 0
2026-03-10T06:22:41.647 INFO:tasks.workunit.client.0.vm04.stdout:9/43: rename d2/c5 to d2/d8/c13 0
2026-03-10T06:22:41.677 INFO:tasks.workunit.client.0.vm04.stdout:1/24: getdents d0 0
2026-03-10T06:22:41.679 INFO:tasks.workunit.client.0.vm04.stdout:1/25: symlink d0/l6 0
2026-03-10T06:22:41.682 INFO:tasks.workunit.client.0.vm04.stdout:1/26: rename d0/l6 to d0/l7 0
2026-03-10T06:22:41.683 INFO:tasks.workunit.client.0.vm04.stdout:1/27: mkdir d0/d8 0
2026-03-10T06:22:41.685 INFO:tasks.workunit.client.0.vm04.stdout:1/28: write d0/f4 [2399999,81872] 0
2026-03-10T06:22:41.688 INFO:tasks.workunit.client.0.vm04.stdout:2/29: getdents d1 0
2026-03-10T06:22:41.688 INFO:tasks.workunit.client.0.vm04.stdout:2/30: dread - d1/f7 zero size
2026-03-10T06:22:41.693 INFO:tasks.workunit.client.0.vm04.stdout:2/31: dwrite d1/f7 [0,4194304] 0
2026-03-10T06:22:41.693 INFO:tasks.workunit.client.0.vm04.stdout:2/32: fsync d1/f3 0
2026-03-10T06:22:41.695 INFO:tasks.workunit.client.0.vm04.stdout:2/33: dread d1/f7 [0,4194304] 0
2026-03-10T06:22:41.696 INFO:tasks.workunit.client.0.vm04.stdout:2/34: rmdir d1 39
2026-03-10T06:22:41.697 INFO:tasks.workunit.client.0.vm04.stdout:2/35: getdents d1 0
2026-03-10T06:22:41.698 INFO:tasks.workunit.client.0.vm04.stdout:2/36: write d1/f7 [2626355,31498] 0
2026-03-10T06:22:41.706 INFO:tasks.workunit.client.0.vm04.stdout:2/37: mkdir d1/db 0
2026-03-10T06:22:41.708 INFO:tasks.workunit.client.0.vm04.stdout:2/38: symlink d1/db/lc 0
2026-03-10T06:22:41.708 INFO:tasks.workunit.client.0.vm04.stdout:2/39: dread d1/f3 [0,4194304] 0
2026-03-10T06:22:41.712 INFO:tasks.workunit.client.0.vm04.stdout:2/40: symlink d1/db/ld 0
2026-03-10T06:22:41.712 INFO:tasks.workunit.client.0.vm04.stdout:2/41: readlink d1/l9 0
2026-03-10T06:22:41.713 INFO:tasks.workunit.client.0.vm04.stdout:2/42: creat d1/db/fe x:0 0 0
2026-03-10T06:22:41.811 INFO:tasks.workunit.client.0.vm04.stdout:6/28: getdents d2 0
2026-03-10T06:22:41.813 INFO:tasks.workunit.client.0.vm04.stdout:6/29: rename d2/d4/f6 to d2/d8/f9 0
2026-03-10T06:22:41.814 INFO:tasks.workunit.client.0.vm04.stdout:6/30: write d2/f7 [4818700,97918] 0
2026-03-10T06:22:41.815 INFO:tasks.workunit.client.0.vm04.stdout:6/31: creat d2/d4/fa x:0 0 0
2026-03-10T06:22:41.834 INFO:tasks.workunit.client.0.vm04.stdout:6/32: mknod d2/cb 0
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:4/15: sync
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:7/16: sync
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:7/17: fdatasync - no filename
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:7/18: write - no filename
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:8/15: sync
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:0/3: sync
2026-03-10T06:22:41.929 INFO:tasks.workunit.client.0.vm04.stdout:3/13: sync
2026-03-10T06:22:41.930 INFO:tasks.workunit.client.0.vm04.stdout:5/3: sync
2026-03-10T06:22:41.930 INFO:tasks.workunit.client.0.vm04.stdout:5/4: rmdir - no directory
2026-03-10T06:22:41.935 INFO:tasks.workunit.client.0.vm04.stdout:5/5: dwrite f0 [0,4194304] 0
2026-03-10T06:22:41.935 INFO:tasks.workunit.client.0.vm04.stdout:5/6: rmdir - no directory
2026-03-10T06:22:41.949 INFO:tasks.workunit.client.0.vm04.stdout:3/14: write f0 [4194412,40959] 0
2026-03-10T06:22:41.953 INFO:tasks.workunit.client.0.vm04.stdout:8/16: creat f4 x:0 0 0
2026-03-10T06:22:41.955 INFO:tasks.workunit.client.0.vm04.stdout:4/16: unlink d2/l5 0
2026-03-10T06:22:41.959 INFO:tasks.workunit.client.0.vm04.stdout:5/7: symlink l1 0
2026-03-10T06:22:41.960 INFO:tasks.workunit.client.0.vm04.stdout:3/15: rename f3 to d4/f7 0
2026-03-10T06:22:41.961 INFO:tasks.workunit.client.0.vm04.stdout:7/19: creat d4/f5 x:0 0 0
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/4: symlink d0/l1 0
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/5: chown d0 1203179 1
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/6: dread - no filename
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/7: rename d0 to d0/d2 22
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/8: rename d0 to d0/d3 22
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/9: truncate - no filename
2026-03-10T06:22:41.964 INFO:tasks.workunit.client.0.vm04.stdout:0/10: dread - no filename
2026-03-10T06:22:41.966 INFO:tasks.workunit.client.0.vm04.stdout:4/17: sync
2026-03-10T06:22:41.966 INFO:tasks.workunit.client.0.vm04.stdout:7/20: dwrite d4/f5 [0,4194304] 0
2026-03-10T06:22:41.969 INFO:tasks.workunit.client.0.vm04.stdout:5/8: creat f2 x:0 0 0
2026-03-10T06:22:41.982 INFO:tasks.workunit.client.0.vm04.stdout:4/18: dwrite d2/f3 [0,4194304] 0
2026-03-10T06:22:41.982 INFO:tasks.workunit.client.0.vm04.stdout:4/19: write d2/f4 [2030280,19788] 0
2026-03-10T06:22:41.982 INFO:tasks.workunit.client.0.vm04.stdout:4/20: chown l0 678 1
2026-03-10T06:22:41.983 INFO:tasks.workunit.client.0.vm04.stdout:7/21: creat d4/f6 x:0 0 0
2026-03-10T06:22:41.983 INFO:tasks.workunit.client.0.vm04.stdout:3/16: creat d4/d6/f8 x:0 0 0
2026-03-10T06:22:41.987 INFO:tasks.workunit.client.0.vm04.stdout:8/17: link l0 l5 0
2026-03-10T06:22:41.987 INFO:tasks.workunit.client.0.vm04.stdout:8/18: rmdir - no directory
2026-03-10T06:22:41.988 INFO:tasks.workunit.client.0.vm04.stdout:5/9: sync
2026-03-10T06:22:41.998 INFO:tasks.workunit.client.0.vm04.stdout:8/19: dwrite f3 [0,4194304] 0
2026-03-10T06:22:41.999 INFO:tasks.workunit.client.0.vm04.stdout:5/10: dwrite f0 [0,4194304] 0
2026-03-10T06:22:42.005 INFO:tasks.workunit.client.0.vm04.stdout:4/21: sync
2026-03-10T06:22:42.006 INFO:tasks.workunit.client.0.vm04.stdout:4/22: chown l0 105375754 1
2026-03-10T06:22:42.007 INFO:tasks.workunit.client.0.vm04.stdout:5/11: dwrite f2 [0,4194304] 0
2026-03-10T06:22:42.012 INFO:tasks.workunit.client.0.vm04.stdout:5/12: sync
2026-03-10T06:22:42.014 INFO:tasks.workunit.client.0.vm04.stdout:3/17: dwrite f1 [0,4194304] 0
2026-03-10T06:22:42.019 INFO:tasks.workunit.client.0.vm04.stdout:3/18: sync
2026-03-10T06:22:42.019 INFO:tasks.workunit.client.0.vm04.stdout:3/19: dread - d4/f7 zero size
2026-03-10T06:22:42.019 INFO:tasks.workunit.client.0.vm04.stdout:3/20: stat f0 0
2026-03-10T06:22:42.027 INFO:tasks.workunit.client.0.vm04.stdout:8/20: rename f2 to f6 0
2026-03-10T06:22:42.034 INFO:tasks.workunit.client.0.vm04.stdout:4/23: rename d2/f3 to d2/f6 0
2026-03-10T06:22:42.052 INFO:tasks.workunit.client.0.vm04.stdout:5/13: creat f3 x:0 0 0
2026-03-10T06:22:42.056 INFO:tasks.workunit.client.0.vm04.stdout:9/44: getdents d2/d3 0
2026-03-10T06:22:42.059 INFO:tasks.workunit.client.0.vm04.stdout:3/21: unlink f2 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:8/21: dwrite f6 [0,4194304] 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:8/22: write f4 [460888,9463] 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:5/14: mkdir d4 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:9/45: mkdir d2/d8/d14 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:3/22: rename d4/d6/f8 to d4/d6/f9 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:5/15: fdatasync f3 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:7/22: link c3 d4/c7 0
2026-03-10T06:22:42.071 INFO:tasks.workunit.client.0.vm04.stdout:3/23: write f1 [5162349,107896] 0
2026-03-10T06:22:42.072 INFO:tasks.workunit.client.0.vm04.stdout:3/24: chown f1 0 1
2026-03-10T06:22:42.075 INFO:tasks.workunit.client.0.vm04.stdout:8/23: creat f7 x:0 0 0
2026-03-10T06:22:42.078 INFO:tasks.workunit.client.0.vm04.stdout:1/29: link d0/l7 d0/d8/l9 0
2026-03-10T06:22:42.081 INFO:tasks.workunit.client.0.vm04.stdout:7/23: creat d4/f8 x:0 0 0
2026-03-10T06:22:42.085 INFO:tasks.workunit.client.0.vm04.stdout:9/46: symlink d2/d8/l15 0
2026-03-10T06:22:42.088 INFO:tasks.workunit.client.0.vm04.stdout:8/24: creat f8 x:0 0 0
2026-03-10T06:22:42.089 INFO:tasks.workunit.client.0.vm04.stdout:8/25: truncate f8 171272 0
2026-03-10T06:22:42.089 INFO:tasks.workunit.client.0.vm04.stdout:8/26: dread - f7 zero size
2026-03-10T06:22:42.089 INFO:tasks.workunit.client.0.vm04.stdout:8/27: write f4 [619038,75215] 0
2026-03-10T06:22:42.090 INFO:tasks.workunit.client.0.vm04.stdout:8/28: write f4 [1328011,25307] 0
2026-03-10T06:22:42.094 INFO:tasks.workunit.client.0.vm04.stdout:5/16: creat d4/f5 x:0 0 0
2026-03-10T06:22:42.094 INFO:tasks.workunit.client.0.vm04.stdout:5/17: dread - f3 zero size
2026-03-10T06:22:42.100 INFO:tasks.workunit.client.0.vm04.stdout:8/29: rename f3 to f9 0
2026-03-10T06:22:42.102 INFO:tasks.workunit.client.0.vm04.stdout:5/18: mkdir d4/d6 0
2026-03-10T06:22:42.104 INFO:tasks.workunit.client.0.vm04.stdout:5/19: dread f0 [0,4194304] 0
2026-03-10T06:22:42.104 INFO:tasks.workunit.client.0.vm04.stdout:9/47: mknod d2/c16 0
2026-03-10T06:22:42.108 INFO:tasks.workunit.client.0.vm04.stdout:1/30: creat d0/fa x:0 0 0
2026-03-10T06:22:42.122 INFO:tasks.workunit.client.0.vm04.stdout:9/48: dwrite d2/d3/f4 [0,4194304] 0
2026-03-10T06:22:42.131 INFO:tasks.workunit.client.0.vm04.stdout:8/30: rename l5 to la 0
2026-03-10T06:22:42.134 INFO:tasks.workunit.client.0.vm04.stdout:5/20: creat d4/d6/f7 x:0 0 0
2026-03-10T06:22:42.135 INFO:tasks.workunit.client.0.vm04.stdout:2/43: getdents d1/db 0
2026-03-10T06:22:42.140 INFO:tasks.workunit.client.0.vm04.stdout:2/44: mkdir d1/df 0
2026-03-10T06:22:42.146 INFO:tasks.workunit.client.0.vm04.stdout:2/45: dwrite d1/f3 [0,4194304] 0
2026-03-10T06:22:42.151 INFO:tasks.workunit.client.0.vm04.stdout:9/49: creat d2/f17 x:0 0 0
2026-03-10T06:22:42.154 INFO:tasks.workunit.client.0.vm04.stdout:9/50: mkdir d2/d3/d18 0
2026-03-10T06:22:42.156 INFO:tasks.workunit.client.0.vm04.stdout:1/31: link d0/c2 d0/cb 0
2026-03-10T06:22:42.158 INFO:tasks.workunit.client.0.vm04.stdout:2/46: sync
2026-03-10T06:22:42.159 INFO:tasks.workunit.client.0.vm04.stdout:9/51: mknod d2/d9/c19 0
2026-03-10T06:22:42.159 INFO:tasks.workunit.client.0.vm04.stdout:1/32: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:42.163 INFO:tasks.workunit.client.0.vm04.stdout:2/47: unlink d1/l4 0
2026-03-10T06:22:42.168 INFO:tasks.workunit.client.0.vm04.stdout:9/52: creat d2/d8/f1a x:0 0 0
2026-03-10T06:22:42.168 INFO:tasks.workunit.client.0.vm04.stdout:9/53: truncate d2/d8/f1a 965484 0
2026-03-10T06:22:42.168 INFO:tasks.workunit.client.0.vm04.stdout:2/48: creat d1/f10 x:0 0 0
2026-03-10T06:22:42.168 INFO:tasks.workunit.client.0.vm04.stdout:2/49: write d1/db/fe [1022752,49112] 0
2026-03-10T06:22:42.168 INFO:tasks.workunit.client.0.vm04.stdout:2/50: write d1/f7 [272015,94596] 0
2026-03-10T06:22:42.170 INFO:tasks.workunit.client.0.vm04.stdout:1/33: mknod d0/cc 0
2026-03-10T06:22:42.173 INFO:tasks.workunit.client.0.vm04.stdout:2/51: mkdir d1/df/d11 0
2026-03-10T06:22:42.174 INFO:tasks.workunit.client.0.vm04.stdout:1/34: stat d0/cb 0
2026-03-10T06:22:42.175 INFO:tasks.workunit.client.0.vm04.stdout:2/52: creat d1/db/f12 x:0 0 0
2026-03-10T06:22:42.180 INFO:tasks.workunit.client.0.vm04.stdout:2/53: fsync d1/f7 0
2026-03-10T06:22:42.187 INFO:tasks.workunit.client.0.vm04.stdout:6/33: rmdir d2/d4 39
2026-03-10T06:22:42.187 INFO:tasks.workunit.client.0.vm04.stdout:6/34: truncate d2/f7 10366173 0
2026-03-10T06:22:42.187 INFO:tasks.workunit.client.0.vm04.stdout:6/35: read - d2/d8/f9 zero size
2026-03-10T06:22:42.195 INFO:tasks.workunit.client.0.vm04.stdout:0/11: getdents d0 0
2026-03-10T06:22:42.195 INFO:tasks.workunit.client.0.vm04.stdout:0/12: read - no filename
2026-03-10T06:22:42.201 INFO:tasks.workunit.client.0.vm04.stdout:8/31: dwrite f9 [4194304,4194304] 0
2026-03-10T06:22:42.215 INFO:tasks.workunit.client.0.vm04.stdout:4/24: truncate d2/f4 2542470 0
2026-03-10T06:22:42.216 INFO:tasks.workunit.client.0.vm04.stdout:7/24: rmdir d4 39
2026-03-10T06:22:42.216 INFO:tasks.workunit.client.0.vm04.stdout:7/25: readlink - no filename
2026-03-10T06:22:42.218 INFO:tasks.workunit.client.0.vm04.stdout:3/25: truncate f1 49035 0
2026-03-10T06:22:42.226 INFO:tasks.workunit.client.0.vm04.stdout:5/21: truncate f0 1094513 0
2026-03-10T06:22:42.228 INFO:tasks.workunit.client.0.vm04.stdout:9/54: rmdir d2/d9 39
2026-03-10T06:22:42.248 INFO:tasks.workunit.client.0.vm04.stdout:2/54: creat d1/f13 x:0 0 0
2026-03-10T06:22:42.249 INFO:tasks.workunit.client.0.vm04.stdout:1/35: chown d0/d8/l9 22 1
2026-03-10T06:22:42.251 INFO:tasks.workunit.client.0.vm04.stdout:0/13: creat d0/f4 x:0 0 0
2026-03-10T06:22:42.252 INFO:tasks.workunit.client.0.vm04.stdout:8/32: symlink lb 0
2026-03-10T06:22:42.254 INFO:tasks.workunit.client.0.vm04.stdout:3/26: write d4/d6/f9 [417322,112556] 0
2026-03-10T06:22:42.257 INFO:tasks.workunit.client.0.vm04.stdout:9/55: rmdir d2 39
2026-03-10T06:22:42.259 INFO:tasks.workunit.client.0.vm04.stdout:1/36: truncate d0/f4 4900399 0
2026-03-10T06:22:42.260 INFO:tasks.workunit.client.0.vm04.stdout:6/36: symlink d2/d4/lc 0
2026-03-10T06:22:42.261 INFO:tasks.workunit.client.0.vm04.stdout:6/37: write d2/d4/fa [633237,129252] 0
2026-03-10T06:22:42.263 INFO:tasks.workunit.client.0.vm04.stdout:1/37: dwrite d0/fa [0,4194304] 0
2026-03-10T06:22:42.264 INFO:tasks.workunit.client.0.vm04.stdout:0/14: mkdir d0/d5 0
2026-03-10T06:22:42.264 INFO:tasks.workunit.client.0.vm04.stdout:0/15: readlink d0/l1 0
2026-03-10T06:22:42.271 INFO:tasks.workunit.client.0.vm04.stdout:8/33: symlink lc 0
2026-03-10T06:22:42.274 INFO:tasks.workunit.client.0.vm04.stdout:3/27: mkdir d4/da 0
2026-03-10T06:22:42.289 INFO:tasks.workunit.client.0.vm04.stdout:2/55: mkdir d1/df/d11/d14 0
2026-03-10T06:22:42.290 INFO:tasks.workunit.client.0.vm04.stdout:2/56: write d1/db/f12 [286937,44707] 0
2026-03-10T06:22:42.292 INFO:tasks.workunit.client.0.vm04.stdout:2/57: dwrite d1/f7 [4194304,4194304] 0
2026-03-10T06:22:42.298 INFO:tasks.workunit.client.0.vm04.stdout:6/38: creat d2/d8/fd x:0 0 0
2026-03-10T06:22:42.301 INFO:tasks.workunit.client.0.vm04.stdout:6/39: chown d2/d4/l5 727 1
2026-03-10T06:22:42.301 INFO:tasks.workunit.client.0.vm04.stdout:1/38: write d0/f5 [7855,24329] 0
2026-03-10T06:22:42.301 INFO:tasks.workunit.client.0.vm04.stdout:1/39: chown d0/d8/l9 2 1
2026-03-10T06:22:42.301 INFO:tasks.workunit.client.0.vm04.stdout:1/40: write d0/fa [2130830,51946] 0
2026-03-10T06:22:42.302 INFO:tasks.workunit.client.0.vm04.stdout:1/41: write d0/f5 [524911,100997] 0
2026-03-10T06:22:42.302 INFO:tasks.workunit.client.0.vm04.stdout:3/28: sync
2026-03-10T06:22:42.305 INFO:tasks.workunit.client.0.vm04.stdout:0/16: symlink d0/l6 0
2026-03-10T06:22:42.305 INFO:tasks.workunit.client.0.vm04.stdout:0/17: stat d0/l1 0
2026-03-10T06:22:42.308 INFO:tasks.workunit.client.0.vm04.stdout:0/18: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:42.313 INFO:tasks.workunit.client.0.vm04.stdout:8/34: readlink l0 0
2026-03-10T06:22:42.314 INFO:tasks.workunit.client.0.vm04.stdout:0/19: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:42.318 INFO:tasks.workunit.client.0.vm04.stdout:8/35: dwrite f7 [0,4194304] 0
2026-03-10T06:22:42.330 INFO:tasks.workunit.client.0.vm04.stdout:2/58: creat d1/df/d11/f15 x:0 0 0
2026-03-10T06:22:42.332 INFO:tasks.workunit.client.0.vm04.stdout:2/59: dread d1/db/f12 [0,4194304] 0
2026-03-10T06:22:42.335 INFO:tasks.workunit.client.0.vm04.stdout:3/29: mknod d4/d6/cb 0
2026-03-10T06:22:42.338 INFO:tasks.workunit.client.0.vm04.stdout:8/36: unlink f8 0
2026-03-10T06:22:42.343 INFO:tasks.workunit.client.0.vm04.stdout:8/37: dwrite f7 [0,4194304] 0
2026-03-10T06:22:42.343 INFO:tasks.workunit.client.0.vm04.stdout:4/25: getdents d2 0
2026-03-10T06:22:42.345 INFO:tasks.workunit.client.0.vm04.stdout:8/38: dread f9 [0,4194304] 0
2026-03-10T06:22:42.347 INFO:tasks.workunit.client.0.vm04.stdout:1/42: dread d0/f5 [0,4194304] 0
2026-03-10T06:22:42.351 INFO:tasks.workunit.client.0.vm04.stdout:1/43: write d0/fa [3789840,104929] 0
2026-03-10T06:22:42.353 INFO:tasks.workunit.client.0.vm04.stdout:1/44: chown d0 0 1
2026-03-10T06:22:42.353 INFO:tasks.workunit.client.0.vm04.stdout:8/39: dwrite f6 [0,4194304] 0
2026-03-10T06:22:42.364 INFO:tasks.workunit.client.0.vm04.stdout:3/30: mkdir d4/d6/dc 0
2026-03-10T06:22:42.365 INFO:tasks.workunit.client.0.vm04.stdout:3/31: dread f1 [0,4194304] 0
2026-03-10T06:22:42.367 INFO:tasks.workunit.client.0.vm04.stdout:0/20: creat d0/d5/f7 x:0 0 0
2026-03-10T06:22:42.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:41 vm06.local ceph-mon[58974]: pgmap v19: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 61 MiB/s rd, 145 MiB/s wr, 388 op/s
2026-03-10T06:22:42.368 INFO:tasks.workunit.client.0.vm04.stdout:0/21: truncate d0/f4 4814780 0
2026-03-10T06:22:42.369 INFO:tasks.workunit.client.0.vm04.stdout:5/22: dwrite f0 [0,4194304] 0
2026-03-10T06:22:42.377 INFO:tasks.workunit.client.0.vm04.stdout:7/26: link c3 d4/c9 0
2026-03-10T06:22:42.377 INFO:tasks.workunit.client.0.vm04.stdout:7/27: chown d4/f6 168242 1
2026-03-10T06:22:42.377 INFO:tasks.workunit.client.0.vm04.stdout:7/28: dread - d4/f8 zero size
2026-03-10T06:22:42.378 INFO:tasks.workunit.client.0.vm04.stdout:7/29: dread - d4/f6 zero size
2026-03-10T06:22:42.381 INFO:tasks.workunit.client.0.vm04.stdout:4/26: rename d2/f6 to d2/f7 0
2026-03-10T06:22:42.384 INFO:tasks.workunit.client.0.vm04.stdout:8/40: creat fd x:0 0 0
2026-03-10T06:22:42.387 INFO:tasks.workunit.client.0.vm04.stdout:2/60: link d1/f3 d1/df/d11/f16 0
2026-03-10T06:22:42.388 INFO:tasks.workunit.client.0.vm04.stdout:2/61: write d1/df/d11/f15 [380325,71572] 0
2026-03-10T06:22:42.391 INFO:tasks.workunit.client.0.vm04.stdout:6/40: creat d2/fe x:0 0 0
2026-03-10T06:22:42.392 INFO:tasks.workunit.client.0.vm04.stdout:3/32: write f1 [202469,103034] 0
2026-03-10T06:22:42.392 INFO:tasks.workunit.client.0.vm04.stdout:3/33: read - d4/f7 zero size
2026-03-10T06:22:42.393 INFO:tasks.workunit.client.0.vm04.stdout:3/34: chown d4 7488 1
2026-03-10T06:22:42.398 INFO:tasks.workunit.client.0.vm04.stdout:3/35: dwrite d4/d6/f9 [0,4194304] 0
2026-03-10T06:22:42.400 INFO:tasks.workunit.client.0.vm04.stdout:5/23: creat d4/d6/f8 x:0 0 0
2026-03-10T06:22:42.401 INFO:tasks.workunit.client.0.vm04.stdout:7/30: creat d4/fa x:0 0 0
2026-03-10T06:22:42.402 INFO:tasks.workunit.client.0.vm04.stdout:7/31: write d4/f6 [415851,54882] 0
2026-03-10T06:22:42.408 INFO:tasks.workunit.client.0.vm04.stdout:0/22: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:42.409 INFO:tasks.workunit.client.0.vm04.stdout:0/23: readlink d0/l6 0
2026-03-10T06:22:42.411 INFO:tasks.workunit.client.0.vm04.stdout:7/32: dwrite d4/f5 [0,4194304] 0
2026-03-10T06:22:42.420 INFO:tasks.workunit.client.0.vm04.stdout:9/56: getdents d2/d3 0
2026-03-10T06:22:42.420 INFO:tasks.workunit.client.0.vm04.stdout:2/62: symlink d1/l17 0
2026-03-10T06:22:42.421 INFO:tasks.workunit.client.0.vm04.stdout:4/27: dwrite d2/f7 [0,4194304] 0
2026-03-10T06:22:42.422 INFO:tasks.workunit.client.0.vm04.stdout:4/28: write d2/f7 [352091,92590] 0
2026-03-10T06:22:42.423 INFO:tasks.workunit.client.0.vm04.stdout:8/41: creat fe x:0 0 0
2026-03-10T06:22:42.424 INFO:tasks.workunit.client.0.vm04.stdout:9/57: dread d2/d8/f1a [0,4194304] 0
2026-03-10T06:22:42.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:41 vm04.local ceph-mon[51058]: pgmap v19: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 61 MiB/s rd, 145 MiB/s wr, 388 op/s
2026-03-10T06:22:42.431 INFO:tasks.workunit.client.0.vm04.stdout:5/24: mknod d4/c9 0
2026-03-10T06:22:42.432 INFO:tasks.workunit.client.0.vm04.stdout:0/24: mkdir d0/d5/d8 0
2026-03-10T06:22:42.433 INFO:tasks.workunit.client.0.vm04.stdout:7/33: rename d4/f8 to d4/fb 0
2026-03-10T06:22:42.433 INFO:tasks.workunit.client.0.vm04.stdout:1/45: symlink d0/d3/ld 0
2026-03-10T06:22:42.436 INFO:tasks.workunit.client.0.vm04.stdout:0/25: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:42.450 INFO:tasks.workunit.client.0.vm04.stdout:0/26: dread - d0/d5/f7 zero size
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:2/63: mkdir d1/df/d11/d18 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:8/42: mkdir df 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:3/36: creat d4/d6/dc/fd x:0 0 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:6/41: mknod d2/cf 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:8/43: chown f4 94 1
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:5/25: creat d4/d6/fa x:0 0 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:9/58: write d2/d3/f10 [683377,103672] 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:9/59: truncate d2/d3/f10 1156201 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:0/27: symlink d0/d5/l9 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:2/64: rename d1/f3 to d1/df/d11/d18/f19 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:1/46: symlink d0/d8/le 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:2/65: dread - d1/f10 zero size
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:5/26: symlink d4/lb 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:5/27: truncate d4/d6/f8 832819 0
2026-03-10T06:22:42.451 INFO:tasks.workunit.client.0.vm04.stdout:4/29: sync
2026-03-10T06:22:42.453 INFO:tasks.workunit.client.0.vm04.stdout:3/37: mkdir d4/d6/dc/de 0
2026-03-10T06:22:42.454 INFO:tasks.workunit.client.0.vm04.stdout:7/34: rename d4/c9 to d4/cc 0
2026-03-10T06:22:42.455 INFO:tasks.workunit.client.0.vm04.stdout:6/42: dwrite d2/f7 [8388608,4194304] 0
2026-03-10T06:22:42.462 INFO:tasks.workunit.client.0.vm04.stdout:8/44: rename f4 to df/f10 0
2026-03-10T06:22:42.463 INFO:tasks.workunit.client.0.vm04.stdout:8/45: chown df/f10 193812 1
2026-03-10T06:22:42.463 INFO:tasks.workunit.client.0.vm04.stdout:1/47: rmdir d0/d3 39
2026-03-10T06:22:42.463 INFO:tasks.workunit.client.0.vm04.stdout:7/35: dread - d4/fb zero size
2026-03-10T06:22:42.463 INFO:tasks.workunit.client.0.vm04.stdout:5/28: mkdir d4/dc 0
2026-03-10T06:22:42.463 INFO:tasks.workunit.client.0.vm04.stdout:5/29: write d4/d6/f8 [1554197,59184] 0
2026-03-10T06:22:42.469 INFO:tasks.workunit.client.0.vm04.stdout:9/60: mknod d2/d8/d14/c1b 0
2026-03-10T06:22:42.470 INFO:tasks.workunit.client.0.vm04.stdout:5/30: fsync f3 0
2026-03-10T06:22:42.472 INFO:tasks.workunit.client.0.vm04.stdout:5/31: dread - f3 zero size
2026-03-10T06:22:42.474 INFO:tasks.workunit.client.0.vm04.stdout:8/46: creat df/f11 x:0 0 0
2026-03-10T06:22:42.475 INFO:tasks.workunit.client.0.vm04.stdout:8/47: read f7 [3034177,50963] 0
2026-03-10T06:22:42.475 INFO:tasks.workunit.client.0.vm04.stdout:4/30: dwrite d2/f7 [0,4194304] 0
2026-03-10T06:22:42.476 INFO:tasks.workunit.client.0.vm04.stdout:0/28: creat d0/d5/d8/fa x:0 0 0
2026-03-10T06:22:42.487 INFO:tasks.workunit.client.0.vm04.stdout:7/36: dwrite d4/fa [0,4194304] 0
2026-03-10T06:22:42.487 INFO:tasks.workunit.client.0.vm04.stdout:2/66: mknod d1/df/d11/d14/c1a 0
2026-03-10T06:22:42.490 INFO:tasks.workunit.client.0.vm04.stdout:4/31: mkdir d2/d8 0
2026-03-10T06:22:42.492 INFO:tasks.workunit.client.0.vm04.stdout:1/48: dread d0/fa [0,4194304] 0
2026-03-10T06:22:42.492 INFO:tasks.workunit.client.0.vm04.stdout:5/32: rename d4/dc to d4/dd 0
2026-03-10T06:22:42.493 INFO:tasks.workunit.client.0.vm04.stdout:7/37: dread - d4/fb zero size
2026-03-10T06:22:42.494 INFO:tasks.workunit.client.0.vm04.stdout:6/43: creat d2/f10 x:0 0 0
2026-03-10T06:22:42.508 INFO:tasks.workunit.client.0.vm04.stdout:6/44: write f1 [5040772,45834] 0
2026-03-10T06:22:42.509 INFO:tasks.workunit.client.0.vm04.stdout:9/61: creat d2/f1c x:0 0 0
2026-03-10T06:22:42.509 INFO:tasks.workunit.client.0.vm04.stdout:5/33: mkdir d4/d6/de 0
2026-03-10T06:22:42.511 INFO:tasks.workunit.client.0.vm04.stdout:0/29: dwrite d0/d5/f7 [0,4194304] 0
2026-03-10T06:22:42.514 INFO:tasks.workunit.client.0.vm04.stdout:0/30: creat d0/d5/fb x:0 0 0
2026-03-10T06:22:42.515 INFO:tasks.workunit.client.0.vm04.stdout:1/49: read d0/f4 [837807,13199] 0
2026-03-10T06:22:42.516 INFO:tasks.workunit.client.0.vm04.stdout:6/45: write d2/d4/fa [335758,31849] 0
2026-03-10T06:22:42.518 INFO:tasks.workunit.client.0.vm04.stdout:3/38: dwrite f0 [0,4194304] 0
2026-03-10T06:22:42.518 INFO:tasks.workunit.client.0.vm04.stdout:0/31: creat d0/d5/d8/fc x:0 0 0
2026-03-10T06:22:42.520 INFO:tasks.workunit.client.0.vm04.stdout:0/32: dread - d0/d5/fb zero size
2026-03-10T06:22:42.521 INFO:tasks.workunit.client.0.vm04.stdout:3/39: chown d4/da 189726 1
2026-03-10T06:22:42.528 INFO:tasks.workunit.client.0.vm04.stdout:8/48: dwrite f9 [0,4194304] 0
2026-03-10T06:22:42.529 INFO:tasks.workunit.client.0.vm04.stdout:3/40: fdatasync d4/d6/f9 0
2026-03-10T06:22:42.536 INFO:tasks.workunit.client.0.vm04.stdout:0/33: dwrite d0/d5/d8/fc [0,4194304] 0
2026-03-10T06:22:42.541 INFO:tasks.workunit.client.0.vm04.stdout:3/41: write d4/d6/dc/fd [449105,21561] 0
2026-03-10T06:22:42.541 INFO:tasks.workunit.client.0.vm04.stdout:6/46: dwrite d2/f10 [0,4194304] 0
2026-03-10T06:22:42.541 INFO:tasks.workunit.client.0.vm04.stdout:9/62: getdents d2/d3/d18 0
2026-03-10T06:22:42.541 INFO:tasks.workunit.client.0.vm04.stdout:8/49: creat df/f12 x:0 0 0
2026-03-10T06:22:42.542 INFO:tasks.workunit.client.0.vm04.stdout:0/34: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:42.543 INFO:tasks.workunit.client.0.vm04.stdout:6/47: write d2/d8/fd [232179,123685] 0
2026-03-10T06:22:42.548 INFO:tasks.workunit.client.0.vm04.stdout:5/34: rmdir d4/d6/de 0
2026-03-10T06:22:42.555 INFO:tasks.workunit.client.0.vm04.stdout:1/50: rename d0/fa to d0/ff 0
2026-03-10T06:22:42.557 INFO:tasks.workunit.client.0.vm04.stdout:9/63: unlink d2/d3/le 0
2026-03-10T06:22:42.557 INFO:tasks.workunit.client.0.vm04.stdout:3/42: rename d4/d6/dc/de to d4/da/df 0
2026-03-10T06:22:42.558 INFO:tasks.workunit.client.0.vm04.stdout:5/35: dwrite f0 [0,4194304] 0
2026-03-10T06:22:42.558 INFO:tasks.workunit.client.0.vm04.stdout:9/64: read - d2/f1c zero size
2026-03-10T06:22:42.559 INFO:tasks.workunit.client.0.vm04.stdout:9/65: dread - d2/f17 zero size
2026-03-10T06:22:42.564 INFO:tasks.workunit.client.0.vm04.stdout:6/48: rename d2/fe to d2/d8/f11 0
2026-03-10T06:22:42.564 INFO:tasks.workunit.client.0.vm04.stdout:9/66: chown d2/d9/c19 30550 1
2026-03-10T06:22:42.564 INFO:tasks.workunit.client.0.vm04.stdout:9/67: truncate d2/f1c 280916 0
2026-03-10T06:22:42.568 INFO:tasks.workunit.client.0.vm04.stdout:3/43: rename d4/d6/f9 to d4/f10 0
2026-03-10T06:22:42.568 INFO:tasks.workunit.client.0.vm04.stdout:9/68: mkdir d2/d8/d14/d1d 0
2026-03-10T06:22:42.569 INFO:tasks.workunit.client.0.vm04.stdout:3/44: stat d4/f7 0
2026-03-10T06:22:42.578 INFO:tasks.workunit.client.0.vm04.stdout:6/49: dwrite d2/d8/f9 [0,4194304] 0
2026-03-10T06:22:42.579 INFO:tasks.workunit.client.0.vm04.stdout:6/50: chown d2/d4/l5 7951 1
2026-03-10T06:22:42.579 INFO:tasks.workunit.client.0.vm04.stdout:6/51: truncate d2/d8/f11 704603 0
2026-03-10T06:22:42.583 INFO:tasks.workunit.client.0.vm04.stdout:6/52: unlink d2/d8/fd 0
2026-03-10T06:22:42.584 INFO:tasks.workunit.client.0.vm04.stdout:6/53: write d2/d4/fa [1456913,1040] 0
2026-03-10T06:22:42.586 INFO:tasks.workunit.client.0.vm04.stdout:9/69: rename d2/d8/f1a to d2/f1e 0
2026-03-10T06:22:42.590 INFO:tasks.workunit.client.0.vm04.stdout:9/70: rename d2/d8/l15 to d2/d3/l1f 0
2026-03-10T06:22:42.590 INFO:tasks.workunit.client.0.vm04.stdout:9/71: chown d2/d9/c19 2611 1
2026-03-10T06:22:42.591 INFO:tasks.workunit.client.0.vm04.stdout:9/72: stat d2/d8/d14 0
2026-03-10T06:22:42.617 INFO:tasks.workunit.client.0.vm04.stdout:3/45: sync
2026-03-10T06:22:42.617 INFO:tasks.workunit.client.0.vm04.stdout:3/46: readlink - no filename
2026-03-10T06:22:42.618 INFO:tasks.workunit.client.0.vm04.stdout:3/47: unlink d4/d6/dc/fd 0
2026-03-10T06:22:42.619 INFO:tasks.workunit.client.0.vm04.stdout:3/48: write f0 [1859921,50941] 0
2026-03-10T06:22:42.619 INFO:tasks.workunit.client.0.vm04.stdout:3/49: chown d4/da 11888842 1
2026-03-10T06:22:42.621 INFO:tasks.workunit.client.0.vm04.stdout:3/50: mkdir d4/da/df/d11 0
2026-03-10T06:22:42.622 INFO:tasks.workunit.client.0.vm04.stdout:3/51: link f0 d4/d6/f12 0
2026-03-10T06:22:42.623 INFO:tasks.workunit.client.0.vm04.stdout:3/52: mkdir d4/da/df/d13 0
2026-03-10T06:22:42.624 INFO:tasks.workunit.client.0.vm04.stdout:3/53: truncate d4/d6/f12 4397085 0
2026-03-10T06:22:42.625 INFO:tasks.workunit.client.0.vm04.stdout:3/54: mknod d4/c14 0
2026-03-10T06:22:42.626 INFO:tasks.workunit.client.0.vm04.stdout:3/55: rename d4/d6/cb to d4/da/c15 0
2026-03-10T06:22:42.627 INFO:tasks.workunit.client.0.vm04.stdout:3/56: getdents d4/da/df/d13 0
2026-03-10T06:22:42.752 INFO:tasks.workunit.client.0.vm04.stdout:2/67: getdents d1/df/d11/d14 0
2026-03-10T06:22:42.762 INFO:tasks.workunit.client.0.vm04.stdout:5/36: getdents d4 0
2026-03-10T06:22:42.773 INFO:tasks.workunit.client.0.vm04.stdout:9/73: fsync d2/f1c 0
2026-03-10T06:22:42.776 INFO:tasks.workunit.client.0.vm04.stdout:7/38: truncate d4/fa 2746147 0
2026-03-10T06:22:42.777 INFO:tasks.workunit.client.0.vm04.stdout:9/74: creat d2/d9/f20 x:0 0 0
2026-03-10T06:22:42.777 INFO:tasks.workunit.client.0.vm04.stdout:9/75: fsync f0 0
2026-03-10T06:22:42.779 INFO:tasks.workunit.client.0.vm04.stdout:4/32: truncate d2/f7 3868109 0
2026-03-10T06:22:42.781 INFO:tasks.workunit.client.0.vm04.stdout:5/37: rmdir d4/dd 0
2026-03-10T06:22:42.781 INFO:tasks.workunit.client.0.vm04.stdout:5/38: readlink l1 0
2026-03-10T06:22:42.785 INFO:tasks.workunit.client.0.vm04.stdout:8/50: getdents df 0
2026-03-10T06:22:42.785 INFO:tasks.workunit.client.0.vm04.stdout:8/51: readlink la 0
2026-03-10T06:22:42.785 INFO:tasks.workunit.client.0.vm04.stdout:8/52: fdatasync df/f12 0
2026-03-10T06:22:42.785 INFO:tasks.workunit.client.0.vm04.stdout:8/53: truncate fd 821464 0
2026-03-10T06:22:42.786 INFO:tasks.workunit.client.0.vm04.stdout:8/54: read fd [671408,32728] 0
2026-03-10T06:22:42.786 INFO:tasks.workunit.client.0.vm04.stdout:8/55: stat la 0
2026-03-10T06:22:42.787 INFO:tasks.workunit.client.0.vm04.stdout:0/35: write d0/d5/d8/fc [4680164,7929] 0
2026-03-10T06:22:42.788 INFO:tasks.workunit.client.0.vm04.stdout:0/36: write d0/d5/d8/fa [114860,50130] 0
2026-03-10T06:22:42.800 INFO:tasks.workunit.client.0.vm04.stdout:1/51: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:42.801 INFO:tasks.workunit.client.0.vm04.stdout:0/37: mkdir d0/d5/d8/dd 0
2026-03-10T06:22:42.804 INFO:tasks.workunit.client.0.vm04.stdout:0/38: dread d0/d5/f7 [0,4194304] 0
2026-03-10T06:22:42.804 INFO:tasks.workunit.client.0.vm04.stdout:0/39: fdatasync d0/d5/fb 0
2026-03-10T06:22:42.817 INFO:tasks.workunit.client.0.vm04.stdout:1/52: symlink d0/d8/l10 0
2026-03-10T06:22:42.819 INFO:tasks.workunit.client.0.vm04.stdout:6/54: truncate d2/f7 10349512 0
2026-03-10T06:22:42.819 INFO:tasks.workunit.client.0.vm04.stdout:6/55: readlink d2/d4/lc 0
2026-03-10T06:22:42.821 INFO:tasks.workunit.client.0.vm04.stdout:0/40: mknod d0/d5/d8/ce 0
2026-03-10T06:22:42.824 INFO:tasks.workunit.client.0.vm04.stdout:5/39: rename f2 to d4/ff 0
2026-03-10T06:22:42.827 INFO:tasks.workunit.client.0.vm04.stdout:5/40: dwrite f0 [0,4194304] 0
2026-03-10T06:22:42.831 INFO:tasks.workunit.client.0.vm04.stdout:1/53: creat d0/d8/f11 x:0 0 0
2026-03-10T06:22:42.842 INFO:tasks.workunit.client.0.vm04.stdout:3/57: getdents d4/da 0
2026-03-10T06:22:42.845 INFO:tasks.workunit.client.0.vm04.stdout:3/58: dread d4/f10 [0,4194304] 0
2026-03-10T06:22:42.845 INFO:tasks.workunit.client.0.vm04.stdout:0/41: dread d0/d5/f7 [0,4194304] 0
2026-03-10T06:22:42.852 INFO:tasks.workunit.client.0.vm04.stdout:1/54: mknod d0/d8/c12 0
2026-03-10T06:22:42.852 INFO:tasks.workunit.client.0.vm04.stdout:1/55: chown d0/cc 0 1
2026-03-10T06:22:42.853 INFO:tasks.workunit.client.0.vm04.stdout:1/56: stat d0/cb 0
2026-03-10T06:22:42.855 INFO:tasks.workunit.client.0.vm04.stdout:6/56: creat d2/f12 x:0 0 0
2026-03-10T06:22:42.858 INFO:tasks.workunit.client.0.vm04.stdout:3/59: creat d4/da/df/d13/f16 x:0 0 0
2026-03-10T06:22:42.858 INFO:tasks.workunit.client.0.vm04.stdout:6/57: dwrite d2/d8/f9 [0,4194304] 0
2026-03-10T06:22:42.873 INFO:tasks.workunit.client.0.vm04.stdout:3/60: write d4/f7 [63049,17985] 0
2026-03-10T06:22:42.874 INFO:tasks.workunit.client.0.vm04.stdout:2/68: write d1/db/f12 [375727,27759] 0
2026-03-10T06:22:42.875 INFO:tasks.workunit.client.0.vm04.stdout:6/58: creat d2/d8/f13 x:0 0 0
2026-03-10T06:22:42.875 INFO:tasks.workunit.client.0.vm04.stdout:6/59: dread - d2/d8/f13 zero size
2026-03-10T06:22:42.876 INFO:tasks.workunit.client.0.vm04.stdout:6/60: write d2/f12 [703456,1259] 0
2026-03-10T06:22:42.880 INFO:tasks.workunit.client.0.vm04.stdout:1/57: symlink d0/d3/l13 0
2026-03-10T06:22:42.882 INFO:tasks.workunit.client.0.vm04.stdout:0/42: getdents d0 0
2026-03-10T06:22:42.882 INFO:tasks.workunit.client.0.vm04.stdout:5/41: getdents d4/d6 0
2026-03-10T06:22:42.883 INFO:tasks.workunit.client.0.vm04.stdout:3/61: mknod d4/d6/dc/c17 0
2026-03-10T06:22:42.883 INFO:tasks.workunit.client.0.vm04.stdout:1/58: chown d0/f5 40466 1
2026-03-10T06:22:42.883 INFO:tasks.workunit.client.0.vm04.stdout:0/43: write d0/d5/d8/fa [710109,94091] 0
2026-03-10T06:22:42.884 INFO:tasks.workunit.client.0.vm04.stdout:3/62: write d4/da/df/d13/f16 [1016135,75616] 0
2026-03-10T06:22:42.884 INFO:tasks.workunit.client.0.vm04.stdout:1/59: chown d0/ff 158110 1
2026-03-10T06:22:42.884 INFO:tasks.workunit.client.0.vm04.stdout:0/44: chown d0/d5/d8/fc 150441602 1
2026-03-10T06:22:42.884 INFO:tasks.workunit.client.0.vm04.stdout:3/63: truncate f1 4762902 0
2026-03-10T06:22:42.891 INFO:tasks.workunit.client.0.vm04.stdout:2/69: creat d1/df/d11/f1b x:0 0 0
2026-03-10T06:22:42.893 INFO:tasks.workunit.client.0.vm04.stdout:6/61: unlink d2/d8/f13 0
2026-03-10T06:22:42.894 INFO:tasks.workunit.client.0.vm04.stdout:6/62: stat d2/d8 0
2026-03-10T06:22:42.896 INFO:tasks.workunit.client.0.vm04.stdout:1/60: creat d0/d8/f14 x:0 0 0
2026-03-10T06:22:42.896 INFO:tasks.workunit.client.0.vm04.stdout:3/64: creat d4/f18 x:0 0 0
2026-03-10T06:22:42.896 INFO:tasks.workunit.client.0.vm04.stdout:9/76: fsync d2/d9/f20 0
2026-03-10T06:22:42.898 INFO:tasks.workunit.client.0.vm04.stdout:1/61: stat d0/cc 0
2026-03-10T06:22:42.898 INFO:tasks.workunit.client.0.vm04.stdout:7/39: dread d4/fa [0,4194304] 0
2026-03-10T06:22:42.898 INFO:tasks.workunit.client.0.vm04.stdout:6/63: read d2/d4/fa [1021748,130947] 0
2026-03-10T06:22:42.899 INFO:tasks.workunit.client.0.vm04.stdout:3/65: fdatasync d4/da/df/d13/f16 0
2026-03-10T06:22:42.899 INFO:tasks.workunit.client.0.vm04.stdout:3/66: chown d4/da/df 1 1
2026-03-10T06:22:42.902 INFO:tasks.workunit.client.0.vm04.stdout:9/77: dread f0 [0,4194304] 0
2026-03-10T06:22:42.906 INFO:tasks.workunit.client.0.vm04.stdout:3/67: read d4/f7 [19262,61757] 0
2026-03-10T06:22:42.906 INFO:tasks.workunit.client.0.vm04.stdout:5/42: unlink d4/lb 0
2026-03-10T06:22:42.906 INFO:tasks.workunit.client.0.vm04.stdout:0/45: write d0/d5/f7 [4819816,76620] 0
2026-03-10T06:22:42.908 INFO:tasks.workunit.client.0.vm04.stdout:9/78: write d2/d3/f7 [453405,6104] 0
2026-03-10T06:22:42.909 INFO:tasks.workunit.client.0.vm04.stdout:5/43: dread - d4/d6/f7 zero size
2026-03-10T06:22:42.910 INFO:tasks.workunit.client.0.vm04.stdout:7/40: dread d4/fa [0,4194304] 0
2026-03-10T06:22:42.910 INFO:tasks.workunit.client.0.vm04.stdout:7/41: stat d4/fb 0
2026-03-10T06:22:42.914 INFO:tasks.workunit.client.0.vm04.stdout:2/70: getdents d1/df/d11/d14 0
2026-03-10T06:22:42.915 INFO:tasks.workunit.client.0.vm04.stdout:3/68: symlink d4/d6/dc/l19 0
2026-03-10T06:22:42.915 INFO:tasks.workunit.client.0.vm04.stdout:6/64: dread d2/f10 [0,4194304] 0
2026-03-10T06:22:42.915 INFO:tasks.workunit.client.0.vm04.stdout:3/69: chown d4/f10 22 1
2026-03-10T06:22:42.919 INFO:tasks.workunit.client.0.vm04.stdout:0/46: fdatasync d0/f4 0
2026-03-10T06:22:42.921 INFO:tasks.workunit.client.0.vm04.stdout:5/44: creat d4/d6/f10 x:0 0 0
2026-03-10T06:22:42.921 INFO:tasks.workunit.client.0.vm04.stdout:4/33: write d2/f4 [1940069,16458] 0
2026-03-10T06:22:42.922 INFO:tasks.workunit.client.0.vm04.stdout:9/79: dwrite d2/d9/f20 [0,4194304] 0
2026-03-10T06:22:42.927 INFO:tasks.workunit.client.0.vm04.stdout:1/62: dwrite d0/f5 [0,4194304] 0
2026-03-10T06:22:42.931 INFO:tasks.workunit.client.0.vm04.stdout:4/34: rename l0 to d2/l9 0
2026-03-10T06:22:42.933 INFO:tasks.workunit.client.0.vm04.stdout:8/56: dwrite f7 [0,4194304] 0
2026-03-10T06:22:42.937 INFO:tasks.workunit.client.0.vm04.stdout:0/47: symlink d0/lf 0
2026-03-10T06:22:42.948 INFO:tasks.workunit.client.0.vm04.stdout:3/70: dwrite d4/f10 [0,4194304] 0
2026-03-10T06:22:42.959 INFO:tasks.workunit.client.0.vm04.stdout:1/63: dwrite d0/f4 [0,4194304] 0
2026-03-10T06:22:42.962 INFO:tasks.workunit.client.0.vm04.stdout:1/64: readlink d0/d8/l10 0
2026-03-10T06:22:42.976 INFO:tasks.workunit.client.0.vm04.stdout:9/80: mknod d2/c21 0
2026-03-10T06:22:42.987 INFO:tasks.workunit.client.0.vm04.stdout:2/71: write d1/df/d11/f16 [3169959,105072] 0
2026-03-10T06:22:42.987 INFO:tasks.workunit.client.0.vm04.stdout:8/57: fsync df/f10 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:4/35: write d2/f7 [3092636,41483] 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:8/58: chown df/f12 2585 1
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:2/72: dread - d1/f13 zero size
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:5/45: mkdir d4/d11 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:3/71: mknod d4/d6/dc/c1a 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:1/65: mknod d0/d3/c15 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:6/65: rename d2/f12 to d2/f14 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:5/46: write f3 [542600,110249] 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:7/42: getdents d4 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:6/66: chown d2/d4/lc 52981708 1
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:9/81: chown d2/d8/c13 1531 1
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:8/59: mknod df/c13 0
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:9/82: chown d2/d8/c13 438 1
2026-03-10T06:22:42.988 INFO:tasks.workunit.client.0.vm04.stdout:2/73: mknod d1/db/c1c 0
2026-03-10T06:22:42.991 INFO:tasks.workunit.client.0.vm04.stdout:1/66: creat d0/d3/f16 x:0 0 0
2026-03-10T06:22:42.996 INFO:tasks.workunit.client.0.vm04.stdout:7/43: stat d4/cc 0
2026-03-10T06:22:42.997 INFO:tasks.workunit.client.0.vm04.stdout:9/83: mkdir d2/d8/d22 0
2026-03-10T06:22:43.000 INFO:tasks.workunit.client.0.vm04.stdout:1/67: dwrite d0/d3/f16 [0,4194304] 0
2026-03-10T06:22:43.005 INFO:tasks.workunit.client.0.vm04.stdout:8/60: mknod df/c14 0
2026-03-10T06:22:43.009 INFO:tasks.workunit.client.0.vm04.stdout:9/84: dwrite f0 [0,4194304] 0
2026-03-10T06:22:43.011 INFO:tasks.workunit.client.0.vm04.stdout:7/44: mknod d4/cd 0
2026-03-10T06:22:43.016 INFO:tasks.workunit.client.0.vm04.stdout:5/47: getdents d4/d11 0
2026-03-10T06:22:43.017 INFO:tasks.workunit.client.0.vm04.stdout:1/68: dread d0/ff [0,4194304] 0
2026-03-10T06:22:43.017 INFO:tasks.workunit.client.0.vm04.stdout:1/69: write d0/d8/f11 [458238,12165] 0
2026-03-10T06:22:43.026 INFO:tasks.workunit.client.0.vm04.stdout:8/61: mkdir df/d15 0
2026-03-10T06:22:43.026 INFO:tasks.workunit.client.0.vm04.stdout:5/48: dread d4/ff [0,4194304] 0
2026-03-10T06:22:43.026 INFO:tasks.workunit.client.0.vm04.stdout:9/85: fdatasync d2/f1e 0
2026-03-10T06:22:43.030 INFO:tasks.workunit.client.0.vm04.stdout:1/70: rename d0/cc to d0/d3/c17 0
2026-03-10T06:22:43.031 INFO:tasks.workunit.client.0.vm04.stdout:1/71: stat d0/d3/l13 0
2026-03-10T06:22:43.033 INFO:tasks.workunit.client.0.vm04.stdout:9/86: dread d2/d3/f4 [0,4194304] 0
2026-03-10T06:22:43.033 INFO:tasks.workunit.client.0.vm04.stdout:1/72: write d0/d3/f16 [2598098,43837] 0
2026-03-10T06:22:43.034 INFO:tasks.workunit.client.0.vm04.stdout:9/87: fdatasync d2/f17 0
2026-03-10T06:22:43.035 INFO:tasks.workunit.client.0.vm04.stdout:8/62: mknod df/c16 0
2026-03-10T06:22:43.045 INFO:tasks.workunit.client.0.vm04.stdout:9/88: readlink d2/d3/l1f 0
2026-03-10T06:22:43.045 INFO:tasks.workunit.client.0.vm04.stdout:8/63: creat df/f17 x:0 0 0
2026-03-10T06:22:43.046 INFO:tasks.workunit.client.0.vm04.stdout:0/48: sync
2026-03-10T06:22:43.046 INFO:tasks.workunit.client.0.vm04.stdout:3/72: sync
2026-03-10T06:22:43.046 INFO:tasks.workunit.client.0.vm04.stdout:7/45: sync
2026-03-10T06:22:43.047 INFO:tasks.workunit.client.0.vm04.stdout:7/46: rename d4 to d4/de 22
2026-03-10T06:22:43.056 INFO:tasks.workunit.client.0.vm04.stdout:3/73: dread f1 [0,4194304] 0
2026-03-10T06:22:43.058 INFO:tasks.workunit.client.0.vm04.stdout:3/74: write d4/f10 [4981262,57118] 0
2026-03-10T06:22:43.060 INFO:tasks.workunit.client.0.vm04.stdout:0/49: mknod d0/d5/d8/c10 0
2026-03-10T06:22:43.060 INFO:tasks.workunit.client.0.vm04.stdout:3/75: chown d4/d6/f12 81786 1
2026-03-10T06:22:43.069 INFO:tasks.workunit.client.0.vm04.stdout:7/47: mkdir d4/df 0
2026-03-10T06:22:43.070 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:43 vm06.local ceph-mon[58974]: pgmap v20: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 36 MiB/s rd, 87 MiB/s wr, 246 op/s
2026-03-10T06:22:43.070 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:43 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:43.070 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:43 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:43.070 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:43 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:43.083 INFO:tasks.workunit.client.0.vm04.stdout:5/49: link d4/d6/fa d4/f12 0
2026-03-10T06:22:43.091 INFO:tasks.workunit.client.0.vm04.stdout:9/89: mkdir d2/d23 0
2026-03-10T06:22:43.092 INFO:tasks.workunit.client.0.vm04.stdout:9/90: write d2/d3/f7 [573532,93260] 0
2026-03-10T06:22:43.092 INFO:tasks.workunit.client.0.vm04.stdout:8/64: creat df/d15/f18 x:0 0 0
2026-03-10T06:22:43.093 INFO:tasks.workunit.client.0.vm04.stdout:8/65: write fe [519381,67391] 0
2026-03-10T06:22:43.094 INFO:tasks.workunit.client.0.vm04.stdout:8/66: chown df/c13 1 1
2026-03-10T06:22:43.095 INFO:tasks.workunit.client.0.vm04.stdout:8/67: fdatasync df/d15/f18 0
2026-03-10T06:22:43.098 INFO:tasks.workunit.client.0.vm04.stdout:4/36: fdatasync d2/f7 0
2026-03-10T06:22:43.099 INFO:tasks.workunit.client.0.vm04.stdout:3/76: fsync f0 0
2026-03-10T06:22:43.107 INFO:tasks.workunit.client.0.vm04.stdout:2/74: truncate d1/f7 1323770 0
2026-03-10T06:22:43.107 INFO:tasks.workunit.client.0.vm04.stdout:2/75: readlink d1/la 0
2026-03-10T06:22:43.111 INFO:tasks.workunit.client.0.vm04.stdout:7/48: creat d4/f10 x:0 0 0
2026-03-10T06:22:43.111 INFO:tasks.workunit.client.0.vm04.stdout:7/49: stat d4/f10 0
2026-03-10T06:22:43.113 INFO:tasks.workunit.client.0.vm04.stdout:1/73: link d0/d8/c12 d0/c18 0
2026-03-10T06:22:43.113 INFO:tasks.workunit.client.0.vm04.stdout:2/76: dwrite d1/df/d11/d18/f19 [0,4194304] 0
2026-03-10T06:22:43.118 INFO:tasks.workunit.client.0.vm04.stdout:3/77: creat d4/da/df/f1b x:0 0 0
2026-03-10T06:22:43.123 INFO:tasks.workunit.client.0.vm04.stdout:7/50: dread d4/f5 [0,4194304] 0
2026-03-10T06:22:43.126 INFO:tasks.workunit.client.0.vm04.stdout:0/50: mknod d0/d5/d8/dd/c11 0
2026-03-10T06:22:43.126 INFO:tasks.workunit.client.0.vm04.stdout:3/78: dread d4/f10 [4194304,4194304] 0
2026-03-10T06:22:43.126 INFO:tasks.workunit.client.0.vm04.stdout:1/74: dwrite d0/d3/f16 [0,4194304] 0
2026-03-10T06:22:43.127 INFO:tasks.workunit.client.0.vm04.stdout:0/51: fdatasync d0/d5/fb 0
2026-03-10T06:22:43.129 INFO:tasks.workunit.client.0.vm04.stdout:3/79: write d4/f10 [3993602,43411] 0
2026-03-10T06:22:43.131 INFO:tasks.workunit.client.0.vm04.stdout:0/52: chown d0/d5 0 1
2026-03-10T06:22:43.131 INFO:tasks.workunit.client.0.vm04.stdout:3/80: stat d4 0
2026-03-10T06:22:43.131 INFO:tasks.workunit.client.0.vm04.stdout:0/53: readlink d0/l1 0
2026-03-10T06:22:43.133 INFO:tasks.workunit.client.0.vm04.stdout:0/54: write d0/d5/d8/fc [5483099,121786] 0
2026-03-10T06:22:43.134 INFO:tasks.workunit.client.0.vm04.stdout:0/55: stat d0/d5 0
2026-03-10T06:22:43.135 INFO:tasks.workunit.client.0.vm04.stdout:1/75: dread d0/f5 [0,4194304] 0
2026-03-10T06:22:43.149 INFO:tasks.workunit.client.0.vm04.stdout:3/81: dread d4/da/df/d13/f16 [0,4194304] 0
2026-03-10T06:22:43.150 INFO:tasks.workunit.client.0.vm04.stdout:3/82: read - d4/da/df/f1b zero size
2026-03-10T06:22:43.154 INFO:tasks.workunit.client.0.vm04.stdout:9/91: mkdir d2/d23/d24 0
2026-03-10T06:22:43.158 INFO:tasks.workunit.client.0.vm04.stdout:4/37: creat d2/d8/fa x:0 0 0
2026-03-10T06:22:43.161 INFO:tasks.workunit.client.0.vm04.stdout:7/51: mknod d4/c11 0
2026-03-10T06:22:43.172 INFO:tasks.workunit.client.0.vm04.stdout:6/67: write f1 [8619106,5807] 0
2026-03-10T06:22:43.174 INFO:tasks.workunit.client.0.vm04.stdout:8/68: truncate f9 6862552 0
2026-03-10T06:22:43.178 INFO:tasks.workunit.client.0.vm04.stdout:1/76: creat d0/d3/f19 x:0 0 0
2026-03-10T06:22:43.178 INFO:tasks.workunit.client.0.vm04.stdout:5/50: rename f3 to d4/f13 0
2026-03-10T06:22:43.182 INFO:tasks.workunit.client.0.vm04.stdout:2/77: link d1/df/d11/d18/f19 d1/df/d11/d14/f1d 0
2026-03-10T06:22:43.182 INFO:tasks.workunit.client.0.vm04.stdout:3/83: creat d4/da/df/d13/f1c x:0 0 0
2026-03-10T06:22:43.182 INFO:tasks.workunit.client.0.vm04.stdout:9/92: creat d2/d9/d11/f25 x:0 0 0
2026-03-10T06:22:43.184 INFO:tasks.workunit.client.0.vm04.stdout:4/38: rmdir d2/d8 39
2026-03-10T06:22:43.192 INFO:tasks.workunit.client.0.vm04.stdout:1/77: dread d0/d3/f16 [0,4194304] 0
2026-03-10T06:22:43.192 INFO:tasks.workunit.client.0.vm04.stdout:6/68: rmdir d2/d8 39
2026-03-10T06:22:43.194 INFO:tasks.workunit.client.0.vm04.stdout:8/69: creat df/d15/f19 x:0 0 0
2026-03-10T06:22:43.209 INFO:tasks.workunit.client.0.vm04.stdout:1/78: dread d0/d3/f16 [0,4194304] 0
2026-03-10T06:22:43.271 INFO:tasks.workunit.client.0.vm04.stdout:2/78: write d1/df/d11/d14/f1d [3176389,64834] 0
2026-03-10T06:22:43.271 INFO:tasks.workunit.client.0.vm04.stdout:3/84: dread d4/f7 [0,4194304] 0
2026-03-10T06:22:43.272 INFO:tasks.workunit.client.0.vm04.stdout:4/39: write d2/d8/fa [1000063,61013] 0
2026-03-10T06:22:43.287 INFO:tasks.workunit.client.0.vm04.stdout:7/52: mkdir d4/df/d12 0
2026-03-10T06:22:43.288 INFO:tasks.workunit.client.0.vm04.stdout:6/69: creat d2/d4/f15 x:0 0 0
2026-03-10T06:22:43.291 INFO:tasks.workunit.client.0.vm04.stdout:4/40: dwrite d2/f7 [0,4194304] 0
2026-03-10T06:22:43.293 INFO:tasks.workunit.client.0.vm04.stdout:5/51: symlink d4/d11/l14 0
2026-03-10T06:22:43.298 INFO:tasks.workunit.client.0.vm04.stdout:9/93: creat d2/d23/d24/f26 x:0 0 0
2026-03-10T06:22:43.303 INFO:tasks.workunit.client.0.vm04.stdout:3/85: dwrite d4/da/df/d13/f16 [0,4194304] 0
2026-03-10T06:22:43.303 INFO:tasks.workunit.client.0.vm04.stdout:0/56: getdents d0 0
2026-03-10T06:22:43.303 INFO:tasks.workunit.client.0.vm04.stdout:6/70: write d2/d8/f9 [4236041,61383] 0
2026-03-10T06:22:43.306 INFO:tasks.workunit.client.0.vm04.stdout:2/79: link d1/df/d11/d14/f1d d1/db/f1e 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:1/79: creat d0/f1a x:0 0 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:0/57: symlink d0/l12 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:5/52: dread d4/ff [0,4194304] 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:5/53: fsync d4/d6/f8 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:9/94: rename d2/fb to d2/d8/d14/f27 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:9/95: fdatasync d2/d23/d24/f26 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:9/96: readlink d2/d3/l1f 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:9/97: truncate d2/f1c 1272737 0
2026-03-10T06:22:43.316 INFO:tasks.workunit.client.0.vm04.stdout:3/86: write d4/f10 [4884975,83156] 0
2026-03-10T06:22:43.320 INFO:tasks.workunit.client.0.vm04.stdout:7/53: mkdir d4/df/d12/d13 0
2026-03-10T06:22:43.322 INFO:tasks.workunit.client.0.vm04.stdout:9/98: dwrite d2/d9/f20 [0,4194304] 0
2026-03-10T06:22:43.324 INFO:tasks.workunit.client.0.vm04.stdout:6/71: write d2/d8/f11 [12301,40447] 0
2026-03-10T06:22:43.326 INFO:tasks.workunit.client.0.vm04.stdout:9/99: write d2/d23/d24/f26 [988896,57724] 0
2026-03-10T06:22:43.336 INFO:tasks.workunit.client.0.vm04.stdout:1/80: write d0/f5 [4677594,101844] 0
2026-03-10T06:22:43.339 INFO:tasks.workunit.client.0.vm04.stdout:0/58: creat d0/d5/d8/dd/f13 x:0 0 0
2026-03-10T06:22:43.339 INFO:tasks.workunit.client.0.vm04.stdout:0/59: dread - d0/d5/fb zero size
2026-03-10T06:22:43.354 INFO:tasks.workunit.client.0.vm04.stdout:6/72: dwrite d2/f14 [0,4194304] 0
2026-03-10T06:22:43.356 INFO:tasks.workunit.client.0.vm04.stdout:9/100: creat d2/d8/d14/f28 x:0 0 0
2026-03-10T06:22:43.362 INFO:tasks.workunit.client.0.vm04.stdout:1/81: mknod d0/d8/c1b 0
2026-03-10T06:22:43.369 INFO:tasks.workunit.client.0.vm04.stdout:0/60: creat d0/f14 x:0 0 0
2026-03-10T06:22:43.369 INFO:tasks.workunit.client.0.vm04.stdout:5/54: unlink d4/c9 0
2026-03-10T06:22:43.372 INFO:tasks.workunit.client.0.vm04.stdout:7/54: link d4/f6 d4/df/d12/f14 0
2026-03-10T06:22:43.393 INFO:tasks.workunit.client.0.vm04.stdout:7/55: readlink - no filename
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:7/56: dread - d4/fb zero size
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:5/55: stat d4/f5 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:7/57: dread - d4/f10 zero size
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:7/58: write d4/f10 [826055,97417] 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:2/80: getdents d1/df 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:2/81: read - d1/f5 zero size
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:2/82: write d1/df/d11/f15 [606298,21286] 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:0/61: link d0/d5/d8/dd/c11 d0/d5/d8/c15 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:6/73: creat d2/f16 x:0 0 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:0/62: chown d0/d5/l9 7507 1
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:5/56: symlink d4/l15 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:1/82: getdents d0/d3 0
2026-03-10T06:22:43.394 INFO:tasks.workunit.client.0.vm04.stdout:2/83: rename d1/c6 to d1/df/d11/d14/c1f 0
2026-03-10T06:22:43.395 INFO:tasks.workunit.client.0.vm04.stdout:5/57: mknod d4/d11/c16 0
2026-03-10T06:22:43.397 INFO:tasks.workunit.client.0.vm04.stdout:6/74: dread d2/f14 [0,4194304] 0
2026-03-10T06:22:43.398 INFO:tasks.workunit.client.0.vm04.stdout:2/84: mkdir d1/db/d20 0
2026-03-10T06:22:43.400 INFO:tasks.workunit.client.0.vm04.stdout:6/75: rmdir d2/d8 39
2026-03-10T06:22:43.401 INFO:tasks.workunit.client.0.vm04.stdout:5/58: link d4/f13 d4/d11/f17 0
2026-03-10T06:22:43.404 INFO:tasks.workunit.client.0.vm04.stdout:1/83: rename d0/d8/c1b to d0/c1c 0
2026-03-10T06:22:43.405 INFO:tasks.workunit.client.0.vm04.stdout:1/84: truncate d0/d8/f14 335766 0
2026-03-10T06:22:43.407 INFO:tasks.workunit.client.0.vm04.stdout:2/85: mknod d1/df/d11/c21 0
2026-03-10T06:22:43.407 INFO:tasks.workunit.client.0.vm04.stdout:2/86: chown d1/f5 14370551 1
2026-03-10T06:22:43.413 INFO:tasks.workunit.client.0.vm04.stdout:6/76: dread d2/f14 [0,4194304] 0
2026-03-10T06:22:43.417 INFO:tasks.workunit.client.0.vm04.stdout:0/63: rename d0/d5/d8/fc to d0/f16 0
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:1/85: symlink d0/d8/l1d 0
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:1/86: rename d0/d8/l10 to d0/d3/l1e 0
2026-03-10T06:22:43.439 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:43 vm04.local ceph-mon[51058]: pgmap v20: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 36 MiB/s rd, 87 MiB/s wr, 246 op/s
2026-03-10T06:22:43.439 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:43 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:43.439 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:43 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:43.439 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:43 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:0/64: link d0/f14 d0/f17 0
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:1/87: dread d0/d8/f11 [0,4194304] 0
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:1/88: fsync d0/d3/f16 0
2026-03-10T06:22:43.439 INFO:tasks.workunit.client.0.vm04.stdout:1/89: fdatasync d0/d8/f11 0
2026-03-10T06:22:43.445 INFO:tasks.workunit.client.0.vm04.stdout:0/65: dwrite d0/f16 [0,4194304] 0
2026-03-10T06:22:43.458 INFO:tasks.workunit.client.0.vm04.stdout:5/59: dread d4/d11/f17 [0,4194304] 0
2026-03-10T06:22:43.463 INFO:tasks.workunit.client.0.vm04.stdout:5/60: rename d4/f12 to d4/d11/f18 0
2026-03-10T06:22:43.467 INFO:tasks.workunit.client.0.vm04.stdout:5/61: creat d4/f19 x:0 0 0
2026-03-10T06:22:43.467 INFO:tasks.workunit.client.0.vm04.stdout:5/62: write d4/d6/f8 [2661037,129137] 0
2026-03-10T06:22:43.474 INFO:tasks.workunit.client.0.vm04.stdout:5/63: dwrite d4/ff [0,4194304] 0
2026-03-10T06:22:43.487 INFO:tasks.workunit.client.0.vm04.stdout:5/64: mknod d4/d11/c1a 0
2026-03-10T06:22:43.488 INFO:tasks.workunit.client.0.vm04.stdout:5/65: mknod d4/d11/c1b 0
2026-03-10T06:22:43.489 INFO:tasks.workunit.client.0.vm04.stdout:5/66: write f0 [2967944,6661] 0
2026-03-10T06:22:43.492 INFO:tasks.workunit.client.0.vm04.stdout:5/67: unlink d4/d11/c1a 0
2026-03-10T06:22:43.509 INFO:tasks.workunit.client.0.vm04.stdout:2/87: sync
2026-03-10T06:22:43.510 INFO:tasks.workunit.client.0.vm04.stdout:2/88: chown d1/df/d11/d14/c1a 1671 1
2026-03-10T06:22:43.515 INFO:tasks.workunit.client.0.vm04.stdout:2/89: link d1/f5 d1/df/f22 0
2026-03-10T06:22:43.516 INFO:tasks.workunit.client.0.vm04.stdout:2/90: readlink d1/db/lc 0
2026-03-10T06:22:43.516 INFO:tasks.workunit.client.0.vm04.stdout:2/91: readlink
l0 0 2026-03-10T06:22:43.520 INFO:tasks.workunit.client.0.vm04.stdout:2/92: dread d1/db/f1e [0,4194304] 0 2026-03-10T06:22:43.525 INFO:tasks.workunit.client.0.vm04.stdout:2/93: dwrite d1/df/d11/f15 [0,4194304] 0 2026-03-10T06:22:43.529 INFO:tasks.workunit.client.0.vm04.stdout:2/94: fdatasync d1/df/d11/d18/f19 0 2026-03-10T06:22:43.530 INFO:tasks.workunit.client.0.vm04.stdout:2/95: readlink l0 0 2026-03-10T06:22:43.531 INFO:tasks.workunit.client.0.vm04.stdout:2/96: read d1/db/f1e [1297766,108860] 0 2026-03-10T06:22:43.535 INFO:tasks.workunit.client.0.vm04.stdout:2/97: dwrite d1/df/d11/f16 [0,4194304] 0 2026-03-10T06:22:43.541 INFO:tasks.workunit.client.0.vm04.stdout:5/68: fdatasync d4/d6/f8 0 2026-03-10T06:22:43.543 INFO:tasks.workunit.client.0.vm04.stdout:2/98: stat d1/f7 0 2026-03-10T06:22:43.547 INFO:tasks.workunit.client.0.vm04.stdout:2/99: write d1/f8 [826048,120968] 0 2026-03-10T06:22:43.547 INFO:tasks.workunit.client.0.vm04.stdout:4/41: fsync d2/f7 0 2026-03-10T06:22:43.548 INFO:tasks.workunit.client.0.vm04.stdout:6/77: fsync d2/d8/f11 0 2026-03-10T06:22:43.556 INFO:tasks.workunit.client.0.vm04.stdout:4/42: creat d2/d8/fb x:0 0 0 2026-03-10T06:22:43.561 INFO:tasks.workunit.client.0.vm04.stdout:4/43: dwrite d2/d8/fa [0,4194304] 0 2026-03-10T06:22:43.569 INFO:tasks.workunit.client.0.vm04.stdout:4/44: read d2/d8/fa [896249,17661] 0 2026-03-10T06:22:43.571 INFO:tasks.workunit.client.0.vm04.stdout:2/100: mknod d1/db/c23 0 2026-03-10T06:22:43.572 INFO:tasks.workunit.client.0.vm04.stdout:4/45: dread d2/f7 [0,4194304] 0 2026-03-10T06:22:43.572 INFO:tasks.workunit.client.0.vm04.stdout:2/101: readlink d1/db/ld 0 2026-03-10T06:22:43.580 INFO:tasks.workunit.client.0.vm04.stdout:2/102: unlink d1/la 0 2026-03-10T06:22:43.583 INFO:tasks.workunit.client.0.vm04.stdout:4/46: mkdir d2/d8/dc 0 2026-03-10T06:22:43.587 INFO:tasks.workunit.client.0.vm04.stdout:8/70: write fd [1642150,72938] 0 2026-03-10T06:22:43.595 INFO:tasks.workunit.client.0.vm04.stdout:0/66: getdents 
d0/d5/d8/dd 0 2026-03-10T06:22:43.595 INFO:tasks.workunit.client.0.vm04.stdout:0/67: stat d0/f17 0 2026-03-10T06:22:43.600 INFO:tasks.workunit.client.0.vm04.stdout:2/103: creat d1/df/f24 x:0 0 0 2026-03-10T06:22:43.601 INFO:tasks.workunit.client.0.vm04.stdout:4/47: readlink d2/l9 0 2026-03-10T06:22:43.602 INFO:tasks.workunit.client.0.vm04.stdout:4/48: dread - d2/d8/fb zero size 2026-03-10T06:22:43.605 INFO:tasks.workunit.client.0.vm04.stdout:8/71: mknod df/c1a 0 2026-03-10T06:22:43.605 INFO:tasks.workunit.client.0.vm04.stdout:8/72: fdatasync fe 0 2026-03-10T06:22:43.605 INFO:tasks.workunit.client.0.vm04.stdout:8/73: write df/f12 [471817,97486] 0 2026-03-10T06:22:43.617 INFO:tasks.workunit.client.0.vm04.stdout:0/68: symlink d0/d5/l18 0 2026-03-10T06:22:43.617 INFO:tasks.workunit.client.0.vm04.stdout:0/69: readlink d0/l1 0 2026-03-10T06:22:43.619 INFO:tasks.workunit.client.0.vm04.stdout:4/49: mknod d2/d8/cd 0 2026-03-10T06:22:43.619 INFO:tasks.workunit.client.0.vm04.stdout:4/50: chown d2/d8/fa 171977408 1 2026-03-10T06:22:43.621 INFO:tasks.workunit.client.0.vm04.stdout:8/74: creat df/d15/f1b x:0 0 0 2026-03-10T06:22:43.623 INFO:tasks.workunit.client.0.vm04.stdout:4/51: dread d2/d8/fa [0,4194304] 0 2026-03-10T06:22:43.623 INFO:tasks.workunit.client.0.vm04.stdout:0/70: symlink d0/l19 0 2026-03-10T06:22:43.624 INFO:tasks.workunit.client.0.vm04.stdout:4/52: write d2/f7 [1897846,80699] 0 2026-03-10T06:22:43.625 INFO:tasks.workunit.client.0.vm04.stdout:4/53: dread - d2/d8/fb zero size 2026-03-10T06:22:43.625 INFO:tasks.workunit.client.0.vm04.stdout:8/75: unlink df/d15/f18 0 2026-03-10T06:22:43.626 INFO:tasks.workunit.client.0.vm04.stdout:8/76: chown f6 3256498 1 2026-03-10T06:22:43.636 INFO:tasks.workunit.client.0.vm04.stdout:0/71: mkdir d0/d1a 0 2026-03-10T06:22:43.637 INFO:tasks.workunit.client.0.vm04.stdout:3/87: write d4/f7 [493064,103198] 0 2026-03-10T06:22:43.640 INFO:tasks.workunit.client.0.vm04.stdout:7/59: getdents d4/df/d12 0 2026-03-10T06:22:43.641 
INFO:tasks.workunit.client.0.vm04.stdout:7/60: write d4/f10 [1372245,72496] 0 2026-03-10T06:22:43.645 INFO:tasks.workunit.client.0.vm04.stdout:6/78: dwrite d2/f14 [4194304,4194304] 0 2026-03-10T06:22:43.648 INFO:tasks.workunit.client.0.vm04.stdout:9/101: truncate d2/d8/d14/f27 327674 0 2026-03-10T06:22:43.649 INFO:tasks.workunit.client.0.vm04.stdout:2/104: chown d1/df/d11/d14/c1f 492045222 1 2026-03-10T06:22:43.657 INFO:tasks.workunit.client.0.vm04.stdout:1/90: truncate d0/f4 2925058 0 2026-03-10T06:22:43.657 INFO:tasks.workunit.client.0.vm04.stdout:1/91: fsync d0/d3/f19 0 2026-03-10T06:22:43.658 INFO:tasks.workunit.client.0.vm04.stdout:1/92: write d0/d8/f11 [645306,22027] 0 2026-03-10T06:22:43.677 INFO:tasks.workunit.client.0.vm04.stdout:8/77: symlink df/d15/l1c 0 2026-03-10T06:22:43.677 INFO:tasks.workunit.client.0.vm04.stdout:8/78: readlink lb 0 2026-03-10T06:22:43.678 INFO:tasks.workunit.client.0.vm04.stdout:8/79: write df/f11 [378258,71291] 0 2026-03-10T06:22:43.682 INFO:tasks.workunit.client.0.vm04.stdout:0/72: creat d0/f1b x:0 0 0 2026-03-10T06:22:43.683 INFO:tasks.workunit.client.0.vm04.stdout:0/73: chown d0/l1 38 1 2026-03-10T06:22:43.690 INFO:tasks.workunit.client.0.vm04.stdout:3/88: creat d4/da/df/d13/f1d x:0 0 0 2026-03-10T06:22:43.698 INFO:tasks.workunit.client.0.vm04.stdout:5/69: rmdir d4/d11 39 2026-03-10T06:22:43.700 INFO:tasks.workunit.client.0.vm04.stdout:7/61: mknod d4/df/c15 0 2026-03-10T06:22:43.702 INFO:tasks.workunit.client.0.vm04.stdout:6/79: mknod d2/d8/c17 0 2026-03-10T06:22:43.706 INFO:tasks.workunit.client.0.vm04.stdout:9/102: creat d2/d23/d24/f29 x:0 0 0 2026-03-10T06:22:43.717 INFO:tasks.workunit.client.0.vm04.stdout:2/105: creat d1/df/d11/d18/f25 x:0 0 0 2026-03-10T06:22:43.718 INFO:tasks.workunit.client.0.vm04.stdout:2/106: dread - d1/df/f22 zero size 2026-03-10T06:22:43.721 INFO:tasks.workunit.client.0.vm04.stdout:2/107: dread d1/f8 [0,4194304] 0 2026-03-10T06:22:43.721 INFO:tasks.workunit.client.0.vm04.stdout:2/108: chown d1/df/f22 
668607 1 2026-03-10T06:22:43.724 INFO:tasks.workunit.client.0.vm04.stdout:2/109: dread d1/df/d11/f15 [0,4194304] 0 2026-03-10T06:22:43.740 INFO:tasks.workunit.client.0.vm04.stdout:8/80: rename df/d15/f19 to df/f1d 0 2026-03-10T06:22:43.741 INFO:tasks.workunit.client.0.vm04.stdout:0/74: creat d0/d5/f1c x:0 0 0 2026-03-10T06:22:43.741 INFO:tasks.workunit.client.0.vm04.stdout:0/75: write d0/d5/f1c [152055,47361] 0 2026-03-10T06:22:43.742 INFO:tasks.workunit.client.0.vm04.stdout:0/76: stat d0/lf 0 2026-03-10T06:22:43.745 INFO:tasks.workunit.client.0.vm04.stdout:2/110: sync 2026-03-10T06:22:43.746 INFO:tasks.workunit.client.0.vm04.stdout:0/77: dwrite d0/d5/fb [0,4194304] 0 2026-03-10T06:22:43.761 INFO:tasks.workunit.client.0.vm04.stdout:3/89: symlink d4/da/l1e 0 2026-03-10T06:22:43.762 INFO:tasks.workunit.client.0.vm04.stdout:3/90: write d4/f18 [884943,69631] 0 2026-03-10T06:22:43.762 INFO:tasks.workunit.client.0.vm04.stdout:3/91: chown d4/c14 3920 1 2026-03-10T06:22:43.763 INFO:tasks.workunit.client.0.vm04.stdout:3/92: dread - d4/da/df/d13/f1d zero size 2026-03-10T06:22:43.768 INFO:tasks.workunit.client.0.vm04.stdout:5/70: mknod d4/d6/c1c 0 2026-03-10T06:22:43.769 INFO:tasks.workunit.client.0.vm04.stdout:5/71: write f0 [100538,129155] 0 2026-03-10T06:22:43.778 INFO:tasks.workunit.client.0.vm04.stdout:7/62: creat d4/f16 x:0 0 0 2026-03-10T06:22:43.783 INFO:tasks.workunit.client.0.vm04.stdout:6/80: unlink d2/d8/c17 0 2026-03-10T06:22:43.783 INFO:tasks.workunit.client.0.vm04.stdout:6/81: chown d2/cf 920 1 2026-03-10T06:22:43.794 INFO:tasks.workunit.client.0.vm04.stdout:9/103: rename d2/d3/f7 to d2/d3/f2a 0 2026-03-10T06:22:43.797 INFO:tasks.workunit.client.0.vm04.stdout:9/104: dread d2/d3/f2a [0,4194304] 0 2026-03-10T06:22:43.797 INFO:tasks.workunit.client.0.vm04.stdout:9/105: truncate d2/d3/f12 533111 0 2026-03-10T06:22:43.801 INFO:tasks.workunit.client.0.vm04.stdout:9/106: dwrite d2/d3/f12 [0,4194304] 0 2026-03-10T06:22:43.802 
INFO:tasks.workunit.client.0.vm04.stdout:9/107: write d2/d3/f10 [1504097,102340] 0 2026-03-10T06:22:43.810 INFO:tasks.workunit.client.0.vm04.stdout:4/54: rmdir d2/d8 39 2026-03-10T06:22:43.811 INFO:tasks.workunit.client.0.vm04.stdout:4/55: write d2/f4 [1285204,13301] 0 2026-03-10T06:22:43.814 INFO:tasks.workunit.client.0.vm04.stdout:4/56: dread d2/f7 [0,4194304] 0 2026-03-10T06:22:43.828 INFO:tasks.workunit.client.0.vm04.stdout:8/81: creat df/d15/f1e x:0 0 0 2026-03-10T06:22:43.828 INFO:tasks.workunit.client.0.vm04.stdout:8/82: chown df/d15/l1c 2763029 1 2026-03-10T06:22:43.831 INFO:tasks.workunit.client.0.vm04.stdout:8/83: dread f6 [0,4194304] 0 2026-03-10T06:22:43.833 INFO:tasks.workunit.client.0.vm04.stdout:0/78: mkdir d0/d5/d8/dd/d1d 0 2026-03-10T06:22:43.844 INFO:tasks.workunit.client.0.vm04.stdout:3/93: creat d4/d6/dc/f1f x:0 0 0 2026-03-10T06:22:43.848 INFO:tasks.workunit.client.0.vm04.stdout:3/94: dread d4/f7 [0,4194304] 0 2026-03-10T06:22:43.859 INFO:tasks.workunit.client.0.vm04.stdout:6/82: mkdir d2/d4/d18 0 2026-03-10T06:22:43.866 INFO:tasks.workunit.client.0.vm04.stdout:9/108: creat d2/d23/d24/f2b x:0 0 0 2026-03-10T06:22:43.886 INFO:tasks.workunit.client.0.vm04.stdout:1/93: creat d0/f1f x:0 0 0 2026-03-10T06:22:43.886 INFO:tasks.workunit.client.0.vm04.stdout:8/84: creat df/f1f x:0 0 0 2026-03-10T06:22:43.886 INFO:tasks.workunit.client.0.vm04.stdout:8/85: stat df/d15/f1e 0 2026-03-10T06:22:43.888 INFO:tasks.workunit.client.0.vm04.stdout:3/95: rmdir d4/da 39 2026-03-10T06:22:43.889 INFO:tasks.workunit.client.0.vm04.stdout:5/72: mknod d4/d11/c1d 0 2026-03-10T06:22:43.889 INFO:tasks.workunit.client.0.vm04.stdout:3/96: dread - d4/d6/dc/f1f zero size 2026-03-10T06:22:43.890 INFO:tasks.workunit.client.0.vm04.stdout:7/63: symlink d4/df/d12/d13/l17 0 2026-03-10T06:22:43.892 INFO:tasks.workunit.client.0.vm04.stdout:6/83: rename d2/cf to d2/d8/c19 0 2026-03-10T06:22:43.892 INFO:tasks.workunit.client.0.vm04.stdout:9/109: rename d2/d23 to d2/d23/d24/d2c 22 
2026-03-10T06:22:43.896 INFO:tasks.workunit.client.0.vm04.stdout:9/110: dwrite d2/d23/d24/f26 [0,4194304] 0 2026-03-10T06:22:43.898 INFO:tasks.workunit.client.0.vm04.stdout:9/111: fdatasync d2/d3/f10 0 2026-03-10T06:22:43.914 INFO:tasks.workunit.client.0.vm04.stdout:0/79: unlink d0/d5/d8/c15 0 2026-03-10T06:22:43.918 INFO:tasks.workunit.client.0.vm04.stdout:8/86: mkdir df/d20 0 2026-03-10T06:22:43.920 INFO:tasks.workunit.client.0.vm04.stdout:1/94: dwrite d0/ff [0,4194304] 0 2026-03-10T06:22:43.931 INFO:tasks.workunit.client.0.vm04.stdout:1/95: dread d0/d3/f16 [0,4194304] 0 2026-03-10T06:22:43.934 INFO:tasks.workunit.client.0.vm04.stdout:3/97: dread - d4/da/df/f1b zero size 2026-03-10T06:22:43.934 INFO:tasks.workunit.client.0.vm04.stdout:7/64: creat d4/df/d12/f18 x:0 0 0 2026-03-10T06:22:43.934 INFO:tasks.workunit.client.0.vm04.stdout:4/57: mknod d2/d8/dc/ce 0 2026-03-10T06:22:43.935 INFO:tasks.workunit.client.0.vm04.stdout:9/112: creat d2/d9/d11/f2d x:0 0 0 2026-03-10T06:22:43.935 INFO:tasks.workunit.client.0.vm04.stdout:3/98: dread - d4/d6/dc/f1f zero size 2026-03-10T06:22:43.939 INFO:tasks.workunit.client.0.vm04.stdout:2/111: getdents d1/df/d11/d18 0 2026-03-10T06:22:43.941 INFO:tasks.workunit.client.0.vm04.stdout:5/73: link d4/f19 d4/d6/f1e 0 2026-03-10T06:22:43.941 INFO:tasks.workunit.client.0.vm04.stdout:6/84: mkdir d2/d4/d18/d1a 0 2026-03-10T06:22:43.947 INFO:tasks.workunit.client.0.vm04.stdout:1/96: creat d0/d3/f20 x:0 0 0 2026-03-10T06:22:43.951 INFO:tasks.workunit.client.0.vm04.stdout:1/97: write d0/d8/f11 [893175,79745] 0 2026-03-10T06:22:43.951 INFO:tasks.workunit.client.0.vm04.stdout:2/112: mknod d1/df/c26 0 2026-03-10T06:22:43.954 INFO:tasks.workunit.client.0.vm04.stdout:2/113: chown d1/db/lc 145566 1 2026-03-10T06:22:43.955 INFO:tasks.workunit.client.0.vm04.stdout:8/87: dwrite df/f1d [0,4194304] 0 2026-03-10T06:22:43.958 INFO:tasks.workunit.client.0.vm04.stdout:2/114: write d1/db/fe [392992,87982] 0 2026-03-10T06:22:43.964 
INFO:tasks.workunit.client.0.vm04.stdout:3/99: dwrite d4/da/df/f1b [0,4194304] 0 2026-03-10T06:22:43.964 INFO:tasks.workunit.client.0.vm04.stdout:2/115: truncate d1/f10 270863 0 2026-03-10T06:22:43.971 INFO:tasks.workunit.client.0.vm04.stdout:5/74: unlink d4/d6/f10 0 2026-03-10T06:22:43.975 INFO:tasks.workunit.client.0.vm04.stdout:5/75: dread - d4/f5 zero size 2026-03-10T06:22:43.975 INFO:tasks.workunit.client.0.vm04.stdout:5/76: chown d4/d6 446031 1 2026-03-10T06:22:43.978 INFO:tasks.workunit.client.0.vm04.stdout:3/100: dread d4/da/df/d13/f16 [0,4194304] 0 2026-03-10T06:22:43.988 INFO:tasks.workunit.client.0.vm04.stdout:9/113: link d2/d3/f4 d2/d9/f2e 0 2026-03-10T06:22:43.992 INFO:tasks.workunit.client.0.vm04.stdout:0/80: link d0/l12 d0/d1a/l1e 0 2026-03-10T06:22:43.993 INFO:tasks.workunit.client.0.vm04.stdout:1/98: creat d0/d8/f21 x:0 0 0 2026-03-10T06:22:43.996 INFO:tasks.workunit.client.0.vm04.stdout:8/88: read f9 [3660212,121610] 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:6/85: symlink d2/l1b 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:2/116: fdatasync d1/f7 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:0/81: dread d0/d5/f1c [0,4194304] 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:3/101: rename d4/d6/dc/l19 to d4/da/df/d13/l20 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:4/58: creat d2/ff x:0 0 0 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:2/117: chown d1/db/lc 1 1 2026-03-10T06:22:44.004 INFO:tasks.workunit.client.0.vm04.stdout:7/65: dwrite d4/df/d12/f14 [0,4194304] 0 2026-03-10T06:22:44.006 INFO:tasks.workunit.client.0.vm04.stdout:9/114: dwrite d2/d9/d11/f2d [0,4194304] 0 2026-03-10T06:22:44.015 INFO:tasks.workunit.client.0.vm04.stdout:0/82: creat d0/d5/f1f x:0 0 0 2026-03-10T06:22:44.015 INFO:tasks.workunit.client.0.vm04.stdout:2/118: rename d1/f8 to d1/db/f27 0 2026-03-10T06:22:44.016 
INFO:tasks.workunit.client.0.vm04.stdout:2/119: write d1/db/fe [408209,77891] 0 2026-03-10T06:22:44.022 INFO:tasks.workunit.client.0.vm04.stdout:3/102: mkdir d4/da/df/d13/d21 0 2026-03-10T06:22:44.028 INFO:tasks.workunit.client.0.vm04.stdout:7/66: symlink d4/df/l19 0 2026-03-10T06:22:44.029 INFO:tasks.workunit.client.0.vm04.stdout:8/89: creat df/d20/f21 x:0 0 0 2026-03-10T06:22:44.030 INFO:tasks.workunit.client.0.vm04.stdout:4/59: mknod d2/c10 0 2026-03-10T06:22:44.031 INFO:tasks.workunit.client.0.vm04.stdout:8/90: chown df/d20 1 1 2026-03-10T06:22:44.032 INFO:tasks.workunit.client.0.vm04.stdout:2/120: truncate d1/df/f22 950553 0 2026-03-10T06:22:44.033 INFO:tasks.workunit.client.0.vm04.stdout:8/91: chown lc 127990 1 2026-03-10T06:22:44.034 INFO:tasks.workunit.client.0.vm04.stdout:3/103: rename d4/f18 to d4/d6/dc/f22 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:8/92: chown lb 64923315 1 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:3/104: write f0 [400111,86891] 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:8/93: dread - df/f1f zero size 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:1/99: creat d0/f22 x:0 0 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:7/67: symlink d4/df/l1a 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:8/94: truncate df/d20/f21 963555 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:7/68: readlink d4/df/l19 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:6/86: link d2/d4/lc d2/d4/l1c 0 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:5/77: sync 2026-03-10T06:22:44.041 INFO:tasks.workunit.client.0.vm04.stdout:8/95: truncate df/f10 1457559 0 2026-03-10T06:22:44.044 INFO:tasks.workunit.client.0.vm04.stdout:0/83: mkdir d0/d1a/d20 0 2026-03-10T06:22:44.044 INFO:tasks.workunit.client.0.vm04.stdout:9/115: sync 2026-03-10T06:22:44.044 INFO:tasks.workunit.client.0.vm04.stdout:7/69: 
fdatasync d4/df/d12/f18 0 2026-03-10T06:22:44.044 INFO:tasks.workunit.client.0.vm04.stdout:8/96: write df/f11 [702113,2832] 0 2026-03-10T06:22:44.045 INFO:tasks.workunit.client.0.vm04.stdout:7/70: write d4/df/d12/f18 [421343,40726] 0 2026-03-10T06:22:44.045 INFO:tasks.workunit.client.0.vm04.stdout:8/97: dread - df/d15/f1b zero size 2026-03-10T06:22:44.045 INFO:tasks.workunit.client.0.vm04.stdout:2/121: write d1/df/d11/f15 [4011612,52714] 0 2026-03-10T06:22:44.047 INFO:tasks.workunit.client.0.vm04.stdout:7/71: chown d4/f5 197521339 1 2026-03-10T06:22:44.047 INFO:tasks.workunit.client.0.vm04.stdout:9/116: sync 2026-03-10T06:22:44.047 INFO:tasks.workunit.client.0.vm04.stdout:0/84: sync 2026-03-10T06:22:44.050 INFO:tasks.workunit.client.0.vm04.stdout:7/72: dread - d4/fb zero size 2026-03-10T06:22:44.053 INFO:tasks.workunit.client.0.vm04.stdout:3/105: creat d4/da/df/d13/f23 x:0 0 0 2026-03-10T06:22:44.055 INFO:tasks.workunit.client.0.vm04.stdout:3/106: read d4/d6/f12 [2467593,67107] 0 2026-03-10T06:22:44.057 INFO:tasks.workunit.client.0.vm04.stdout:6/87: symlink d2/d8/l1d 0 2026-03-10T06:22:44.061 INFO:tasks.workunit.client.0.vm04.stdout:8/98: creat df/d20/f22 x:0 0 0 2026-03-10T06:22:44.061 INFO:tasks.workunit.client.0.vm04.stdout:3/107: dread - d4/da/df/d13/f1c zero size 2026-03-10T06:22:44.062 INFO:tasks.workunit.client.0.vm04.stdout:0/85: mknod d0/d5/d8/c21 0 2026-03-10T06:22:44.064 INFO:tasks.workunit.client.0.vm04.stdout:3/108: rename d4 to d4/d6/dc/d24 22 2026-03-10T06:22:44.066 INFO:tasks.workunit.client.0.vm04.stdout:7/73: fdatasync d4/fb 0 2026-03-10T06:22:44.067 INFO:tasks.workunit.client.0.vm04.stdout:0/86: write d0/d5/fb [4771371,130977] 0 2026-03-10T06:22:44.069 INFO:tasks.workunit.client.0.vm04.stdout:2/122: dwrite d1/db/f1e [4194304,4194304] 0 2026-03-10T06:22:44.073 INFO:tasks.workunit.client.0.vm04.stdout:6/88: dread d2/f7 [4194304,4194304] 0 2026-03-10T06:22:44.075 INFO:tasks.workunit.client.0.vm04.stdout:1/100: creat d0/f23 x:0 0 0 
2026-03-10T06:22:44.076 INFO:tasks.workunit.client.0.vm04.stdout:3/109: dread - d4/da/df/d13/f1c zero size 2026-03-10T06:22:44.089 INFO:tasks.workunit.client.0.vm04.stdout:4/60: truncate d2/d8/fa 2251896 0 2026-03-10T06:22:44.092 INFO:tasks.workunit.client.0.vm04.stdout:8/99: symlink df/l23 0 2026-03-10T06:22:44.092 INFO:tasks.workunit.client.0.vm04.stdout:9/117: symlink d2/d3/d18/l2f 0 2026-03-10T06:22:44.094 INFO:tasks.workunit.client.0.vm04.stdout:6/89: dwrite d2/d8/f9 [0,4194304] 0 2026-03-10T06:22:44.097 INFO:tasks.workunit.client.0.vm04.stdout:6/90: dread - d2/f16 zero size 2026-03-10T06:22:44.100 INFO:tasks.workunit.client.0.vm04.stdout:7/74: mknod d4/df/d12/c1b 0 2026-03-10T06:22:44.100 INFO:tasks.workunit.client.0.vm04.stdout:8/100: truncate df/d15/f1e 1019817 0 2026-03-10T06:22:44.101 INFO:tasks.workunit.client.0.vm04.stdout:2/123: symlink d1/l28 0 2026-03-10T06:22:44.102 INFO:tasks.workunit.client.0.vm04.stdout:1/101: creat d0/d3/f24 x:0 0 0 2026-03-10T06:22:44.103 INFO:tasks.workunit.client.0.vm04.stdout:4/61: dwrite d2/f4 [0,4194304] 0 2026-03-10T06:22:44.104 INFO:tasks.workunit.client.0.vm04.stdout:3/110: creat d4/d6/dc/f25 x:0 0 0 2026-03-10T06:22:44.105 INFO:tasks.workunit.client.0.vm04.stdout:3/111: chown d4/da/df/d13/f16 18766 1 2026-03-10T06:22:44.118 INFO:tasks.workunit.client.0.vm04.stdout:9/118: mknod d2/d3/d18/c30 0 2026-03-10T06:22:44.118 INFO:tasks.workunit.client.0.vm04.stdout:7/75: rmdir d4/df 39 2026-03-10T06:22:44.118 INFO:tasks.workunit.client.0.vm04.stdout:5/78: getdents d4/d11 0 2026-03-10T06:22:44.118 INFO:tasks.workunit.client.0.vm04.stdout:8/101: creat df/d15/f24 x:0 0 0 2026-03-10T06:22:44.120 INFO:tasks.workunit.client.0.vm04.stdout:6/91: mknod d2/d4/d18/c1e 0 2026-03-10T06:22:44.120 INFO:tasks.workunit.client.0.vm04.stdout:2/124: creat d1/df/d11/f29 x:0 0 0 2026-03-10T06:22:44.120 INFO:tasks.workunit.client.0.vm04.stdout:5/79: read - d4/d6/fa zero size 2026-03-10T06:22:44.120 INFO:tasks.workunit.client.0.vm04.stdout:1/102: 
creat d0/d8/f25 x:0 0 0 2026-03-10T06:22:44.122 INFO:tasks.workunit.client.0.vm04.stdout:4/62: creat d2/d8/f11 x:0 0 0 2026-03-10T06:22:44.125 INFO:tasks.workunit.client.0.vm04.stdout:8/102: dread df/d15/f1e [0,4194304] 0 2026-03-10T06:22:44.125 INFO:tasks.workunit.client.0.vm04.stdout:4/63: chown d2/ff 4515 1 2026-03-10T06:22:44.125 INFO:tasks.workunit.client.0.vm04.stdout:9/119: rmdir d2/d8 39 2026-03-10T06:22:44.130 INFO:tasks.workunit.client.0.vm04.stdout:8/103: chown df/f17 89213 1 2026-03-10T06:22:44.130 INFO:tasks.workunit.client.0.vm04.stdout:3/112: mknod d4/da/df/d11/c26 0 2026-03-10T06:22:44.131 INFO:tasks.workunit.client.0.vm04.stdout:6/92: fsync d2/d4/fa 0 2026-03-10T06:22:44.132 INFO:tasks.workunit.client.0.vm04.stdout:0/87: getdents d0/d5/d8 0 2026-03-10T06:22:44.133 INFO:tasks.workunit.client.0.vm04.stdout:7/76: rename d4/f10 to d4/df/d12/f1c 0 2026-03-10T06:22:44.133 INFO:tasks.workunit.client.0.vm04.stdout:6/93: dread - d2/f16 zero size 2026-03-10T06:22:44.135 INFO:tasks.workunit.client.0.vm04.stdout:3/113: write d4/da/df/d13/f1d [110442,121574] 0 2026-03-10T06:22:44.138 INFO:tasks.workunit.client.0.vm04.stdout:9/120: dwrite d2/d9/fd [0,4194304] 0 2026-03-10T06:22:44.138 INFO:tasks.workunit.client.0.vm04.stdout:2/125: mkdir d1/db/d20/d2a 0 2026-03-10T06:22:44.138 INFO:tasks.workunit.client.0.vm04.stdout:7/77: readlink d4/df/d12/d13/l17 0 2026-03-10T06:22:44.143 INFO:tasks.workunit.client.0.vm04.stdout:2/126: stat d1/df/d11/d14/c1a 0 2026-03-10T06:22:44.144 INFO:tasks.workunit.client.0.vm04.stdout:1/103: dwrite d0/f23 [0,4194304] 0 2026-03-10T06:22:44.145 INFO:tasks.workunit.client.0.vm04.stdout:1/104: stat d0/d3/ld 0 2026-03-10T06:22:44.146 INFO:tasks.workunit.client.0.vm04.stdout:1/105: dread - d0/d8/f21 zero size 2026-03-10T06:22:44.146 INFO:tasks.workunit.client.0.vm04.stdout:1/106: stat d0/d8 0 2026-03-10T06:22:44.150 INFO:tasks.workunit.client.0.vm04.stdout:0/88: dread d0/d5/d8/fa [0,4194304] 0 2026-03-10T06:22:44.150 
INFO:tasks.workunit.client.0.vm04.stdout:9/121: write d2/f17 [1039516,16355] 0 2026-03-10T06:22:44.160 INFO:tasks.workunit.client.0.vm04.stdout:2/127: rename d1/df/d11/f1b to d1/f2b 0 2026-03-10T06:22:44.161 INFO:tasks.workunit.client.0.vm04.stdout:2/128: readlink d1/l28 0 2026-03-10T06:22:44.166 INFO:tasks.workunit.client.0.vm04.stdout:3/114: dwrite d4/d6/dc/f25 [0,4194304] 0 2026-03-10T06:22:44.178 INFO:tasks.workunit.client.0.vm04.stdout:5/80: link d4/d11/f17 d4/d11/f1f 0 2026-03-10T06:22:44.180 INFO:tasks.workunit.client.0.vm04.stdout:4/64: creat d2/f12 x:0 0 0 2026-03-10T06:22:44.190 INFO:tasks.workunit.client.0.vm04.stdout:0/89: unlink d0/d5/f7 0 2026-03-10T06:22:44.194 INFO:tasks.workunit.client.0.vm04.stdout:2/129: sync 2026-03-10T06:22:44.197 INFO:tasks.workunit.client.0.vm04.stdout:7/78: dwrite d4/df/d12/f1c [0,4194304] 0 2026-03-10T06:22:44.197 INFO:tasks.workunit.client.0.vm04.stdout:7/79: read - d4/fb zero size 2026-03-10T06:22:44.197 INFO:tasks.workunit.client.0.vm04.stdout:6/94: mkdir d2/d4/d18/d1a/d1f 0 2026-03-10T06:22:44.198 INFO:tasks.workunit.client.0.vm04.stdout:7/80: fdatasync d4/f16 0 2026-03-10T06:22:44.206 INFO:tasks.workunit.client.0.vm04.stdout:3/115: read f1 [1535840,17355] 0 2026-03-10T06:22:44.207 INFO:tasks.workunit.client.0.vm04.stdout:3/116: readlink d4/da/l1e 0 2026-03-10T06:22:44.210 INFO:tasks.workunit.client.0.vm04.stdout:1/107: mknod d0/c26 0 2026-03-10T06:22:44.210 INFO:tasks.workunit.client.0.vm04.stdout:4/65: symlink d2/d8/l13 0 2026-03-10T06:22:44.210 INFO:tasks.workunit.client.0.vm04.stdout:5/81: creat d4/d6/f20 x:0 0 0 2026-03-10T06:22:44.355 INFO:tasks.workunit.client.0.vm04.stdout:0/90: creat d0/d5/d8/f22 x:0 0 0 2026-03-10T06:22:44.359 INFO:tasks.workunit.client.0.vm04.stdout:9/122: link d2/d9/d11/f2d d2/d23/f31 0 2026-03-10T06:22:44.362 INFO:tasks.workunit.client.0.vm04.stdout:6/95: mknod d2/d4/c20 0 2026-03-10T06:22:44.362 INFO:tasks.workunit.client.0.vm04.stdout:9/123: write d2/d23/d24/f26 [2762150,15673] 0 
2026-03-10T06:22:44.363 INFO:tasks.workunit.client.0.vm04.stdout:6/96: write d2/d8/f9 [4777988,15257] 0 2026-03-10T06:22:44.366 INFO:tasks.workunit.client.0.vm04.stdout:9/124: dread d2/d23/d24/f26 [0,4194304] 0 2026-03-10T06:22:44.366 INFO:tasks.workunit.client.0.vm04.stdout:9/125: fdatasync d2/d3/f12 0 2026-03-10T06:22:44.369 INFO:tasks.workunit.client.0.vm04.stdout:3/117: rename d4/da/df/d13/f1c to d4/da/df/d13/f27 0 2026-03-10T06:22:44.371 INFO:tasks.workunit.client.0.vm04.stdout:0/91: creat d0/d5/d8/f23 x:0 0 0 2026-03-10T06:22:44.375 INFO:tasks.workunit.client.0.vm04.stdout:0/92: dwrite d0/d5/d8/dd/f13 [0,4194304] 0 2026-03-10T06:22:44.379 INFO:tasks.workunit.client.0.vm04.stdout:8/104: getdents df/d15 0 2026-03-10T06:22:44.384 INFO:tasks.workunit.client.0.vm04.stdout:5/82: truncate d4/f13 16841 0 2026-03-10T06:22:44.387 INFO:tasks.workunit.client.0.vm04.stdout:2/130: rmdir d1/db/d20/d2a 0 2026-03-10T06:22:44.388 INFO:tasks.workunit.client.0.vm04.stdout:2/131: dread - d1/df/d11/f29 zero size 2026-03-10T06:22:44.389 INFO:tasks.workunit.client.0.vm04.stdout:6/97: mknod d2/d4/d18/d1a/d1f/c21 0 2026-03-10T06:22:44.390 INFO:tasks.workunit.client.0.vm04.stdout:8/105: mkdir df/d20/d25 0 2026-03-10T06:22:44.390 INFO:tasks.workunit.client.0.vm04.stdout:8/106: stat df/d15/f1e 0 2026-03-10T06:22:44.392 INFO:tasks.workunit.client.0.vm04.stdout:9/126: mkdir d2/d32 0 2026-03-10T06:22:44.393 INFO:tasks.workunit.client.0.vm04.stdout:4/66: rename d2/d8/fb to d2/f14 0 2026-03-10T06:22:44.394 INFO:tasks.workunit.client.0.vm04.stdout:4/67: truncate d2/d8/f11 794809 0 2026-03-10T06:22:44.395 INFO:tasks.workunit.client.0.vm04.stdout:0/93: creat d0/d5/d8/dd/d1d/f24 x:0 0 0 2026-03-10T06:22:44.397 INFO:tasks.workunit.client.0.vm04.stdout:2/132: dread d1/f5 [0,4194304] 0 2026-03-10T06:22:44.397 INFO:tasks.workunit.client.0.vm04.stdout:2/133: dread - d1/df/d11/d18/f25 zero size 2026-03-10T06:22:44.398 INFO:tasks.workunit.client.0.vm04.stdout:1/108: link d0/f4 d0/d8/f27 0 
2026-03-10T06:22:44.399 INFO:tasks.workunit.client.0.vm04.stdout:2/134: write d1/df/d11/d18/f19 [6591706,98745] 0 2026-03-10T06:22:44.401 INFO:tasks.workunit.client.0.vm04.stdout:6/98: mknod d2/d4/c22 0 2026-03-10T06:22:44.401 INFO:tasks.workunit.client.0.vm04.stdout:2/135: write d1/df/f24 [980699,105486] 0 2026-03-10T06:22:44.403 INFO:tasks.workunit.client.0.vm04.stdout:6/99: chown d2/d4/d18/d1a/d1f 2161206 1 2026-03-10T06:22:44.403 INFO:tasks.workunit.client.0.vm04.stdout:6/100: readlink d2/l1b 0 2026-03-10T06:22:44.403 INFO:tasks.workunit.client.0.vm04.stdout:9/127: rmdir d2/d23 39 2026-03-10T06:22:44.403 INFO:tasks.workunit.client.0.vm04.stdout:2/136: chown d1/f7 362 1 2026-03-10T06:22:44.405 INFO:tasks.workunit.client.0.vm04.stdout:4/68: chown d2/d8/fa 21694854 1 2026-03-10T06:22:44.405 INFO:tasks.workunit.client.0.vm04.stdout:4/69: chown d2/d8 52 1 2026-03-10T06:22:44.407 INFO:tasks.workunit.client.0.vm04.stdout:1/109: creat d0/d3/f28 x:0 0 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:5/83: creat d4/f21 x:0 0 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:1/110: dread d0/f23 [0,4194304] 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:9/128: fsync d2/d3/f4 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:1/111: write d0/d8/f21 [789381,108166] 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:1/112: chown d0/d8/f21 26612 1 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:9/129: write d2/d23/d24/f29 [776936,86409] 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:9/130: truncate d2/d23/d24/f29 1857944 0 2026-03-10T06:22:44.428 INFO:tasks.workunit.client.0.vm04.stdout:1/113: dwrite d0/f5 [0,4194304] 0 2026-03-10T06:22:44.432 INFO:tasks.workunit.client.0.vm04.stdout:1/114: creat d0/f29 x:0 0 0 2026-03-10T06:22:44.432 INFO:tasks.workunit.client.0.vm04.stdout:9/131: getdents d2/d9 0 2026-03-10T06:22:44.438 
INFO:tasks.workunit.client.0.vm04.stdout:1/115: dwrite d0/d3/f19 [0,4194304] 0
2026-03-10T06:22:44.443 INFO:tasks.workunit.client.0.vm04.stdout:1/116: symlink d0/l2a 0
2026-03-10T06:22:44.443 INFO:tasks.workunit.client.0.vm04.stdout:9/132: dwrite d2/d9/d11/f25 [0,4194304] 0
2026-03-10T06:22:44.447 INFO:tasks.workunit.client.0.vm04.stdout:1/117: mknod d0/d8/c2b 0
2026-03-10T06:22:44.447 INFO:tasks.workunit.client.0.vm04.stdout:9/133: chown d2/c16 3919 1
2026-03-10T06:22:44.448 INFO:tasks.workunit.client.0.vm04.stdout:1/118: chown d0/c26 6323277 1
2026-03-10T06:22:44.580 INFO:tasks.workunit.client.0.vm04.stdout:8/107: sync
2026-03-10T06:22:44.580 INFO:tasks.workunit.client.0.vm04.stdout:2/137: sync
2026-03-10T06:22:44.580 INFO:tasks.workunit.client.0.vm04.stdout:2/138: stat d1/df/d11/d14/c1f 0
2026-03-10T06:22:44.581 INFO:tasks.workunit.client.0.vm04.stdout:2/139: read - d1/df/d11/d18/f25 zero size
2026-03-10T06:22:44.582 INFO:tasks.workunit.client.0.vm04.stdout:8/108: mknod df/c26 0
2026-03-10T06:22:44.583 INFO:tasks.workunit.client.0.vm04.stdout:8/109: creat df/f27 x:0 0 0
2026-03-10T06:22:44.584 INFO:tasks.workunit.client.0.vm04.stdout:8/110: chown la 128276567 1
2026-03-10T06:22:44.605 INFO:tasks.workunit.client.0.vm04.stdout:8/111: sync
2026-03-10T06:22:44.618 INFO:tasks.workunit.client.0.vm04.stdout:8/112: creat df/d20/f28 x:0 0 0
2026-03-10T06:22:44.618 INFO:tasks.workunit.client.0.vm04.stdout:8/113: fsync df/f11 0
2026-03-10T06:22:44.619 INFO:tasks.workunit.client.0.vm04.stdout:8/114: chown df/d15/f1e 1247468 1
2026-03-10T06:22:44.647 INFO:tasks.workunit.client.0.vm04.stdout:8/115: dwrite df/f1f [0,4194304] 0
2026-03-10T06:22:44.652 INFO:tasks.workunit.client.0.vm04.stdout:8/116: dread - df/d15/f1b zero size
2026-03-10T06:22:44.670 INFO:tasks.workunit.client.0.vm04.stdout:7/81: write d4/f5 [4793281,124447] 0
2026-03-10T06:22:44.673 INFO:tasks.workunit.client.0.vm04.stdout:7/82: mknod d4/c1d 0
2026-03-10T06:22:44.674 INFO:tasks.workunit.client.0.vm04.stdout:3/118: write d4/da/df/d13/f16 [696351,130429] 0
2026-03-10T06:22:44.674 INFO:tasks.workunit.client.0.vm04.stdout:3/119: stat d4/d6/dc 0
2026-03-10T06:22:44.679 INFO:tasks.workunit.client.0.vm04.stdout:7/83: creat d4/df/d12/d13/f1e x:0 0 0
2026-03-10T06:22:44.681 INFO:tasks.workunit.client.0.vm04.stdout:3/120: dwrite d4/da/df/d13/f16 [0,4194304] 0
2026-03-10T06:22:44.694 INFO:tasks.workunit.client.0.vm04.stdout:7/84: mknod d4/df/c1f 0
2026-03-10T06:22:44.695 INFO:tasks.workunit.client.0.vm04.stdout:3/121: write d4/da/df/d13/f27 [616558,82942] 0
2026-03-10T06:22:44.695 INFO:tasks.workunit.client.0.vm04.stdout:7/85: read d4/df/d12/f18 [402821,4394] 0
2026-03-10T06:22:44.698 INFO:tasks.workunit.client.0.vm04.stdout:7/86: chown d4/fb 1425 1
2026-03-10T06:22:44.699 INFO:tasks.workunit.client.0.vm04.stdout:6/101: rename d2/d4/d18 to d2/d8/d23 0
2026-03-10T06:22:44.699 INFO:tasks.workunit.client.0.vm04.stdout:6/102: write d2/d8/f11 [289495,61613] 0
2026-03-10T06:22:44.701 INFO:tasks.workunit.client.0.vm04.stdout:4/70: getdents d2/d8 0
2026-03-10T06:22:44.705 INFO:tasks.workunit.client.0.vm04.stdout:3/122: dread d4/d6/dc/f25 [0,4194304] 0
2026-03-10T06:22:44.705 INFO:tasks.workunit.client.0.vm04.stdout:7/87: creat d4/df/d12/f20 x:0 0 0
2026-03-10T06:22:44.705 INFO:tasks.workunit.client.0.vm04.stdout:0/94: rename d0/d5/d8 to d0/d5/d25 0
2026-03-10T06:22:44.706 INFO:tasks.workunit.client.0.vm04.stdout:0/95: write d0/d5/d25/f22 [621491,11676] 0
2026-03-10T06:22:44.708 INFO:tasks.workunit.client.0.vm04.stdout:4/71: dread d2/d8/fa [0,4194304] 0
2026-03-10T06:22:44.712 INFO:tasks.workunit.client.0.vm04.stdout:6/103: write f1 [9102714,25779] 0
2026-03-10T06:22:44.714 INFO:tasks.workunit.client.0.vm04.stdout:1/119: rmdir d0/d3 39
2026-03-10T06:22:44.727 INFO:tasks.workunit.client.0.vm04.stdout:1/120: symlink d0/d8/l2c 0
2026-03-10T06:22:44.727 INFO:tasks.workunit.client.0.vm04.stdout:6/104: sync
2026-03-10T06:22:44.728 INFO:tasks.workunit.client.0.vm04.stdout:7/88: dwrite d4/f16 [0,4194304] 0
2026-03-10T06:22:44.729 INFO:tasks.workunit.client.0.vm04.stdout:5/84: rename d4/d11/c16 to d4/c22 0
2026-03-10T06:22:44.730 INFO:tasks.workunit.client.0.vm04.stdout:3/123: dread d4/d6/dc/f25 [0,4194304] 0
2026-03-10T06:22:44.732 INFO:tasks.workunit.client.0.vm04.stdout:2/140: truncate d1/f10 44203 0
2026-03-10T06:22:44.733 INFO:tasks.workunit.client.0.vm04.stdout:8/117: rmdir df 39
2026-03-10T06:22:44.737 INFO:tasks.workunit.client.0.vm04.stdout:9/134: rename d2/d8/c13 to d2/d3/c33 0
2026-03-10T06:22:44.737 INFO:tasks.workunit.client.0.vm04.stdout:6/105: creat d2/d4/f24 x:0 0 0
2026-03-10T06:22:44.741 INFO:tasks.workunit.client.0.vm04.stdout:9/135: chown d2/f17 82244529 1
2026-03-10T06:22:44.748 INFO:tasks.workunit.client.0.vm04.stdout:7/89: mkdir d4/df/d12/d21 0
2026-03-10T06:22:44.752 INFO:tasks.workunit.client.0.vm04.stdout:3/124: symlink d4/da/df/d13/l28 0
2026-03-10T06:22:44.752 INFO:tasks.workunit.client.0.vm04.stdout:6/106: dwrite d2/f14 [0,4194304] 0
2026-03-10T06:22:44.755 INFO:tasks.workunit.client.0.vm04.stdout:0/96: getdents d0 0
2026-03-10T06:22:44.756 INFO:tasks.workunit.client.0.vm04.stdout:4/72: rename d2/f7 to d2/d8/dc/f15 0
2026-03-10T06:22:44.757 INFO:tasks.workunit.client.0.vm04.stdout:1/121: mknod d0/c2d 0
2026-03-10T06:22:44.758 INFO:tasks.workunit.client.0.vm04.stdout:0/97: dread - d0/f1b zero size
2026-03-10T06:22:44.761 INFO:tasks.workunit.client.0.vm04.stdout:7/90: symlink d4/df/d12/l22 0
2026-03-10T06:22:44.767 INFO:tasks.workunit.client.0.vm04.stdout:8/118: mkdir df/d15/d29 0
2026-03-10T06:22:44.771 INFO:tasks.workunit.client.0.vm04.stdout:6/107: dwrite d2/f14 [4194304,4194304] 0
2026-03-10T06:22:44.778 INFO:tasks.workunit.client.0.vm04.stdout:0/98: creat d0/d5/d25/dd/d1d/f26 x:0 0 0
2026-03-10T06:22:44.789 INFO:tasks.workunit.client.0.vm04.stdout:8/119: getdents df/d15/d29 0
2026-03-10T06:22:44.790 INFO:tasks.workunit.client.0.vm04.stdout:0/99: rename d0/d5/d25/fa to d0/d1a/f27 0
2026-03-10T06:22:44.801 INFO:tasks.workunit.client.0.vm04.stdout:8/120: fsync df/f17 0
2026-03-10T06:22:44.801 INFO:tasks.workunit.client.0.vm04.stdout:4/73: mkdir d2/d16 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:6/108: mknod d2/d8/d23/d1a/d1f/c25 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:7/91: link d4/df/d12/c1b d4/df/d12/c23 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:3/125: creat d4/f29 x:0 0 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:1/122: creat d0/f2e x:0 0 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:6/109: dread d2/f10 [0,4194304] 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:0/100: rename d0/d5/l9 to d0/d1a/l28 0
2026-03-10T06:22:44.802 INFO:tasks.workunit.client.0.vm04.stdout:6/110: write d2/f14 [2801637,41761] 0
2026-03-10T06:22:44.808 INFO:tasks.workunit.client.0.vm04.stdout:2/141: write d1/df/f22 [566880,45636] 0
2026-03-10T06:22:44.812 INFO:tasks.workunit.client.0.vm04.stdout:8/121: creat df/d20/d25/f2a x:0 0 0
2026-03-10T06:22:44.814 INFO:tasks.workunit.client.0.vm04.stdout:8/122: write df/f11 [1751100,109368] 0
2026-03-10T06:22:44.817 INFO:tasks.workunit.client.0.vm04.stdout:9/136: truncate d2/f1c 39963 0
2026-03-10T06:22:44.817 INFO:tasks.workunit.client.0.vm04.stdout:2/142: dwrite d1/f5 [0,4194304] 0
2026-03-10T06:22:44.828 INFO:tasks.workunit.client.0.vm04.stdout:1/123: dread d0/f23 [0,4194304] 0
2026-03-10T06:22:44.830 INFO:tasks.workunit.client.0.vm04.stdout:8/123: dread f7 [0,4194304] 0
2026-03-10T06:22:44.830 INFO:tasks.workunit.client.0.vm04.stdout:9/137: dwrite d2/d23/d24/f29 [0,4194304] 0
2026-03-10T06:22:44.830 INFO:tasks.workunit.client.0.vm04.stdout:1/124: dread - d0/f1f zero size
2026-03-10T06:22:44.831 INFO:tasks.workunit.client.0.vm04.stdout:4/74: link d2/f14 d2/d8/dc/f17 0
2026-03-10T06:22:44.833 INFO:tasks.workunit.client.0.vm04.stdout:7/92: mknod d4/c24 0
2026-03-10T06:22:44.833 INFO:tasks.workunit.client.0.vm04.stdout:4/75: chown d2 1272 1
2026-03-10T06:22:44.835 INFO:tasks.workunit.client.0.vm04.stdout:7/93: truncate d4/f5 5124986 0
2026-03-10T06:22:44.844 INFO:tasks.workunit.client.0.vm04.stdout:5/85: truncate d4/d11/f17 1055808 0
2026-03-10T06:22:44.844 INFO:tasks.workunit.client.0.vm04.stdout:3/126: rename d4/da/c15 to d4/da/df/d13/d21/c2a 0
2026-03-10T06:22:44.848 INFO:tasks.workunit.client.0.vm04.stdout:2/143: mkdir d1/df/d2c 0
2026-03-10T06:22:44.855 INFO:tasks.workunit.client.0.vm04.stdout:7/94: dwrite d4/f5 [4194304,4194304] 0
2026-03-10T06:22:44.855 INFO:tasks.workunit.client.0.vm04.stdout:2/144: write d1/f13 [318785,12015] 0
2026-03-10T06:22:44.856 INFO:tasks.workunit.client.0.vm04.stdout:2/145: stat d1/db 0
2026-03-10T06:22:44.856 INFO:tasks.workunit.client.0.vm04.stdout:7/95: chown d4/df 10788846 1
2026-03-10T06:22:44.863 INFO:tasks.workunit.client.0.vm04.stdout:2/146: dwrite d1/db/f12 [0,4194304] 0
2026-03-10T06:22:44.868 INFO:tasks.workunit.client.0.vm04.stdout:9/138: mkdir d2/d3/d18/d34 0
2026-03-10T06:22:44.879 INFO:tasks.workunit.client.0.vm04.stdout:8/124: mkdir df/d15/d2b 0
2026-03-10T06:22:44.879 INFO:tasks.workunit.client.0.vm04.stdout:8/125: chown df/c26 109338 1
2026-03-10T06:22:44.879 INFO:tasks.workunit.client.0.vm04.stdout:8/126: dread df/d20/f21 [0,4194304] 0
2026-03-10T06:22:44.879 INFO:tasks.workunit.client.0.vm04.stdout:8/127: readlink df/l23 0
2026-03-10T06:22:44.879 INFO:tasks.workunit.client.0.vm04.stdout:0/101: symlink d0/d1a/d20/l29 0
2026-03-10T06:22:44.881 INFO:tasks.workunit.client.0.vm04.stdout:4/76: creat d2/d8/dc/f18 x:0 0 0
2026-03-10T06:22:44.882 INFO:tasks.workunit.client.0.vm04.stdout:2/147: fsync d1/df/f22 0
2026-03-10T06:22:44.883 INFO:tasks.workunit.client.0.vm04.stdout:3/127: readlink d4/da/df/d13/l20 0
2026-03-10T06:22:44.885 INFO:tasks.workunit.client.0.vm04.stdout:1/125: dwrite d0/d3/f16 [4194304,4194304] 0
2026-03-10T06:22:44.886 INFO:tasks.workunit.client.0.vm04.stdout:1/126: chown d0/c2 264 1
2026-03-10T06:22:44.899 INFO:tasks.workunit.client.0.vm04.stdout:5/86: rename d4/f5 to d4/d6/f23 0
2026-03-10T06:22:44.911 INFO:tasks.workunit.client.0.vm04.stdout:2/148: dwrite d1/df/d11/d18/f25 [0,4194304] 0
2026-03-10T06:22:44.911 INFO:tasks.workunit.client.0.vm04.stdout:7/96: mkdir d4/df/d12/d13/d25 0
2026-03-10T06:22:44.911 INFO:tasks.workunit.client.0.vm04.stdout:7/97: dread d4/fa [0,4194304] 0
2026-03-10T06:22:44.921 INFO:tasks.workunit.client.0.vm04.stdout:6/111: link d2/cb d2/d8/d23/d1a/c26 0
2026-03-10T06:22:44.925 INFO:tasks.workunit.client.0.vm04.stdout:8/128: mknod df/d20/c2c 0
2026-03-10T06:22:44.936 INFO:tasks.workunit.client.0.vm04.stdout:1/127: rmdir d0/d3 39
2026-03-10T06:22:44.936 INFO:tasks.workunit.client.0.vm04.stdout:2/149: symlink d1/df/l2d 0
2026-03-10T06:22:44.936 INFO:tasks.workunit.client.0.vm04.stdout:5/87: fdatasync d4/d11/f18 0
2026-03-10T06:22:44.938 INFO:tasks.workunit.client.0.vm04.stdout:3/128: dwrite d4/d6/dc/f25 [0,4194304] 0
2026-03-10T06:22:44.943 INFO:tasks.workunit.client.0.vm04.stdout:5/88: truncate d4/f19 742790 0
2026-03-10T06:22:44.943 INFO:tasks.workunit.client.0.vm04.stdout:4/77: mknod d2/c19 0
2026-03-10T06:22:44.943 INFO:tasks.workunit.client.0.vm04.stdout:6/112: creat d2/d8/d23/f27 x:0 0 0
2026-03-10T06:22:44.944 INFO:tasks.workunit.client.0.vm04.stdout:3/129: stat d4/da/df/d13/l20 0
2026-03-10T06:22:44.944 INFO:tasks.workunit.client.0.vm04.stdout:4/78: write d2/d8/f11 [1730350,63632] 0
2026-03-10T06:22:44.948 INFO:tasks.workunit.client.0.vm04.stdout:8/129: mknod df/d20/c2d 0
2026-03-10T06:22:44.949 INFO:tasks.workunit.client.0.vm04.stdout:3/130: fdatasync d4/d6/dc/f1f 0
2026-03-10T06:22:44.957 INFO:tasks.workunit.client.0.vm04.stdout:8/130: rename df to df/d20/d2e 22
2026-03-10T06:22:44.959 INFO:tasks.workunit.client.0.vm04.stdout:2/150: creat d1/db/d20/f2e x:0 0 0
2026-03-10T06:22:44.960 INFO:tasks.workunit.client.0.vm04.stdout:6/113: dwrite d2/f7 [4194304,4194304] 0
2026-03-10T06:22:44.963 INFO:tasks.workunit.client.0.vm04.stdout:7/98: unlink d4/df/d12/c1b 0
2026-03-10T06:22:44.964 INFO:tasks.workunit.client.0.vm04.stdout:7/99: stat d4/df/d12 0
2026-03-10T06:22:44.968 INFO:tasks.workunit.client.0.vm04.stdout:5/89: dwrite d4/ff [0,4194304] 0
2026-03-10T06:22:44.979 INFO:tasks.workunit.client.0.vm04.stdout:8/131: dread fd [0,4194304] 0
2026-03-10T06:22:44.980 INFO:tasks.workunit.client.0.vm04.stdout:7/100: dwrite d4/df/d12/f14 [0,4194304] 0
2026-03-10T06:22:45.004 INFO:tasks.workunit.client.0.vm04.stdout:4/79: creat d2/d8/f1a x:0 0 0
2026-03-10T06:22:45.004 INFO:tasks.workunit.client.0.vm04.stdout:4/80: rename d2 to d2/d16/d1b 22
2026-03-10T06:22:45.010 INFO:tasks.workunit.client.0.vm04.stdout:1/128: mknod d0/c2f 0
2026-03-10T06:22:45.010 INFO:tasks.workunit.client.0.vm04.stdout:9/139: write d2/f1e [1714097,71951] 0
2026-03-10T06:22:45.011 INFO:tasks.workunit.client.0.vm04.stdout:1/129: write d0/f5 [2216900,125923] 0
2026-03-10T06:22:45.016 INFO:tasks.workunit.client.0.vm04.stdout:3/131: creat d4/da/df/d13/f2b x:0 0 0
2026-03-10T06:22:45.017 INFO:tasks.workunit.client.0.vm04.stdout:0/102: write d0/d5/f1c [398375,3280] 0
2026-03-10T06:22:45.023 INFO:tasks.workunit.client.0.vm04.stdout:3/132: dwrite d4/da/df/d13/f23 [0,4194304] 0
2026-03-10T06:22:45.031 INFO:tasks.workunit.client.0.vm04.stdout:2/151: fdatasync d1/db/f27 0
2026-03-10T06:22:45.045 INFO:tasks.workunit.client.0.vm04.stdout:0/103: fdatasync d0/f17 0
2026-03-10T06:22:45.049 INFO:tasks.workunit.client.0.vm04.stdout:5/90: mknod d4/c24 0
2026-03-10T06:22:45.049 INFO:tasks.workunit.client.0.vm04.stdout:3/133: mkdir d4/da/df/d13/d21/d2c 0
2026-03-10T06:22:45.050 INFO:tasks.workunit.client.0.vm04.stdout:8/132: creat df/d15/d2b/f2f x:0 0 0
2026-03-10T06:22:45.050 INFO:tasks.workunit.client.0.vm04.stdout:3/134: fdatasync d4/d6/dc/f1f 0
2026-03-10T06:22:45.053 INFO:tasks.workunit.client.0.vm04.stdout:7/101: creat d4/df/d12/d21/f26 x:0 0 0
2026-03-10T06:22:45.057 INFO:tasks.workunit.client.0.vm04.stdout:4/81: truncate d2/d8/dc/f15 1248396 0
2026-03-10T06:22:45.057 INFO:tasks.workunit.client.0.vm04.stdout:4/82: chown d2/d16 437899012 1
2026-03-10T06:22:45.064 INFO:tasks.workunit.client.0.vm04.stdout:2/152: rmdir d1/df/d11/d18 39
2026-03-10T06:22:45.072 INFO:tasks.workunit.client.0.vm04.stdout:1/130: rename d0/d8/l1d to d0/d3/l30 0
2026-03-10T06:22:45.084 INFO:tasks.workunit.client.0.vm04.stdout:6/114: creat d2/f28 x:0 0 0
2026-03-10T06:22:45.085 INFO:tasks.workunit.client.0.vm04.stdout:6/115: fdatasync d2/f7 0
2026-03-10T06:22:45.086 INFO:tasks.workunit.client.0.vm04.stdout:6/116: chown d2/d8 27532 1
2026-03-10T06:22:45.093 INFO:tasks.workunit.client.0.vm04.stdout:9/140: truncate d2/f1e 1276982 0
2026-03-10T06:22:45.094 INFO:tasks.workunit.client.0.vm04.stdout:8/133: mkdir df/d20/d25/d30 0
2026-03-10T06:22:45.104 INFO:tasks.workunit.client.0.vm04.stdout:5/91: dread d4/d11/f17 [0,4194304] 0
2026-03-10T06:22:45.117 INFO:tasks.workunit.client.0.vm04.stdout:4/83: creat d2/d8/f1c x:0 0 0
2026-03-10T06:22:45.118 INFO:tasks.workunit.client.0.vm04.stdout:9/141: dread d2/d23/d24/f29 [0,4194304] 0
2026-03-10T06:22:45.118 INFO:tasks.workunit.client.0.vm04.stdout:9/142: write d2/d3/f12 [3966306,39245] 0
2026-03-10T06:22:45.118 INFO:tasks.workunit.client.0.vm04.stdout:0/104: rename d0/d5/l18 to d0/d1a/l2a 0
2026-03-10T06:22:45.118 INFO:tasks.workunit.client.0.vm04.stdout:1/131: symlink d0/d8/l31 0
2026-03-10T06:22:45.125 INFO:tasks.workunit.client.0.vm04.stdout:8/134: dwrite df/f1d [0,4194304] 0
2026-03-10T06:22:45.126 INFO:tasks.workunit.client.0.vm04.stdout:8/135: write df/f27 [79849,30691] 0
2026-03-10T06:22:45.126 INFO:tasks.workunit.client.0.vm04.stdout:8/136: chown df/d20/f21 16262 1
2026-03-10T06:22:45.126 INFO:tasks.workunit.client.0.vm04.stdout:5/92: mknod d4/d11/c25 0
2026-03-10T06:22:45.127 INFO:tasks.workunit.client.0.vm04.stdout:6/117: sync
2026-03-10T06:22:45.134 INFO:tasks.workunit.client.0.vm04.stdout:0/105: dwrite d0/d5/d25/f22 [0,4194304] 0
2026-03-10T06:22:45.137 INFO:tasks.workunit.client.0.vm04.stdout:3/135: creat d4/f2d x:0 0 0
2026-03-10T06:22:45.147 INFO:tasks.workunit.client.0.vm04.stdout:7/102: creat d4/df/d12/d13/f27 x:0 0 0
2026-03-10T06:22:45.158 INFO:tasks.workunit.client.0.vm04.stdout:8/137: truncate f7 1527898 0
2026-03-10T06:22:45.158 INFO:tasks.workunit.client.0.vm04.stdout:9/143: dwrite d2/d8/d14/f28 [0,4194304] 0
2026-03-10T06:22:45.163 INFO:tasks.workunit.client.0.vm04.stdout:4/84: creat d2/f1d x:0 0 0
2026-03-10T06:22:45.167 INFO:tasks.workunit.client.0.vm04.stdout:2/153: write d1/f7 [1105175,67876] 0
2026-03-10T06:22:45.174 INFO:tasks.workunit.client.0.vm04.stdout:7/103: dwrite d4/f16 [0,4194304] 0
2026-03-10T06:22:45.174 INFO:tasks.workunit.client.0.vm04.stdout:7/104: chown d4 4313512 1
2026-03-10T06:22:45.175 INFO:tasks.workunit.client.0.vm04.stdout:4/85: dwrite d2/d8/dc/f18 [0,4194304] 0
2026-03-10T06:22:45.179 INFO:tasks.workunit.client.0.vm04.stdout:2/154: dwrite d1/db/fe [0,4194304] 0
2026-03-10T06:22:45.183 INFO:tasks.workunit.client.0.vm04.stdout:0/106: mkdir d0/d5/d25/dd/d2b 0
2026-03-10T06:22:45.184 INFO:tasks.workunit.client.0.vm04.stdout:6/118: mknod d2/c29 0
2026-03-10T06:22:45.222 INFO:tasks.workunit.client.0.vm04.stdout:0/107: sync
2026-03-10T06:22:45.223 INFO:tasks.workunit.client.0.vm04.stdout:0/108: read - d0/d5/f1f zero size
2026-03-10T06:22:45.241 INFO:tasks.workunit.client.0.vm04.stdout:0/109: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:45.258 INFO:tasks.workunit.client.0.vm04.stdout:4/86: mknod d2/d8/c1e 0
2026-03-10T06:22:45.271 INFO:tasks.workunit.client.0.vm04.stdout:5/93: creat d4/f26 x:0 0 0
2026-03-10T06:22:45.287 INFO:tasks.workunit.client.0.vm04.stdout:9/144: fsync d2/f1c 0
2026-03-10T06:22:45.287 INFO:tasks.workunit.client.0.vm04.stdout:1/132: write d0/f4 [737358,23905] 0
2026-03-10T06:22:45.290 INFO:tasks.workunit.client.0.vm04.stdout:1/133: write d0/d3/f28 [209148,93277] 0
2026-03-10T06:22:45.305 INFO:tasks.workunit.client.0.vm04.stdout:3/136: dwrite f0 [4194304,4194304] 0
2026-03-10T06:22:45.306 INFO:tasks.workunit.client.0.vm04.stdout:3/137: chown d4/c5 20152160 1
2026-03-10T06:22:45.307 INFO:tasks.workunit.client.0.vm04.stdout:3/138: stat f1 0
2026-03-10T06:22:45.308 INFO:tasks.workunit.client.0.vm04.stdout:3/139: read d4/da/df/d13/f1d [61596,20496] 0
2026-03-10T06:22:45.565 INFO:tasks.workunit.client.0.vm04.stdout:6/119: symlink d2/l2a 0
2026-03-10T06:22:45.566 INFO:tasks.workunit.client.0.vm04.stdout:9/145: creat d2/d9/d11/f35 x:0 0 0
2026-03-10T06:22:45.567 INFO:tasks.workunit.client.0.vm04.stdout:6/120: chown d2/d4/c22 69 1
2026-03-10T06:22:45.567 INFO:tasks.workunit.client.0.vm04.stdout:6/121: truncate d2/f10 4982490 0
2026-03-10T06:22:45.572 INFO:tasks.workunit.client.0.vm04.stdout:1/134: rmdir d0/d8 39
2026-03-10T06:22:45.578 INFO:tasks.workunit.client.0.vm04.stdout:3/140: creat d4/da/df/d13/f2e x:0 0 0
2026-03-10T06:22:45.579 INFO:tasks.workunit.client.0.vm04.stdout:7/105: mkdir d4/df/d12/d13/d25/d28 0
2026-03-10T06:22:45.581 INFO:tasks.workunit.client.0.vm04.stdout:7/106: stat d4/df/d12/d13/d25 0
2026-03-10T06:22:45.582 INFO:tasks.workunit.client.0.vm04.stdout:4/87: getdents d2/d16 0
2026-03-10T06:22:45.582 INFO:tasks.workunit.client.0.vm04.stdout:7/107: chown d4/df/d12 29 1
2026-03-10T06:22:45.582 INFO:tasks.workunit.client.0.vm04.stdout:5/94: mknod d4/c27 0
2026-03-10T06:22:45.582 INFO:tasks.workunit.client.0.vm04.stdout:7/108: chown d4/f16 4056034 1
2026-03-10T06:22:45.584 INFO:tasks.workunit.client.0.vm04.stdout:9/146: creat d2/d3/f36 x:0 0 0
2026-03-10T06:22:45.584 INFO:tasks.workunit.client.0.vm04.stdout:6/122: creat d2/d8/d23/d1a/f2b x:0 0 0
2026-03-10T06:22:45.591 INFO:tasks.workunit.client.0.vm04.stdout:3/141: symlink d4/da/df/d13/d21/l2f 0
2026-03-10T06:22:45.592 INFO:tasks.workunit.client.0.vm04.stdout:5/95: symlink d4/d11/l28 0
2026-03-10T06:22:45.594 INFO:tasks.workunit.client.0.vm04.stdout:7/109: creat d4/df/f29 x:0 0 0
2026-03-10T06:22:45.612 INFO:tasks.workunit.client.0.vm04.stdout:6/123: rename f1 to d2/d4/f2c 0
2026-03-10T06:22:45.620 INFO:tasks.workunit.client.0.vm04.stdout:9/147: dread d2/d9/f2e [0,4194304] 0
2026-03-10T06:22:45.620 INFO:tasks.workunit.client.0.vm04.stdout:2/155: getdents d1/df/d11 0
2026-03-10T06:22:45.621 INFO:tasks.workunit.client.0.vm04.stdout:9/148: chown d2/c16 6104 1
2026-03-10T06:22:45.636 INFO:tasks.workunit.client.0.vm04.stdout:4/88: link d2/f1d d2/d8/f1f 0
2026-03-10T06:22:45.636 INFO:tasks.workunit.client.0.vm04.stdout:1/135: rename d0/f22 to d0/d8/f32 0
2026-03-10T06:22:45.637 INFO:tasks.workunit.client.0.vm04.stdout:4/89: truncate d2/d8/dc/f17 197055 0
2026-03-10T06:22:45.637 INFO:tasks.workunit.client.0.vm04.stdout:4/90: dread - d2/f1d zero size
2026-03-10T06:22:45.645 INFO:tasks.workunit.client.0.vm04.stdout:1/136: dread d0/d3/f19 [0,4194304] 0
2026-03-10T06:22:45.659 INFO:tasks.workunit.client.0.vm04.stdout:7/110: creat d4/df/d12/d21/f2a x:0 0 0
2026-03-10T06:22:45.659 INFO:tasks.workunit.client.0.vm04.stdout:2/156: rename d1/f7 to d1/db/d20/f2f 0
2026-03-10T06:22:45.665 INFO:tasks.workunit.client.0.vm04.stdout:6/124: sync
2026-03-10T06:22:45.666 INFO:tasks.workunit.client.0.vm04.stdout:2/157: sync
2026-03-10T06:22:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:45 vm04.local ceph-mon[51058]: pgmap v21: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 36 MiB/s rd, 87 MiB/s wr, 246 op/s
2026-03-10T06:22:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:45 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:45 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:45 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:45 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.687 INFO:tasks.workunit.client.0.vm04.stdout:7/111: symlink d4/df/d12/d13/d25/l2b 0
2026-03-10T06:22:45.688 INFO:tasks.workunit.client.0.vm04.stdout:1/137: rename d0/f1f to d0/d3/f33 0
2026-03-10T06:22:45.689 INFO:tasks.workunit.client.0.vm04.stdout:4/91: creat d2/d16/f20 x:0 0 0
2026-03-10T06:22:45.689 INFO:tasks.workunit.client.0.vm04.stdout:3/142: getdents d4/d6 0
2026-03-10T06:22:45.691 INFO:tasks.workunit.client.0.vm04.stdout:6/125: mkdir d2/d4/d2d 0
2026-03-10T06:22:45.692 INFO:tasks.workunit.client.0.vm04.stdout:6/126: chown d2/d8/d23/f27 80530004 1
2026-03-10T06:22:45.701 INFO:tasks.workunit.client.0.vm04.stdout:1/138: dwrite d0/d3/f33 [0,4194304] 0
2026-03-10T06:22:45.706 INFO:tasks.workunit.client.0.vm04.stdout:1/139: dread d0/d3/f33 [0,4194304] 0
2026-03-10T06:22:45.713 INFO:tasks.workunit.client.0.vm04.stdout:4/92: creat d2/d8/dc/f21 x:0 0 0
2026-03-10T06:22:45.713 INFO:tasks.workunit.client.0.vm04.stdout:4/93: readlink d2/l9 0
2026-03-10T06:22:45.717 INFO:tasks.workunit.client.0.vm04.stdout:3/143: creat d4/d6/f30 x:0 0 0
2026-03-10T06:22:45.719 INFO:tasks.workunit.client.0.vm04.stdout:1/140: sync
2026-03-10T06:22:45.719 INFO:tasks.workunit.client.0.vm04.stdout:1/141: chown d0/f2e 27677 1
2026-03-10T06:22:45.725 INFO:tasks.workunit.client.0.vm04.stdout:7/112: mknod d4/df/d12/d13/d25/d28/c2c 0
2026-03-10T06:22:45.726 INFO:tasks.workunit.client.0.vm04.stdout:8/138: link f7 df/d20/d25/f31 0
2026-03-10T06:22:45.731 INFO:tasks.workunit.client.0.vm04.stdout:4/94: dread d2/d8/dc/f15 [0,4194304] 0
2026-03-10T06:22:45.732 INFO:tasks.workunit.client.0.vm04.stdout:6/127: symlink d2/d4/d2d/l2e 0
2026-03-10T06:22:45.736 INFO:tasks.workunit.client.0.vm04.stdout:3/144: creat d4/da/df/d11/f31 x:0 0 0
2026-03-10T06:22:45.737 INFO:tasks.workunit.client.0.vm04.stdout:3/145: write d4/da/df/f1b [5061154,40407] 0
2026-03-10T06:22:45.737 INFO:tasks.workunit.client.0.vm04.stdout:6/128: dwrite d2/f14 [4194304,4194304] 0
2026-03-10T06:22:45.741 INFO:tasks.workunit.client.0.vm04.stdout:3/146: chown d4/da/df/d13/f23 53505 1
2026-03-10T06:22:45.741 INFO:tasks.workunit.client.0.vm04.stdout:3/147: write f0 [7156017,54266] 0
2026-03-10T06:22:45.742 INFO:tasks.workunit.client.0.vm04.stdout:0/110: truncate d0/d5/d25/f22 1899843 0
2026-03-10T06:22:45.745 INFO:tasks.workunit.client.0.vm04.stdout:1/142: creat d0/d3/f34 x:0 0 0
2026-03-10T06:22:45.756 INFO:tasks.workunit.client.0.vm04.stdout:4/95: dwrite d2/d8/f1f [0,4194304] 0
2026-03-10T06:22:45.762 INFO:tasks.workunit.client.0.vm04.stdout:6/129: mkdir d2/d8/d23/d2f 0
2026-03-10T06:22:45.764 INFO:tasks.workunit.client.0.vm04.stdout:3/148: mkdir d4/da/df/d13/d21/d32 0
2026-03-10T06:22:45.765 INFO:tasks.workunit.client.0.vm04.stdout:0/111: symlink d0/l2c 0
2026-03-10T06:22:45.768 INFO:tasks.workunit.client.0.vm04.stdout:8/139: mknod df/d15/d29/c32 0
2026-03-10T06:22:45.772 INFO:tasks.workunit.client.0.vm04.stdout:6/130: dread d2/d4/fa [0,4194304] 0
2026-03-10T06:22:45.774 INFO:tasks.workunit.client.0.vm04.stdout:1/143: link d0/d3/f16 d0/d3/f35 0
2026-03-10T06:22:45.774 INFO:tasks.workunit.client.0.vm04.stdout:1/144: chown d0/c2 57510804 1
2026-03-10T06:22:45.775 INFO:tasks.workunit.client.0.vm04.stdout:8/140: creat df/d15/d2b/f33 x:0 0 0
2026-03-10T06:22:45.775 INFO:tasks.workunit.client.0.vm04.stdout:8/141: dread - df/d15/f1b zero size
2026-03-10T06:22:45.780 INFO:tasks.workunit.client.0.vm04.stdout:5/96: write d4/d6/f1e [1227919,129479] 0
2026-03-10T06:22:45.782 INFO:tasks.workunit.client.0.vm04.stdout:9/149: dwrite d2/d23/d24/f29 [4194304,4194304] 0
2026-03-10T06:22:45.794 INFO:tasks.workunit.client.0.vm04.stdout:0/112: symlink d0/d5/d25/dd/d2b/l2d 0
2026-03-10T06:22:45.795 INFO:tasks.workunit.client.0.vm04.stdout:0/113: truncate d0/f1b 283511 0
2026-03-10T06:22:45.798 INFO:tasks.workunit.client.0.vm04.stdout:0/114: dwrite d0/d5/d25/dd/d1d/f26 [0,4194304] 0
2026-03-10T06:22:45.802 INFO:tasks.workunit.client.0.vm04.stdout:8/142: mknod df/d15/c34 0
2026-03-10T06:22:45.808 INFO:tasks.workunit.client.0.vm04.stdout:9/150: rmdir d2/d23 39
2026-03-10T06:22:45.809 INFO:tasks.workunit.client.0.vm04.stdout:4/96: rename d2/d8/cd to d2/c22 0
2026-03-10T06:22:45.811 INFO:tasks.workunit.client.0.vm04.stdout:4/97: dread d2/d8/f1f [0,4194304] 0
2026-03-10T06:22:45.818 INFO:tasks.workunit.client.0.vm04.stdout:2/158: write d1/f10 [819303,32451] 0
2026-03-10T06:22:45.819 INFO:tasks.workunit.client.0.vm04.stdout:8/143: fsync df/d15/f1e 0
2026-03-10T06:22:45.824 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:45 vm06.local ceph-mon[58974]: pgmap v21: 65 pgs: 65 active+clean; 2.2 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 36 MiB/s rd, 87 MiB/s wr, 246 op/s
2026-03-10T06:22:45.824 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:45 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.824 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:45 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.824 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:45 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.824 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:45 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:22:45.824 INFO:tasks.workunit.client.0.vm04.stdout:9/151: chown d2/d23/d24/f29 32 1
2026-03-10T06:22:45.824 INFO:tasks.workunit.client.0.vm04.stdout:3/149: rename d4/d6/dc/c17 to d4/da/df/c33 0
2026-03-10T06:22:45.826 INFO:tasks.workunit.client.0.vm04.stdout:4/98: creat d2/d8/f23 x:0 0 0
2026-03-10T06:22:45.837 INFO:tasks.workunit.client.0.vm04.stdout:7/113: dwrite d4/df/d12/f18 [0,4194304] 0
2026-03-10T06:22:45.838 INFO:tasks.workunit.client.0.vm04.stdout:2/159: mknod d1/df/d11/d18/c30 0
2026-03-10T06:22:45.850 INFO:tasks.workunit.client.0.vm04.stdout:9/152: dwrite d2/d3/f2a [0,4194304] 0
2026-03-10T06:22:45.852 INFO:tasks.workunit.client.0.vm04.stdout:6/131: rename d2/d8/d23/d1a to d2/d4/d2d/d30 0
2026-03-10T06:22:45.857 INFO:tasks.workunit.client.0.vm04.stdout:5/97: dwrite d4/d11/f18 [0,4194304] 0
2026-03-10T06:22:45.860 INFO:tasks.workunit.client.0.vm04.stdout:5/98: dread d4/d11/f1f [0,4194304] 0
2026-03-10T06:22:45.867 INFO:tasks.workunit.client.0.vm04.stdout:4/99: rmdir d2 39
2026-03-10T06:22:45.869 INFO:tasks.workunit.client.0.vm04.stdout:1/145: getdents d0/d3 0
2026-03-10T06:22:45.869 INFO:tasks.workunit.client.0.vm04.stdout:1/146: truncate d0/d3/f28 483788 0
2026-03-10T06:22:45.872 INFO:tasks.workunit.client.0.vm04.stdout:2/160: creat d1/db/d20/f31 x:0 0 0
2026-03-10T06:22:45.882 INFO:tasks.workunit.client.0.vm04.stdout:0/115: rename d0/lf to d0/d5/d25/dd/d1d/l2e 0
2026-03-10T06:22:45.883 INFO:tasks.workunit.client.0.vm04.stdout:3/150: rename d4/da to d4/da/df/d13/d21/d34 22
2026-03-10T06:22:45.883 INFO:tasks.workunit.client.0.vm04.stdout:3/151: fsync d4/da/df/d13/f2e 0
2026-03-10T06:22:45.884 INFO:tasks.workunit.client.0.vm04.stdout:5/99: dread d4/d6/f8 [0,4194304] 0
2026-03-10T06:22:45.885 INFO:tasks.workunit.client.0.vm04.stdout:5/100: fdatasync d4/d6/f20 0
2026-03-10T06:22:45.886 INFO:tasks.workunit.client.0.vm04.stdout:1/147: creat d0/d3/f36 x:0 0 0
2026-03-10T06:22:45.890 INFO:tasks.workunit.client.0.vm04.stdout:6/132: link d2/d8/d23/f27 d2/d4/f31 0
2026-03-10T06:22:45.894 INFO:tasks.workunit.client.0.vm04.stdout:1/148: creat d0/d3/f37 x:0 0 0
2026-03-10T06:22:45.901 INFO:tasks.workunit.client.0.vm04.stdout:9/153: rmdir d2/d32 0
2026-03-10T06:22:45.901 INFO:tasks.workunit.client.0.vm04.stdout:9/154: readlink d2/d3/d18/l2f 0
2026-03-10T06:22:45.911 INFO:tasks.workunit.client.0.vm04.stdout:3/152: getdents d4/da/df/d13/d21/d32 0
2026-03-10T06:22:45.913 INFO:tasks.workunit.client.0.vm04.stdout:3/153: dread f1 [0,4194304] 0
2026-03-10T06:22:45.920 INFO:tasks.workunit.client.0.vm04.stdout:5/101: mknod d4/c29 0
2026-03-10T06:22:45.921 INFO:tasks.workunit.client.0.vm04.stdout:5/102: fsync d4/f19 0
2026-03-10T06:22:45.921 INFO:tasks.workunit.client.0.vm04.stdout:5/103: read - d4/f21 zero size
2026-03-10T06:22:45.933 INFO:tasks.workunit.client.0.vm04.stdout:6/133: dwrite d2/f7 [0,4194304] 0
2026-03-10T06:22:45.937 INFO:tasks.workunit.client.0.vm04.stdout:6/134: dread d2/d8/f9 [0,4194304] 0
2026-03-10T06:22:45.949 INFO:tasks.workunit.client.0.vm04.stdout:8/144: dwrite f7 [0,4194304] 0
2026-03-10T06:22:45.950 INFO:tasks.workunit.client.0.vm04.stdout:8/145: fdatasync df/d20/d25/f2a 0
2026-03-10T06:22:45.953 INFO:tasks.workunit.client.0.vm04.stdout:2/161: write d1/db/f27 [615830,91659] 0
2026-03-10T06:22:45.954 INFO:tasks.workunit.client.0.vm04.stdout:2/162: write d1/df/f22 [606016,82724] 0
2026-03-10T06:22:45.954 INFO:tasks.workunit.client.0.vm04.stdout:2/163: read d1/db/f1e [8145760,123959] 0
2026-03-10T06:22:45.955 INFO:tasks.workunit.client.0.vm04.stdout:2/164: chown d1/df/f24 2 1
2026-03-10T06:22:45.958 INFO:tasks.workunit.client.0.vm04.stdout:1/149: creat d0/d8/f38 x:0 0 0
2026-03-10T06:22:45.959 INFO:tasks.workunit.client.0.vm04.stdout:7/114: link d4/c24 d4/df/d12/c2d 0
2026-03-10T06:22:45.960 INFO:tasks.workunit.client.0.vm04.stdout:9/155: creat d2/d23/d24/f37 x:0 0 0
2026-03-10T06:22:45.962 INFO:tasks.workunit.client.0.vm04.stdout:7/115: dwrite d4/df/f29 [0,4194304] 0
2026-03-10T06:22:45.967 INFO:tasks.workunit.client.0.vm04.stdout:7/116: chown d4/df/d12/d13/d25/l2b 12068 1
2026-03-10T06:22:45.967 INFO:tasks.workunit.client.0.vm04.stdout:0/116: link d0/f14 d0/d1a/f2f 0
2026-03-10T06:22:45.970 INFO:tasks.workunit.client.0.vm04.stdout:5/104: write d4/d6/f23 [1018608,4079] 0
2026-03-10T06:22:45.974 INFO:tasks.workunit.client.0.vm04.stdout:6/135: creat d2/d4/d2d/d30/f32 x:0 0 0
2026-03-10T06:22:45.987 INFO:tasks.workunit.client.0.vm04.stdout:8/146: rename f7 to df/d20/d25/f35 0
2026-03-10T06:22:45.987 INFO:tasks.workunit.client.0.vm04.stdout:8/147: fdatasync df/f27 0
2026-03-10T06:22:45.987 INFO:tasks.workunit.client.0.vm04.stdout:8/148: chown f6 162 1
2026-03-10T06:22:45.990 INFO:tasks.workunit.client.0.vm04.stdout:8/149: dwrite df/f27 [0,4194304] 0
2026-03-10T06:22:46.000 INFO:tasks.workunit.client.0.vm04.stdout:1/150: rename d0/d3/f36 to d0/d3/f39 0
2026-03-10T06:22:46.016 INFO:tasks.workunit.client.0.vm04.stdout:1/151: dwrite d0/d3/f34 [0,4194304] 0
2026-03-10T06:22:46.016 INFO:tasks.workunit.client.0.vm04.stdout:1/152: write d0/f2e [559895,33418] 0
2026-03-10T06:22:46.016 INFO:tasks.workunit.client.0.vm04.stdout:0/117: creat d0/d5/d25/dd/d1d/f30 x:0 0 0
2026-03-10T06:22:46.016 INFO:tasks.workunit.client.0.vm04.stdout:1/153: readlink d0/d3/l1e 0
2026-03-10T06:22:46.018 INFO:tasks.workunit.client.0.vm04.stdout:5/105: rmdir d4/d11 39
2026-03-10T06:22:46.026 INFO:tasks.workunit.client.0.vm04.stdout:6/136: sync
2026-03-10T06:22:46.026 INFO:tasks.workunit.client.0.vm04.stdout:4/100: link d2/d8/c1e d2/c24 0
2026-03-10T06:22:46.027 INFO:tasks.workunit.client.0.vm04.stdout:4/101: dread - d2/d8/f1a zero size
2026-03-10T06:22:46.029 INFO:tasks.workunit.client.0.vm04.stdout:4/102: dread d2/d8/f1f [0,4194304] 0
2026-03-10T06:22:46.040 INFO:tasks.workunit.client.0.vm04.stdout:2/165: creat d1/df/d2c/f32 x:0 0 0
2026-03-10T06:22:46.040 INFO:tasks.workunit.client.0.vm04.stdout:2/166: stat d1/df/d11/d18/c30 0
2026-03-10T06:22:46.048 INFO:tasks.workunit.client.0.vm04.stdout:7/117: link d4/fa d4/df/d12/d13/d25/f2e 0
2026-03-10T06:22:46.052 INFO:tasks.workunit.client.0.vm04.stdout:7/118: fsync d4/df/d12/d21/f26 0
2026-03-10T06:22:46.052 INFO:tasks.workunit.client.0.vm04.stdout:1/154: mknod d0/d8/c3a 0
2026-03-10T06:22:46.053 INFO:tasks.workunit.client.0.vm04.stdout:1/155: stat d0/d3/l1e 0
2026-03-10T06:22:46.065 INFO:tasks.workunit.client.0.vm04.stdout:6/137: symlink d2/d4/d2d/d30/l33 0
2026-03-10T06:22:46.065 INFO:tasks.workunit.client.0.vm04.stdout:1/156: dread d0/f4 [0,4194304] 0
2026-03-10T06:22:46.066 INFO:tasks.workunit.client.0.vm04.stdout:6/138: write d2/d8/f11 [547346,56150] 0
2026-03-10T06:22:46.067 INFO:tasks.workunit.client.0.vm04.stdout:3/154: truncate d4/d6/f12 22678 0
2026-03-10T06:22:46.067 INFO:tasks.workunit.client.0.vm04.stdout:6/139: fsync d2/d4/f2c 0
2026-03-10T06:22:46.068 INFO:tasks.workunit.client.0.vm04.stdout:3/155: chown d4/c14 5820333 1
2026-03-10T06:22:46.070 INFO:tasks.workunit.client.0.vm04.stdout:8/150: write df/d15/f1e [934473,1712] 0
2026-03-10T06:22:46.071 INFO:tasks.workunit.client.0.vm04.stdout:2/167: creat d1/df/d2c/f33 x:0 0 0
2026-03-10T06:22:46.075 INFO:tasks.workunit.client.0.vm04.stdout:0/118: fsync d0/d5/d25/f22 0
2026-03-10T06:22:46.075 INFO:tasks.workunit.client.0.vm04.stdout:7/119: creat d4/df/d12/d13/d25/f2f x:0 0 0
2026-03-10T06:22:46.079 INFO:tasks.workunit.client.0.vm04.stdout:5/106: mkdir d4/d11/d2a 0
2026-03-10T06:22:46.080 INFO:tasks.workunit.client.0.vm04.stdout:8/151: dwrite df/d15/d2b/f2f [0,4194304] 0
2026-03-10T06:22:46.081 INFO:tasks.workunit.client.0.vm04.stdout:5/107: write d4/f19 [381550,80878] 0
2026-03-10T06:22:46.081 INFO:tasks.workunit.client.0.vm04.stdout:1/157: creat d0/d3/f3b x:0 0 0
2026-03-10T06:22:46.082 INFO:tasks.workunit.client.0.vm04.stdout:6/140: dwrite d2/d4/d2d/d30/f2b [0,4194304] 0
2026-03-10T06:22:46.083 INFO:tasks.workunit.client.0.vm04.stdout:1/158: fsync d0/f29 0
2026-03-10T06:22:46.085 INFO:tasks.workunit.client.0.vm04.stdout:9/156: getdents d2/d3 0
2026-03-10T06:22:46.085 INFO:tasks.workunit.client.0.vm04.stdout:9/157: readlink d2/d3/l1f 0
2026-03-10T06:22:46.085 INFO:tasks.workunit.client.0.vm04.stdout:9/158: read d2/d3/f4 [833404,49902] 0
2026-03-10T06:22:46.086 INFO:tasks.workunit.client.0.vm04.stdout:8/152: write df/d15/d2b/f2f [358052,62871] 0
2026-03-10T06:22:46.087 INFO:tasks.workunit.client.0.vm04.stdout:9/159: fdatasync d2/d9/d11/f25 0
2026-03-10T06:22:46.091 INFO:tasks.workunit.client.0.vm04.stdout:3/156: rename d4/da/df/d13/d21/l2f to d4/da/df/d11/l35 0
2026-03-10T06:22:46.091 INFO:tasks.workunit.client.0.vm04.stdout:2/168: symlink d1/df/d11/l34 0
2026-03-10T06:22:46.091 INFO:tasks.workunit.client.0.vm04.stdout:8/153: dread - df/f17 zero size
2026-03-10T06:22:46.092 INFO:tasks.workunit.client.0.vm04.stdout:8/154: fsync df/f17 0
2026-03-10T06:22:46.092 INFO:tasks.workunit.client.0.vm04.stdout:5/108: sync
2026-03-10T06:22:46.092 INFO:tasks.workunit.client.0.vm04.stdout:7/120: dread d4/fa [0,4194304] 0
2026-03-10T06:22:46.092 INFO:tasks.workunit.client.0.vm04.stdout:8/155: fdatasync df/d20/f22 0
2026-03-10T06:22:46.093 INFO:tasks.workunit.client.0.vm04.stdout:8/156: readlink l0 0
2026-03-10T06:22:46.095 INFO:tasks.workunit.client.0.vm04.stdout:5/109: truncate d4/f26 241724 0
2026-03-10T06:22:46.097 INFO:tasks.workunit.client.0.vm04.stdout:0/119: mkdir d0/d5/d25/dd/d2b/d31 0
2026-03-10T06:22:46.100 INFO:tasks.workunit.client.0.vm04.stdout:6/141: stat d2/d8/c19 0
2026-03-10T06:22:46.100 INFO:tasks.workunit.client.0.vm04.stdout:6/142: truncate d2/f16 850734 0
2026-03-10T06:22:46.115 INFO:tasks.workunit.client.0.vm04.stdout:6/143: dread d2/f10 [0,4194304] 0
2026-03-10T06:22:46.122 INFO:tasks.workunit.client.0.vm04.stdout:6/144: readlink d2/l2a 0
2026-03-10T06:22:46.123 INFO:tasks.workunit.client.0.vm04.stdout:6/145: readlink d2/l2a 0
2026-03-10T06:22:46.126 INFO:tasks.workunit.client.0.vm04.stdout:8/157: mknod df/d15/d2b/c36 0
2026-03-10T06:22:46.126 INFO:tasks.workunit.client.0.vm04.stdout:7/121: fsync d4/df/d12/d13/d25/f2e 0
2026-03-10T06:22:46.128 INFO:tasks.workunit.client.0.vm04.stdout:6/146: write d2/d4/f2c [10968901,18764] 0
2026-03-10T06:22:46.128 INFO:tasks.workunit.client.0.vm04.stdout:4/103: getdents d2/d16 0
2026-03-10T06:22:46.128 INFO:tasks.workunit.client.0.vm04.stdout:0/120: symlink d0/d5/d25/dd/d1d/l32 0
2026-03-10T06:22:46.132 INFO:tasks.workunit.client.0.vm04.stdout:8/158: dwrite fe [0,4194304] 0
2026-03-10T06:22:46.139 INFO:tasks.workunit.client.0.vm04.stdout:8/159: stat df/d20 0
2026-03-10T06:22:46.140 INFO:tasks.workunit.client.0.vm04.stdout:4/104: sync
2026-03-10T06:22:46.148 INFO:tasks.workunit.client.0.vm04.stdout:5/110: mknod d4/d11/d2a/c2b 0
2026-03-10T06:22:46.152 INFO:tasks.workunit.client.0.vm04.stdout:8/160: dread df/d20/d25/f31 [0,4194304] 0
2026-03-10T06:22:46.162 INFO:tasks.workunit.client.0.vm04.stdout:9/160: write d2/d9/d11/f2d [4188539,38909] 0
2026-03-10T06:22:46.163 INFO:tasks.workunit.client.0.vm04.stdout:1/159: creat d0/f3c x:0 0 0
2026-03-10T06:22:46.164 INFO:tasks.workunit.client.0.vm04.stdout:9/161: dread - d2/d23/d24/f2b zero size
2026-03-10T06:22:46.170 INFO:tasks.workunit.client.0.vm04.stdout:6/147: unlink d2/d4/f15 0
2026-03-10T06:22:46.171 INFO:tasks.workunit.client.0.vm04.stdout:4/105: rename d2/d8/f1c to d2/d8/f25 0
2026-03-10T06:22:46.171 INFO:tasks.workunit.client.0.vm04.stdout:6/148: chown d2/c29 197831 1
2026-03-10T06:22:46.171 INFO:tasks.workunit.client.0.vm04.stdout:0/121: dread - d0/f17 zero size
2026-03-10T06:22:46.171 INFO:tasks.workunit.client.0.vm04.stdout:0/122: readlink d0/l2c 0
2026-03-10T06:22:46.173 INFO:tasks.workunit.client.0.vm04.stdout:0/123: write d0/d5/d25/dd/d1d/f24 [162799,16804] 0
2026-03-10T06:22:46.175 INFO:tasks.workunit.client.0.vm04.stdout:9/162: dread f0 [0,4194304] 0
2026-03-10T06:22:46.176 INFO:tasks.workunit.client.0.vm04.stdout:9/163: write d2/d23/d24/f37 [1012895,10794] 0
2026-03-10T06:22:46.182 INFO:tasks.workunit.client.0.vm04.stdout:1/160: mknod d0/d8/c3d 0
2026-03-10T06:22:46.183 INFO:tasks.workunit.client.0.vm04.stdout:6/149: dwrite d2/f7 [8388608,4194304] 0
2026-03-10T06:22:46.187 INFO:tasks.workunit.client.0.vm04.stdout:2/169: unlink d1/df/d11/c21 0
2026-03-10T06:22:46.187 INFO:tasks.workunit.client.0.vm04.stdout:4/106: rename d2/d8/dc/f17 to d2/d8/dc/f26 0
2026-03-10T06:22:46.190 INFO:tasks.workunit.client.0.vm04.stdout:8/161: getdents df/d20/d25/d30 0
2026-03-10T06:22:46.190 INFO:tasks.workunit.client.0.vm04.stdout:2/170: write d1/db/f27 [1514954,11724] 0
2026-03-10T06:22:46.190 INFO:tasks.workunit.client.0.vm04.stdout:8/162: write fe [4319065,68439] 0
2026-03-10T06:22:46.192 INFO:tasks.workunit.client.0.vm04.stdout:1/161: dread d0/d8/f11 [0,4194304] 0
2026-03-10T06:22:46.197 INFO:tasks.workunit.client.0.vm04.stdout:9/164: truncate f0 4634187 0
2026-03-10T06:22:46.198 INFO:tasks.workunit.client.0.vm04.stdout:6/150: rename d2/d8/d23/d2f to d2/d4/d2d/d30/d34 0
2026-03-10T06:22:46.204 INFO:tasks.workunit.client.0.vm04.stdout:4/107: mkdir d2/d16/d27 0
2026-03-10T06:22:46.207 INFO:tasks.workunit.client.0.vm04.stdout:2/171: dread - d1/df/d11/f29 zero size
2026-03-10T06:22:46.212 INFO:tasks.workunit.client.0.vm04.stdout:5/111: dwrite d4/d11/f1f [0,4194304] 0
2026-03-10T06:22:46.212 INFO:tasks.workunit.client.0.vm04.stdout:8/163: dwrite df/d20/f28 [0,4194304] 0
2026-03-10T06:22:46.240 INFO:tasks.workunit.client.0.vm04.stdout:7/122: truncate d4/df/d12/d13/d25/f2e 3292008 0
2026-03-10T06:22:46.249 INFO:tasks.workunit.client.0.vm04.stdout:0/124: truncate d0/f16 2463604 0
2026-03-10T06:22:46.256 INFO:tasks.workunit.client.0.vm04.stdout:0/125: dread d0/d5/d25/dd/d1d/f24 [0,4194304] 0
2026-03-10T06:22:46.302 INFO:tasks.workunit.client.0.vm04.stdout:1/162: symlink d0/l3e 0
2026-03-10T06:22:46.305 INFO:tasks.workunit.client.0.vm04.stdout:9/165: rmdir d2/d8/d14 39
2026-03-10T06:22:46.305 INFO:tasks.workunit.client.0.vm04.stdout:1/163: truncate d0/d3/f37 1009997 0
2026-03-10T06:22:46.305 INFO:tasks.workunit.client.0.vm04.stdout:9/166: readlink d2/d3/d18/l2f 0
2026-03-10T06:22:46.305 INFO:tasks.workunit.client.0.vm04.stdout:1/164: dread - d0/f3c zero size
2026-03-10T06:22:46.309 INFO:tasks.workunit.client.0.vm04.stdout:4/108: creat d2/d8/dc/f28 x:0 0 0
2026-03-10T06:22:46.312 INFO:tasks.workunit.client.0.vm04.stdout:3/157: truncate f1 275885 0
2026-03-10T06:22:46.314 INFO:tasks.workunit.client.0.vm04.stdout:1/165: dwrite d0/f3c [0,4194304] 0
2026-03-10T06:22:46.314 INFO:tasks.workunit.client.0.vm04.stdout:5/112: creat d4/d11/d2a/f2c x:0 0 0 2026-03-10T06:22:46.315 INFO:tasks.workunit.client.0.vm04.stdout:3/158: dread - d4/da/df/d13/f2e zero size 2026-03-10T06:22:46.326 INFO:tasks.workunit.client.0.vm04.stdout:6/151: unlink d2/d4/d2d/d30/c26 0 2026-03-10T06:22:46.327 INFO:tasks.workunit.client.0.vm04.stdout:4/109: mkdir d2/d8/dc/d29 0 2026-03-10T06:22:46.328 INFO:tasks.workunit.client.0.vm04.stdout:4/110: chown d2/c10 0 1 2026-03-10T06:22:46.329 INFO:tasks.workunit.client.0.vm04.stdout:6/152: fsync d2/f7 0 2026-03-10T06:22:46.329 INFO:tasks.workunit.client.0.vm04.stdout:2/172: mkdir d1/df/d11/d18/d35 0 2026-03-10T06:22:46.329 INFO:tasks.workunit.client.0.vm04.stdout:4/111: fdatasync d2/d8/f23 0 2026-03-10T06:22:46.331 INFO:tasks.workunit.client.0.vm04.stdout:2/173: dread d1/f13 [0,4194304] 0 2026-03-10T06:22:46.333 INFO:tasks.workunit.client.0.vm04.stdout:2/174: dread - d1/db/d20/f2e zero size 2026-03-10T06:22:46.339 INFO:tasks.workunit.client.0.vm04.stdout:5/113: mknod d4/d11/d2a/c2d 0 2026-03-10T06:22:46.340 INFO:tasks.workunit.client.0.vm04.stdout:0/126: rename d0/d5/d25/dd/c11 to d0/d1a/c33 0 2026-03-10T06:22:46.341 INFO:tasks.workunit.client.0.vm04.stdout:5/114: fdatasync d4/f19 0 2026-03-10T06:22:46.341 INFO:tasks.workunit.client.0.vm04.stdout:0/127: truncate d0/d5/f1c 1265423 0 2026-03-10T06:22:46.359 INFO:tasks.workunit.client.0.vm04.stdout:4/112: chown d2/c24 4952737 1 2026-03-10T06:22:46.360 INFO:tasks.workunit.client.0.vm04.stdout:7/123: write d4/fa [1086620,38165] 0 2026-03-10T06:22:46.361 INFO:tasks.workunit.client.0.vm04.stdout:9/167: truncate d2/d9/f20 3983285 0 2026-03-10T06:22:46.368 INFO:tasks.workunit.client.0.vm04.stdout:6/153: write d2/d4/f31 [714009,68391] 0 2026-03-10T06:22:46.372 INFO:tasks.workunit.client.0.vm04.stdout:0/128: truncate d0/d5/d25/dd/d1d/f24 143637 0 2026-03-10T06:22:46.373 INFO:tasks.workunit.client.0.vm04.stdout:9/168: dwrite d2/d23/f31 [0,4194304] 0 
2026-03-10T06:22:46.376 INFO:tasks.workunit.client.0.vm04.stdout:9/169: readlink d2/d3/d18/l2f 0 2026-03-10T06:22:46.379 INFO:tasks.workunit.client.0.vm04.stdout:9/170: fdatasync d2/d9/fd 0 2026-03-10T06:22:46.394 INFO:tasks.workunit.client.0.vm04.stdout:9/171: dwrite d2/d3/f12 [0,4194304] 0 2026-03-10T06:22:46.414 INFO:tasks.workunit.client.0.vm04.stdout:8/164: getdents df/d20/d25 0 2026-03-10T06:22:46.414 INFO:tasks.workunit.client.0.vm04.stdout:8/165: dread - df/f17 zero size 2026-03-10T06:22:46.415 INFO:tasks.workunit.client.0.vm04.stdout:8/166: read df/d15/d2b/f2f [1361611,43230] 0 2026-03-10T06:22:46.424 INFO:tasks.workunit.client.0.vm04.stdout:0/129: creat d0/d5/d25/dd/d1d/f34 x:0 0 0 2026-03-10T06:22:46.424 INFO:tasks.workunit.client.0.vm04.stdout:5/115: mknod d4/c2e 0 2026-03-10T06:22:46.424 INFO:tasks.workunit.client.0.vm04.stdout:6/154: creat d2/d4/f35 x:0 0 0 2026-03-10T06:22:46.425 INFO:tasks.workunit.client.0.vm04.stdout:5/116: write d4/f21 [785212,103364] 0 2026-03-10T06:22:46.431 INFO:tasks.workunit.client.0.vm04.stdout:3/159: write f1 [1224933,86299] 0 2026-03-10T06:22:46.437 INFO:tasks.workunit.client.0.vm04.stdout:1/166: getdents d0/d8 0 2026-03-10T06:22:46.442 INFO:tasks.workunit.client.0.vm04.stdout:2/175: dwrite d1/df/d11/f16 [4194304,4194304] 0 2026-03-10T06:22:46.443 INFO:tasks.workunit.client.0.vm04.stdout:2/176: dread - d1/f2b zero size 2026-03-10T06:22:46.446 INFO:tasks.workunit.client.0.vm04.stdout:0/130: mknod d0/d5/d25/c35 0 2026-03-10T06:22:46.447 INFO:tasks.workunit.client.0.vm04.stdout:0/131: dread d0/f1b [0,4194304] 0 2026-03-10T06:22:46.449 INFO:tasks.workunit.client.0.vm04.stdout:6/155: rmdir d2/d4/d2d/d30 39 2026-03-10T06:22:46.462 INFO:tasks.workunit.client.0.vm04.stdout:4/113: rmdir d2/d8/dc/d29 0 2026-03-10T06:22:46.462 INFO:tasks.workunit.client.0.vm04.stdout:5/117: creat d4/d11/f2f x:0 0 0 2026-03-10T06:22:46.465 INFO:tasks.workunit.client.0.vm04.stdout:1/167: dread d0/f2e [0,4194304] 0 2026-03-10T06:22:46.466 
INFO:tasks.workunit.client.0.vm04.stdout:5/118: dread - d4/d11/f2f zero size 2026-03-10T06:22:46.469 INFO:tasks.workunit.client.0.vm04.stdout:8/167: creat df/d20/d25/d30/f37 x:0 0 0 2026-03-10T06:22:46.471 INFO:tasks.workunit.client.0.vm04.stdout:5/119: write d4/d6/f23 [1932240,79563] 0 2026-03-10T06:22:46.476 INFO:tasks.workunit.client.0.vm04.stdout:0/132: mknod d0/d5/c36 0 2026-03-10T06:22:46.477 INFO:tasks.workunit.client.0.vm04.stdout:0/133: write d0/d5/fb [15572,108932] 0 2026-03-10T06:22:46.478 INFO:tasks.workunit.client.0.vm04.stdout:0/134: chown d0/l6 0 1 2026-03-10T06:22:46.483 INFO:tasks.workunit.client.0.vm04.stdout:3/160: symlink d4/l36 0 2026-03-10T06:22:46.485 INFO:tasks.workunit.client.0.vm04.stdout:7/124: getdents d4/df/d12/d13 0 2026-03-10T06:22:46.487 INFO:tasks.workunit.client.0.vm04.stdout:3/161: dwrite f1 [0,4194304] 0 2026-03-10T06:22:46.497 INFO:tasks.workunit.client.0.vm04.stdout:9/172: rename d2/d3/l1f to d2/l38 0 2026-03-10T06:22:46.507 INFO:tasks.workunit.client.0.vm04.stdout:1/168: mknod d0/d3/c3f 0 2026-03-10T06:22:46.525 INFO:tasks.workunit.client.0.vm04.stdout:5/120: creat d4/d11/d2a/f30 x:0 0 0 2026-03-10T06:22:46.525 INFO:tasks.workunit.client.0.vm04.stdout:6/156: symlink d2/l36 0 2026-03-10T06:22:46.529 INFO:tasks.workunit.client.0.vm04.stdout:5/121: dread d4/f21 [0,4194304] 0 2026-03-10T06:22:46.534 INFO:tasks.workunit.client.0.vm04.stdout:0/135: chown d0/d1a/c33 934 1 2026-03-10T06:22:46.535 INFO:tasks.workunit.client.0.vm04.stdout:0/136: chown d0/d1a/l1e 63992562 1 2026-03-10T06:22:46.535 INFO:tasks.workunit.client.0.vm04.stdout:0/137: dread - d0/d5/d25/dd/d1d/f30 zero size 2026-03-10T06:22:46.535 INFO:tasks.workunit.client.0.vm04.stdout:0/138: stat d0/d5/fb 0 2026-03-10T06:22:46.536 INFO:tasks.workunit.client.0.vm04.stdout:0/139: read d0/d5/f1c [459316,72226] 0 2026-03-10T06:22:46.537 INFO:tasks.workunit.client.0.vm04.stdout:0/140: fsync d0/f4 0 2026-03-10T06:22:46.537 INFO:tasks.workunit.client.0.vm04.stdout:0/141: dread - 
d0/d5/d25/f23 zero size 2026-03-10T06:22:46.543 INFO:tasks.workunit.client.0.vm04.stdout:3/162: creat d4/d6/dc/f37 x:0 0 0 2026-03-10T06:22:46.545 INFO:tasks.workunit.client.0.vm04.stdout:7/125: mkdir d4/df/d12/d13/d25/d30 0 2026-03-10T06:22:46.547 INFO:tasks.workunit.client.0.vm04.stdout:2/177: rename d1/f13 to d1/db/f36 0 2026-03-10T06:22:46.552 INFO:tasks.workunit.client.0.vm04.stdout:7/126: dwrite d4/fa [0,4194304] 0 2026-03-10T06:22:46.555 INFO:tasks.workunit.client.0.vm04.stdout:7/127: chown d4/df/d12/d21 1527048 1 2026-03-10T06:22:46.570 INFO:tasks.workunit.client.0.vm04.stdout:1/169: write d0/d3/f16 [4939402,110213] 0 2026-03-10T06:22:46.573 INFO:tasks.workunit.client.0.vm04.stdout:8/168: truncate df/d20/d25/f35 919538 0 2026-03-10T06:22:46.573 INFO:tasks.workunit.client.0.vm04.stdout:5/122: creat d4/d11/d2a/f31 x:0 0 0 2026-03-10T06:22:46.579 INFO:tasks.workunit.client.0.vm04.stdout:4/114: rmdir d2/d16/d27 0 2026-03-10T06:22:46.579 INFO:tasks.workunit.client.0.vm04.stdout:4/115: chown d2 173755 1 2026-03-10T06:22:46.581 INFO:tasks.workunit.client.0.vm04.stdout:4/116: write d2/d8/f11 [2175986,101187] 0 2026-03-10T06:22:46.584 INFO:tasks.workunit.client.0.vm04.stdout:3/163: mkdir d4/d6/d38 0 2026-03-10T06:22:46.585 INFO:tasks.workunit.client.0.vm04.stdout:3/164: fdatasync d4/d6/dc/f25 0 2026-03-10T06:22:46.588 INFO:tasks.workunit.client.0.vm04.stdout:9/173: rename d2/d9 to d2/d3/d18/d39 0 2026-03-10T06:22:46.597 INFO:tasks.workunit.client.0.vm04.stdout:0/142: write d0/d5/d25/f22 [2838174,29876] 0 2026-03-10T06:22:46.600 INFO:tasks.workunit.client.0.vm04.stdout:3/165: dread d4/f10 [0,4194304] 0 2026-03-10T06:22:46.610 INFO:tasks.workunit.client.0.vm04.stdout:7/128: symlink d4/df/d12/d13/d25/d28/l31 0 2026-03-10T06:22:46.621 INFO:tasks.workunit.client.0.vm04.stdout:6/157: mkdir d2/d37 0 2026-03-10T06:22:46.621 INFO:tasks.workunit.client.0.vm04.stdout:6/158: fdatasync d2/f14 0 2026-03-10T06:22:46.621 INFO:tasks.workunit.client.0.vm04.stdout:5/123: creat 
d4/d11/f32 x:0 0 0 2026-03-10T06:22:46.623 INFO:tasks.workunit.client.0.vm04.stdout:5/124: write d4/d6/f7 [158798,101999] 0 2026-03-10T06:22:46.625 INFO:tasks.workunit.client.0.vm04.stdout:8/169: fdatasync df/d15/d2b/f2f 0 2026-03-10T06:22:46.626 INFO:tasks.workunit.client.0.vm04.stdout:8/170: fsync df/d20/f28 0 2026-03-10T06:22:46.627 INFO:tasks.workunit.client.0.vm04.stdout:5/125: write d4/d11/d2a/f31 [166863,22803] 0 2026-03-10T06:22:46.627 INFO:tasks.workunit.client.0.vm04.stdout:8/171: write df/f11 [652417,13709] 0 2026-03-10T06:22:46.630 INFO:tasks.workunit.client.0.vm04.stdout:6/159: dread d2/d4/f2c [8388608,4194304] 0 2026-03-10T06:22:46.633 INFO:tasks.workunit.client.0.vm04.stdout:4/117: write d2/d8/f25 [10004,79721] 0 2026-03-10T06:22:46.644 INFO:tasks.workunit.client.0.vm04.stdout:7/129: mknod d4/df/d12/d21/c32 0 2026-03-10T06:22:46.649 INFO:tasks.workunit.client.0.vm04.stdout:1/170: symlink d0/l40 0 2026-03-10T06:22:46.651 INFO:tasks.workunit.client.0.vm04.stdout:1/171: dread - d0/d8/f25 zero size 2026-03-10T06:22:46.652 INFO:tasks.workunit.client.0.vm04.stdout:1/172: fsync d0/d3/f28 0 2026-03-10T06:22:46.652 INFO:tasks.workunit.client.0.vm04.stdout:4/118: dread d2/f4 [0,4194304] 0 2026-03-10T06:22:46.659 INFO:tasks.workunit.client.0.vm04.stdout:3/166: dwrite d4/da/df/d13/f1d [0,4194304] 0 2026-03-10T06:22:46.659 INFO:tasks.workunit.client.0.vm04.stdout:1/173: dwrite d0/d3/f28 [0,4194304] 0 2026-03-10T06:22:46.666 INFO:tasks.workunit.client.0.vm04.stdout:8/172: symlink df/d20/d25/d30/l38 0 2026-03-10T06:22:46.667 INFO:tasks.workunit.client.0.vm04.stdout:8/173: chown lb 264109635 1 2026-03-10T06:22:46.674 INFO:tasks.workunit.client.0.vm04.stdout:6/160: readlink d2/d4/l1c 0 2026-03-10T06:22:46.680 INFO:tasks.workunit.client.0.vm04.stdout:9/174: mkdir d2/d8/d3a 0 2026-03-10T06:22:46.680 INFO:tasks.workunit.client.0.vm04.stdout:9/175: rename d2/d3/d18/d39 to d2/d3/d18/d39/d3b 22 2026-03-10T06:22:46.694 INFO:tasks.workunit.client.0.vm04.stdout:1/174: write 
d0/f2e [1272619,102336] 0 2026-03-10T06:22:46.698 INFO:tasks.workunit.client.0.vm04.stdout:2/178: getdents d1/db/d20 0 2026-03-10T06:22:46.699 INFO:tasks.workunit.client.0.vm04.stdout:9/176: dread d2/f1e [0,4194304] 0 2026-03-10T06:22:46.700 INFO:tasks.workunit.client.0.vm04.stdout:4/119: mkdir d2/d2a 0 2026-03-10T06:22:46.702 INFO:tasks.workunit.client.0.vm04.stdout:3/167: mkdir d4/da/df/d13/d21/d32/d39 0 2026-03-10T06:22:46.703 INFO:tasks.workunit.client.0.vm04.stdout:8/174: fdatasync df/d20/d25/f35 0 2026-03-10T06:22:46.704 INFO:tasks.workunit.client.0.vm04.stdout:6/161: creat d2/d37/f38 x:0 0 0 2026-03-10T06:22:46.704 INFO:tasks.workunit.client.0.vm04.stdout:3/168: write f0 [230854,65120] 0 2026-03-10T06:22:46.704 INFO:tasks.workunit.client.0.vm04.stdout:6/162: read d2/f10 [2283923,39699] 0 2026-03-10T06:22:46.711 INFO:tasks.workunit.client.0.vm04.stdout:2/179: mkdir d1/df/d2c/d37 0 2026-03-10T06:22:46.711 INFO:tasks.workunit.client.0.vm04.stdout:4/120: dread d2/f1d [0,4194304] 0 2026-03-10T06:22:46.712 INFO:tasks.workunit.client.0.vm04.stdout:4/121: truncate d2/d8/dc/f26 1019164 0 2026-03-10T06:22:46.713 INFO:tasks.workunit.client.0.vm04.stdout:9/177: unlink d2/c21 0 2026-03-10T06:22:46.714 INFO:tasks.workunit.client.0.vm04.stdout:8/175: rename df/d20/d25/f31 to df/d20/d25/f39 0 2026-03-10T06:22:46.715 INFO:tasks.workunit.client.0.vm04.stdout:2/180: readlink d1/l9 0 2026-03-10T06:22:46.716 INFO:tasks.workunit.client.0.vm04.stdout:7/130: rename d4/df/d12/c23 to d4/df/d12/c33 0 2026-03-10T06:22:46.717 INFO:tasks.workunit.client.0.vm04.stdout:1/175: dwrite d0/d3/f16 [4194304,4194304] 0 2026-03-10T06:22:46.717 INFO:tasks.workunit.client.0.vm04.stdout:1/176: write d0/f3c [3683419,84014] 0 2026-03-10T06:22:46.717 INFO:tasks.workunit.client.0.vm04.stdout:2/181: write d1/db/f27 [483426,38673] 0 2026-03-10T06:22:46.722 INFO:tasks.workunit.client.0.vm04.stdout:1/177: truncate d0/f2e 2021502 0 2026-03-10T06:22:46.723 INFO:tasks.workunit.client.0.vm04.stdout:1/178: fsync 
d0/ff 0 2026-03-10T06:22:46.746 INFO:tasks.workunit.client.0.vm04.stdout:3/169: rename d4/d6/dc/f25 to d4/da/df/d13/d21/f3a 0 2026-03-10T06:22:46.746 INFO:tasks.workunit.client.0.vm04.stdout:8/176: creat df/d15/d29/f3a x:0 0 0 2026-03-10T06:22:46.753 INFO:tasks.workunit.client.0.vm04.stdout:0/143: truncate d0/d5/f1c 1111417 0 2026-03-10T06:22:46.757 INFO:tasks.workunit.client.0.vm04.stdout:1/179: rmdir d0/d3 39 2026-03-10T06:22:46.761 INFO:tasks.workunit.client.0.vm04.stdout:0/144: dwrite d0/d5/fb [0,4194304] 0 2026-03-10T06:22:46.765 INFO:tasks.workunit.client.0.vm04.stdout:9/178: creat d2/d8/d14/f3c x:0 0 0 2026-03-10T06:22:46.769 INFO:tasks.workunit.client.0.vm04.stdout:4/122: rename d2/d8/dc to d2/d16/d2b 0 2026-03-10T06:22:46.770 INFO:tasks.workunit.client.0.vm04.stdout:5/126: truncate d4/d11/f17 2295061 0 2026-03-10T06:22:46.771 INFO:tasks.workunit.client.0.vm04.stdout:8/177: symlink df/d15/d29/l3b 0 2026-03-10T06:22:46.772 INFO:tasks.workunit.client.0.vm04.stdout:3/170: chown d4/da/df/d11/l35 1112132 1 2026-03-10T06:22:46.773 INFO:tasks.workunit.client.0.vm04.stdout:3/171: chown d4/da/l1e 794256 1 2026-03-10T06:22:46.773 INFO:tasks.workunit.client.0.vm04.stdout:3/172: chown d4/f29 1 1 2026-03-10T06:22:46.777 INFO:tasks.workunit.client.0.vm04.stdout:7/131: mkdir d4/df/d12/d34 0 2026-03-10T06:22:46.780 INFO:tasks.workunit.client.0.vm04.stdout:3/173: dwrite f1 [0,4194304] 0 2026-03-10T06:22:46.785 INFO:tasks.workunit.client.0.vm04.stdout:0/145: symlink d0/d5/d25/dd/d1d/l37 0 2026-03-10T06:22:46.798 INFO:tasks.workunit.client.0.vm04.stdout:4/123: mkdir d2/d16/d2c 0 2026-03-10T06:22:46.798 INFO:tasks.workunit.client.0.vm04.stdout:4/124: chown d2/d8/fa 49437 1 2026-03-10T06:22:46.798 INFO:tasks.workunit.client.0.vm04.stdout:7/132: dwrite d4/df/f29 [0,4194304] 0 2026-03-10T06:22:46.803 INFO:tasks.workunit.client.0.vm04.stdout:4/125: dread d2/d8/f1f [0,4194304] 0 2026-03-10T06:22:46.804 INFO:tasks.workunit.client.0.vm04.stdout:4/126: read d2/d16/d2b/f26 
[180225,56247] 0 2026-03-10T06:22:46.805 INFO:tasks.workunit.client.0.vm04.stdout:4/127: dread - d2/d16/d2b/f21 zero size 2026-03-10T06:22:46.814 INFO:tasks.workunit.client.0.vm04.stdout:1/180: mkdir d0/d3/d41 0 2026-03-10T06:22:46.822 INFO:tasks.workunit.client.0.vm04.stdout:9/179: dread d2/d3/d18/d39/f20 [0,4194304] 0 2026-03-10T06:22:46.822 INFO:tasks.workunit.client.0.vm04.stdout:9/180: stat d2/d8/d14/c1b 0 2026-03-10T06:22:46.824 INFO:tasks.workunit.client.0.vm04.stdout:3/174: unlink d4/da/df/d11/f31 0 2026-03-10T06:22:46.825 INFO:tasks.workunit.client.0.vm04.stdout:9/181: dread d2/d23/f31 [0,4194304] 0 2026-03-10T06:22:46.829 INFO:tasks.workunit.client.0.vm04.stdout:0/146: fdatasync d0/d1a/f2f 0 2026-03-10T06:22:46.829 INFO:tasks.workunit.client.0.vm04.stdout:2/182: sync 2026-03-10T06:22:46.836 INFO:tasks.workunit.client.0.vm04.stdout:5/127: link d4/f19 d4/d6/f33 0 2026-03-10T06:22:46.837 INFO:tasks.workunit.client.0.vm04.stdout:1/181: fdatasync d0/d8/f32 0 2026-03-10T06:22:46.838 INFO:tasks.workunit.client.0.vm04.stdout:7/133: write d4/df/d12/f20 [1021951,56148] 0 2026-03-10T06:22:46.838 INFO:tasks.workunit.client.0.vm04.stdout:5/128: dread - d4/d11/d2a/f2c zero size 2026-03-10T06:22:46.839 INFO:tasks.workunit.client.0.vm04.stdout:2/183: mknod d1/df/c38 0 2026-03-10T06:22:46.846 INFO:tasks.workunit.client.0.vm04.stdout:4/128: rename d2/d8/l13 to d2/l2d 0 2026-03-10T06:22:46.848 INFO:tasks.workunit.client.0.vm04.stdout:3/175: dread d4/f10 [0,4194304] 0 2026-03-10T06:22:46.851 INFO:tasks.workunit.client.0.vm04.stdout:2/184: dwrite d1/db/fe [0,4194304] 0 2026-03-10T06:22:46.854 INFO:tasks.workunit.client.0.vm04.stdout:2/185: chown d1/df/f24 457416417 1 2026-03-10T06:22:46.854 INFO:tasks.workunit.client.0.vm04.stdout:9/182: fsync f0 0 2026-03-10T06:22:46.857 INFO:tasks.workunit.client.0.vm04.stdout:7/134: fdatasync d4/df/d12/f20 0 2026-03-10T06:22:46.864 INFO:tasks.workunit.client.0.vm04.stdout:6/163: write d2/d4/fa [1573203,73405] 0 2026-03-10T06:22:46.877 
INFO:tasks.workunit.client.0.vm04.stdout:1/182: mknod d0/c42 0 2026-03-10T06:22:46.879 INFO:tasks.workunit.client.0.vm04.stdout:0/147: rename d0/d5/d25/dd/d2b to d0/d1a/d20/d38 0 2026-03-10T06:22:46.893 INFO:tasks.workunit.client.0.vm04.stdout:8/178: write df/d15/d2b/f2f [5095730,115023] 0 2026-03-10T06:22:46.901 INFO:tasks.workunit.client.0.vm04.stdout:5/129: dwrite d4/f21 [0,4194304] 0 2026-03-10T06:22:46.905 INFO:tasks.workunit.client.0.vm04.stdout:5/130: chown l1 0 1 2026-03-10T06:22:46.915 INFO:tasks.workunit.client.0.vm04.stdout:2/186: rmdir d1/df/d2c 39 2026-03-10T06:22:46.915 INFO:tasks.workunit.client.0.vm04.stdout:9/183: symlink d2/d8/d14/l3d 0 2026-03-10T06:22:46.915 INFO:tasks.workunit.client.0.vm04.stdout:1/183: creat d0/d8/f43 x:0 0 0 2026-03-10T06:22:46.916 INFO:tasks.workunit.client.0.vm04.stdout:6/164: write d2/d4/d2d/d30/f2b [2790565,85318] 0 2026-03-10T06:22:46.916 INFO:tasks.workunit.client.0.vm04.stdout:2/187: fsync d1/db/d20/f2e 0 2026-03-10T06:22:46.917 INFO:tasks.workunit.client.0.vm04.stdout:2/188: fdatasync d1/db/d20/f31 0 2026-03-10T06:22:46.926 INFO:tasks.workunit.client.0.vm04.stdout:9/184: dread d2/d3/f2a [0,4194304] 0 2026-03-10T06:22:46.933 INFO:tasks.workunit.client.0.vm04.stdout:3/176: write d4/f10 [5832163,85186] 0 2026-03-10T06:22:46.936 INFO:tasks.workunit.client.0.vm04.stdout:3/177: truncate d4/f2d 751325 0 2026-03-10T06:22:46.937 INFO:tasks.workunit.client.0.vm04.stdout:2/189: dread d1/df/d11/d18/f25 [0,4194304] 0 2026-03-10T06:22:46.944 INFO:tasks.workunit.client.0.vm04.stdout:0/148: write d0/f14 [942274,79502] 0 2026-03-10T06:22:46.947 INFO:tasks.workunit.client.0.vm04.stdout:7/135: getdents d4/df/d12/d13/d25/d30 0 2026-03-10T06:22:46.958 INFO:tasks.workunit.client.0.vm04.stdout:7/136: dwrite d4/df/d12/d13/d25/f2e [4194304,4194304] 0 2026-03-10T06:22:46.964 INFO:tasks.workunit.client.0.vm04.stdout:0/149: dwrite d0/d5/d25/dd/d1d/f30 [0,4194304] 0 2026-03-10T06:22:46.978 INFO:tasks.workunit.client.0.vm04.stdout:1/184: unlink 
d0/d3/f39 0 2026-03-10T06:22:46.978 INFO:tasks.workunit.client.0.vm04.stdout:8/179: dwrite df/d20/f21 [0,4194304] 0 2026-03-10T06:22:46.981 INFO:tasks.workunit.client.0.vm04.stdout:6/165: write d2/f10 [4799346,117702] 0 2026-03-10T06:22:46.989 INFO:tasks.workunit.client.0.vm04.stdout:6/166: fdatasync d2/d37/f38 0 2026-03-10T06:22:46.990 INFO:tasks.workunit.client.0.vm04.stdout:3/178: creat d4/da/df/f3b x:0 0 0 2026-03-10T06:22:46.993 INFO:tasks.workunit.client.0.vm04.stdout:8/180: dwrite df/d20/f21 [0,4194304] 0 2026-03-10T06:22:46.993 INFO:tasks.workunit.client.0.vm04.stdout:6/167: chown d2/d4/l5 19 1 2026-03-10T06:22:46.997 INFO:tasks.workunit.client.0.vm04.stdout:2/190: symlink d1/db/d20/l39 0 2026-03-10T06:22:47.003 INFO:tasks.workunit.client.0.vm04.stdout:5/131: link d4/d6/f7 d4/d11/f34 0 2026-03-10T06:22:47.017 INFO:tasks.workunit.client.0.vm04.stdout:0/150: dread d0/f1b [0,4194304] 0 2026-03-10T06:22:47.019 INFO:tasks.workunit.client.0.vm04.stdout:0/151: fsync d0/d5/f1f 0 2026-03-10T06:22:47.023 INFO:tasks.workunit.client.0.vm04.stdout:4/129: getdents d2 0 2026-03-10T06:22:47.025 INFO:tasks.workunit.client.0.vm04.stdout:9/185: mkdir d2/d8/d3e 0 2026-03-10T06:22:47.026 INFO:tasks.workunit.client.0.vm04.stdout:3/179: mknod d4/da/df/d13/d21/c3c 0 2026-03-10T06:22:47.027 INFO:tasks.workunit.client.0.vm04.stdout:3/180: fsync d4/f2d 0 2026-03-10T06:22:47.027 INFO:tasks.workunit.client.0.vm04.stdout:3/181: fsync d4/f10 0 2026-03-10T06:22:47.028 INFO:tasks.workunit.client.0.vm04.stdout:3/182: readlink d4/da/df/d11/l35 0 2026-03-10T06:22:47.034 INFO:tasks.workunit.client.0.vm04.stdout:6/168: dread d2/d8/f9 [0,4194304] 0 2026-03-10T06:22:47.044 INFO:tasks.workunit.client.0.vm04.stdout:2/191: mknod d1/c3a 0 2026-03-10T06:22:47.044 INFO:tasks.workunit.client.0.vm04.stdout:2/192: readlink d1/l28 0 2026-03-10T06:22:47.046 INFO:tasks.workunit.client.0.vm04.stdout:8/181: rename df/d20/d25/d30/f37 to df/d15/d29/f3c 0 2026-03-10T06:22:47.050 
INFO:tasks.workunit.client.0.vm04.stdout:7/137: link d4/f5 d4/df/f35 0 2026-03-10T06:22:47.051 INFO:tasks.workunit.client.0.vm04.stdout:0/152: mknod d0/c39 0 2026-03-10T06:22:47.056 INFO:tasks.workunit.client.0.vm04.stdout:6/169: creat d2/d4/d2d/d30/f39 x:0 0 0 2026-03-10T06:22:47.057 INFO:tasks.workunit.client.0.vm04.stdout:6/170: read d2/d8/d23/f27 [641918,10273] 0 2026-03-10T06:22:47.058 INFO:tasks.workunit.client.0.vm04.stdout:6/171: fsync d2/f14 0 2026-03-10T06:22:47.058 INFO:tasks.workunit.client.0.vm04.stdout:6/172: fdatasync d2/f14 0 2026-03-10T06:22:47.060 INFO:tasks.workunit.client.0.vm04.stdout:1/185: write d0/d3/f19 [3943682,67632] 0 2026-03-10T06:22:47.060 INFO:tasks.workunit.client.0.vm04.stdout:1/186: fsync d0/d3/f35 0 2026-03-10T06:22:47.061 INFO:tasks.workunit.client.0.vm04.stdout:3/183: sync 2026-03-10T06:22:47.064 INFO:tasks.workunit.client.0.vm04.stdout:8/182: unlink fd 0 2026-03-10T06:22:47.068 INFO:tasks.workunit.client.0.vm04.stdout:7/138: mkdir d4/df/d12/d13/d25/d28/d36 0 2026-03-10T06:22:47.069 INFO:tasks.workunit.client.0.vm04.stdout:7/139: fsync d4/df/d12/f18 0 2026-03-10T06:22:47.077 INFO:tasks.workunit.client.0.vm04.stdout:7/140: sync 2026-03-10T06:22:47.077 INFO:tasks.workunit.client.0.vm04.stdout:4/130: creat d2/d16/d2c/f2e x:0 0 0 2026-03-10T06:22:47.078 INFO:tasks.workunit.client.0.vm04.stdout:7/141: readlink d4/df/d12/d13/l17 0 2026-03-10T06:22:47.080 INFO:tasks.workunit.client.0.vm04.stdout:9/186: mknod d2/d3/d18/d34/c3f 0 2026-03-10T06:22:47.080 INFO:tasks.workunit.client.0.vm04.stdout:2/193: mknod d1/df/d11/d14/c3b 0 2026-03-10T06:22:47.082 INFO:tasks.workunit.client.0.vm04.stdout:3/184: symlink d4/d6/l3d 0 2026-03-10T06:22:47.082 INFO:tasks.workunit.client.0.vm04.stdout:1/187: creat d0/d3/f44 x:0 0 0 2026-03-10T06:22:47.084 INFO:tasks.workunit.client.0.vm04.stdout:8/183: symlink df/d15/l3d 0 2026-03-10T06:22:47.085 INFO:tasks.workunit.client.0.vm04.stdout:5/132: creat d4/f35 x:0 0 0 2026-03-10T06:22:47.090 
INFO:tasks.workunit.client.0.vm04.stdout:3/185: dwrite d4/f29 [0,4194304] 0 2026-03-10T06:22:47.103 INFO:tasks.workunit.client.0.vm04.stdout:7/142: unlink d4/f16 0 2026-03-10T06:22:47.104 INFO:tasks.workunit.client.0.vm04.stdout:0/153: unlink d0/d5/f1c 0 2026-03-10T06:22:47.111 INFO:tasks.workunit.client.0.vm04.stdout:2/194: unlink d1/l9 0 2026-03-10T06:22:47.111 INFO:tasks.workunit.client.0.vm04.stdout:6/173: mkdir d2/d3a 0 2026-03-10T06:22:47.114 INFO:tasks.workunit.client.0.vm04.stdout:1/188: mknod d0/d3/c45 0 2026-03-10T06:22:47.115 INFO:tasks.workunit.client.0.vm04.stdout:8/184: creat df/d15/d29/f3e x:0 0 0 2026-03-10T06:22:47.115 INFO:tasks.workunit.client.0.vm04.stdout:5/133: creat d4/d11/d2a/f36 x:0 0 0 2026-03-10T06:22:47.116 INFO:tasks.workunit.client.0.vm04.stdout:8/185: truncate df/f11 2718186 0 2026-03-10T06:22:47.118 INFO:tasks.workunit.client.0.vm04.stdout:9/187: dread d2/d23/f31 [0,4194304] 0 2026-03-10T06:22:47.123 INFO:tasks.workunit.client.0.vm04.stdout:1/189: dwrite d0/ff [0,4194304] 0 2026-03-10T06:22:47.127 INFO:tasks.workunit.client.0.vm04.stdout:2/195: dread d1/db/d20/f2f [0,4194304] 0 2026-03-10T06:22:47.132 INFO:tasks.workunit.client.0.vm04.stdout:1/190: sync 2026-03-10T06:22:47.132 INFO:tasks.workunit.client.0.vm04.stdout:0/154: mkdir d0/d5/d25/dd/d3a 0 2026-03-10T06:22:47.134 INFO:tasks.workunit.client.0.vm04.stdout:6/174: rename d2/f16 to d2/d4/f3b 0 2026-03-10T06:22:47.134 INFO:tasks.workunit.client.0.vm04.stdout:6/175: readlink d2/l36 0 2026-03-10T06:22:47.143 INFO:tasks.workunit.client.0.vm04.stdout:4/131: dwrite d2/d8/fa [0,4194304] 0 2026-03-10T06:22:47.144 INFO:tasks.workunit.client.0.vm04.stdout:8/186: fdatasync df/d20/d25/f39 0 2026-03-10T06:22:47.147 INFO:tasks.workunit.client.0.vm04.stdout:3/186: mknod d4/c3e 0 2026-03-10T06:22:47.148 INFO:tasks.workunit.client.0.vm04.stdout:7/143: creat d4/df/d12/d13/d25/d30/f37 x:0 0 0 2026-03-10T06:22:47.149 INFO:tasks.workunit.client.0.vm04.stdout:4/132: write d2/ff [533973,26849] 0 
2026-03-10T06:22:47.149 INFO:tasks.workunit.client.0.vm04.stdout:4/133: dread - d2/f12 zero size 2026-03-10T06:22:47.149 INFO:tasks.workunit.client.0.vm04.stdout:0/155: dwrite d0/d1a/f2f [0,4194304] 0 2026-03-10T06:22:47.155 INFO:tasks.workunit.client.0.vm04.stdout:2/196: symlink d1/l3c 0 2026-03-10T06:22:47.155 INFO:tasks.workunit.client.0.vm04.stdout:7/144: stat d4/c7 0 2026-03-10T06:22:47.155 INFO:tasks.workunit.client.0.vm04.stdout:1/191: dwrite d0/d3/f34 [0,4194304] 0 2026-03-10T06:22:47.156 INFO:tasks.workunit.client.0.vm04.stdout:3/187: chown d4/da/df/d13/f2b 7 1 2026-03-10T06:22:47.157 INFO:tasks.workunit.client.0.vm04.stdout:7/145: dread - d4/df/d12/d21/f26 zero size 2026-03-10T06:22:47.157 INFO:tasks.workunit.client.0.vm04.stdout:9/188: link d2/d23/d24/f2b d2/d8/d14/f40 0 2026-03-10T06:22:47.158 INFO:tasks.workunit.client.0.vm04.stdout:6/176: write d2/d4/f3b [728648,45500] 0 2026-03-10T06:22:47.167 INFO:tasks.workunit.client.0.vm04.stdout:6/177: truncate d2/d4/d2d/d30/f32 768776 0 2026-03-10T06:22:47.172 INFO:tasks.workunit.client.0.vm04.stdout:0/156: rename d0/d5/d25/f22 to d0/d1a/f3b 0 2026-03-10T06:22:47.172 INFO:tasks.workunit.client.0.vm04.stdout:8/187: dread df/d15/f1e [0,4194304] 0 2026-03-10T06:22:47.178 INFO:tasks.workunit.client.0.vm04.stdout:2/197: dwrite d1/db/d20/f2e [0,4194304] 0 2026-03-10T06:22:47.185 INFO:tasks.workunit.client.0.vm04.stdout:3/188: dwrite d4/da/df/d13/f2e [0,4194304] 0 2026-03-10T06:22:47.191 INFO:tasks.workunit.client.0.vm04.stdout:6/178: dwrite d2/d4/f24 [0,4194304] 0 2026-03-10T06:22:47.192 INFO:tasks.workunit.client.0.vm04.stdout:0/157: dwrite d0/f4 [0,4194304] 0 2026-03-10T06:22:47.193 INFO:tasks.workunit.client.0.vm04.stdout:3/189: write d4/da/df/d13/f1d [3448208,24750] 0 2026-03-10T06:22:47.202 INFO:tasks.workunit.client.0.vm04.stdout:4/134: write d2/f4 [2564218,91304] 0 2026-03-10T06:22:47.210 INFO:tasks.workunit.client.0.vm04.stdout:9/189: rmdir d2/d8 39 2026-03-10T06:22:47.213 
INFO:tasks.workunit.client.0.vm04.stdout:1/192: mkdir d0/d8/d46 0 2026-03-10T06:22:47.213 INFO:tasks.workunit.client.0.vm04.stdout:7/146: mknod d4/df/d12/d21/c38 0 2026-03-10T06:22:47.213 INFO:tasks.workunit.client.0.vm04.stdout:1/193: write d0/f2e [2681296,78025] 0 2026-03-10T06:22:47.213 INFO:tasks.workunit.client.0.vm04.stdout:7/147: chown d4/df/l1a 4044727 1 2026-03-10T06:22:47.213 INFO:tasks.workunit.client.0.vm04.stdout:7/148: fdatasync d4/df/d12/d13/d25/f2e 0 2026-03-10T06:22:47.213 INFO:tasks.workunit.client.0.vm04.stdout:6/179: dread d2/d4/d2d/d30/f2b [0,4194304] 0 2026-03-10T06:22:47.215 INFO:tasks.workunit.client.0.vm04.stdout:6/180: chown d2/d4/f35 37913360 1 2026-03-10T06:22:47.215 INFO:tasks.workunit.client.0.vm04.stdout:6/181: stat d2/d37/f38 0 2026-03-10T06:22:47.221 INFO:tasks.workunit.client.0.vm04.stdout:7/149: dwrite d4/df/d12/d13/d25/f2e [4194304,4194304] 0 2026-03-10T06:22:47.226 INFO:tasks.workunit.client.0.vm04.stdout:8/188: truncate f9 5210601 0 2026-03-10T06:22:47.235 INFO:tasks.workunit.client.0.vm04.stdout:3/190: rename d4/c5 to d4/da/df/c3f 0 2026-03-10T06:22:47.236 INFO:tasks.workunit.client.0.vm04.stdout:5/134: getdents d4/d6 0 2026-03-10T06:22:47.237 INFO:tasks.workunit.client.0.vm04.stdout:5/135: write d4/d6/f20 [142608,41153] 0 2026-03-10T06:22:47.241 INFO:tasks.workunit.client.0.vm04.stdout:3/191: dread d4/f10 [0,4194304] 0 2026-03-10T06:22:47.246 INFO:tasks.workunit.client.0.vm04.stdout:6/182: rmdir d2/d4 39 2026-03-10T06:22:47.246 INFO:tasks.workunit.client.0.vm04.stdout:4/135: dwrite d2/d8/f1f [4194304,4194304] 0 2026-03-10T06:22:47.248 INFO:tasks.workunit.client.0.vm04.stdout:9/190: sync 2026-03-10T06:22:47.262 INFO:tasks.workunit.client.0.vm04.stdout:8/189: dread df/d15/f1e [0,4194304] 0 2026-03-10T06:22:47.262 INFO:tasks.workunit.client.0.vm04.stdout:1/194: rename d0/d3/f28 to d0/d3/d41/f47 0 2026-03-10T06:22:47.264 INFO:tasks.workunit.client.0.vm04.stdout:1/195: write d0/d8/f32 [853904,62632] 0 2026-03-10T06:22:47.266 
INFO:tasks.workunit.client.0.vm04.stdout:5/136: mkdir d4/d6/d37 0 2026-03-10T06:22:47.268 INFO:tasks.workunit.client.0.vm04.stdout:1/196: chown d0/d3/l13 46687 1 2026-03-10T06:22:47.287 INFO:tasks.workunit.client.0.vm04.stdout:7/150: rename d4/df/f35 to d4/df/d12/d13/d25/d28/f39 0 2026-03-10T06:22:47.288 INFO:tasks.workunit.client.0.vm04.stdout:8/190: rmdir df/d20 39 2026-03-10T06:22:47.292 INFO:tasks.workunit.client.0.vm04.stdout:5/137: mkdir d4/d11/d2a/d38 0 2026-03-10T06:22:47.294 INFO:tasks.workunit.client.0.vm04.stdout:1/197: mknod d0/d8/c48 0 2026-03-10T06:22:47.296 INFO:tasks.workunit.client.0.vm04.stdout:7/151: sync 2026-03-10T06:22:47.296 INFO:tasks.workunit.client.0.vm04.stdout:9/191: mknod d2/c41 0 2026-03-10T06:22:47.301 INFO:tasks.workunit.client.0.vm04.stdout:6/183: rename d2/d8/d23 to d2/d4/d2d/d30/d1f/d3c 0 2026-03-10T06:22:47.306 INFO:tasks.workunit.client.0.vm04.stdout:8/191: write df/d20/f22 [1014120,124798] 0 2026-03-10T06:22:47.311 INFO:tasks.workunit.client.0.vm04.stdout:1/198: unlink d0/d3/f16 0 2026-03-10T06:22:47.311 INFO:tasks.workunit.client.0.vm04.stdout:4/136: link d2/d8/c1e d2/d16/d2b/c2f 0 2026-03-10T06:22:47.323 INFO:tasks.workunit.client.0.vm04.stdout:6/184: dread d2/d4/d2d/d30/d1f/d3c/f27 [0,4194304] 0 2026-03-10T06:22:47.333 INFO:tasks.workunit.client.0.vm04.stdout:9/192: rename d2/d8/d3e to d2/d3/d18/d39/d11/d42 0 2026-03-10T06:22:47.335 INFO:tasks.workunit.client.0.vm04.stdout:9/193: chown d2/d8/d22 48693752 1 2026-03-10T06:22:47.338 INFO:tasks.workunit.client.0.vm04.stdout:9/194: chown d2/d23/d24/f29 0 1 2026-03-10T06:22:47.338 INFO:tasks.workunit.client.0.vm04.stdout:8/192: creat df/f3f x:0 0 0 2026-03-10T06:22:47.341 INFO:tasks.workunit.client.0.vm04.stdout:6/185: symlink d2/d3a/l3d 0 2026-03-10T06:22:47.341 INFO:tasks.workunit.client.0.vm04.stdout:1/199: creat d0/f49 x:0 0 0 2026-03-10T06:22:47.342 INFO:tasks.workunit.client.0.vm04.stdout:6/186: chown d2/d3a 3157165 1 2026-03-10T06:22:47.345 
INFO:tasks.workunit.client.0.vm04.stdout:9/195: unlink d2/d3/f10 0 2026-03-10T06:22:47.347 INFO:tasks.workunit.client.0.vm04.stdout:7/152: fsync d4/fa 0 2026-03-10T06:22:47.347 INFO:tasks.workunit.client.0.vm04.stdout:1/200: dwrite d0/d3/f37 [0,4194304] 0 2026-03-10T06:22:47.361 INFO:tasks.workunit.client.0.vm04.stdout:6/187: unlink d2/d8/l1d 0 2026-03-10T06:22:47.366 INFO:tasks.workunit.client.0.vm04.stdout:9/196: truncate f0 4047373 0 2026-03-10T06:22:47.372 INFO:tasks.workunit.client.0.vm04.stdout:7/153: dwrite d4/df/d12/d13/d25/f2f [0,4194304] 0 2026-03-10T06:22:47.377 INFO:tasks.workunit.client.0.vm04.stdout:6/188: mknod d2/d4/d2d/d30/d34/c3e 0 2026-03-10T06:22:47.377 INFO:tasks.workunit.client.0.vm04.stdout:6/189: dread - d2/d37/f38 zero size 2026-03-10T06:22:47.386 INFO:tasks.workunit.client.0.vm04.stdout:8/193: rename df/f10 to df/f40 0 2026-03-10T06:22:47.386 INFO:tasks.workunit.client.0.vm04.stdout:8/194: write df/f1d [3797846,6716] 0 2026-03-10T06:22:47.387 INFO:tasks.workunit.client.0.vm04.stdout:9/197: dread d2/d23/d24/f26 [0,4194304] 0 2026-03-10T06:22:47.387 INFO:tasks.workunit.client.0.vm04.stdout:6/190: dwrite d2/f7 [0,4194304] 0 2026-03-10T06:22:47.388 INFO:tasks.workunit.client.0.vm04.stdout:8/195: chown df/f1f 11 1 2026-03-10T06:22:47.388 INFO:tasks.workunit.client.0.vm04.stdout:8/196: dread - df/d15/d29/f3e zero size 2026-03-10T06:22:47.392 INFO:tasks.workunit.client.0.vm04.stdout:8/197: dread df/d20/d25/f39 [0,4194304] 0 2026-03-10T06:22:47.402 INFO:tasks.workunit.client.0.vm04.stdout:7/154: mkdir d4/df/d12/d13/d25/d28/d3a 0 2026-03-10T06:22:47.402 INFO:tasks.workunit.client.0.vm04.stdout:3/192: write d4/da/df/d13/f2e [4324691,74704] 0 2026-03-10T06:22:47.406 INFO:tasks.workunit.client.0.vm04.stdout:7/155: dread d4/df/d12/d13/d25/f2e [0,4194304] 0 2026-03-10T06:22:47.420 INFO:tasks.workunit.client.0.vm04.stdout:8/198: symlink df/d20/d25/l41 0 2026-03-10T06:22:47.421 INFO:tasks.workunit.client.0.vm04.stdout:6/191: truncate 
d2/d4/d2d/d30/d1f/d3c/f27 1779429 0 2026-03-10T06:22:47.429 INFO:tasks.workunit.client.0.vm04.stdout:7/156: mkdir d4/df/d12/d13/d25/d28/d3b 0 2026-03-10T06:22:47.429 INFO:tasks.workunit.client.0.vm04.stdout:0/158: truncate d0/f17 2465578 0 2026-03-10T06:22:47.430 INFO:tasks.workunit.client.0.vm04.stdout:0/159: chown d0/d1a/d20/d38 5 1 2026-03-10T06:22:47.434 INFO:tasks.workunit.client.0.vm04.stdout:7/157: chown d4/df/d12/f18 63429225 1 2026-03-10T06:22:47.435 INFO:tasks.workunit.client.0.vm04.stdout:3/193: symlink d4/l40 0 2026-03-10T06:22:47.436 INFO:tasks.workunit.client.0.vm04.stdout:3/194: write d4/d6/f30 [756712,11986] 0 2026-03-10T06:22:47.437 INFO:tasks.workunit.client.0.vm04.stdout:2/198: truncate d1/df/d11/f15 3566545 0 2026-03-10T06:22:47.437 INFO:tasks.workunit.client.0.vm04.stdout:3/195: fdatasync d4/d6/dc/f37 0 2026-03-10T06:22:47.437 INFO:tasks.workunit.client.0.vm04.stdout:3/196: fdatasync d4/f10 0 2026-03-10T06:22:47.438 INFO:tasks.workunit.client.0.vm04.stdout:2/199: chown d1/df/d11/d14 14527941 1 2026-03-10T06:22:47.438 INFO:tasks.workunit.client.0.vm04.stdout:3/197: stat d4/da/df/d13/f2e 0 2026-03-10T06:22:47.439 INFO:tasks.workunit.client.0.vm04.stdout:3/198: stat d4/da/df/d13/d21 0 2026-03-10T06:22:47.440 INFO:tasks.workunit.client.0.vm04.stdout:3/199: chown d4/l40 175878830 1 2026-03-10T06:22:47.443 INFO:tasks.workunit.client.0.vm04.stdout:0/160: rename d0/d5/d25/dd/d1d/f24 to d0/d5/d25/f3c 0 2026-03-10T06:22:47.444 INFO:tasks.workunit.client.0.vm04.stdout:3/200: dwrite d4/da/df/f1b [4194304,4194304] 0 2026-03-10T06:22:47.446 INFO:tasks.workunit.client.0.vm04.stdout:2/200: dread - d1/df/d11/f29 zero size 2026-03-10T06:22:47.454 INFO:tasks.workunit.client.0.vm04.stdout:7/158: symlink d4/df/d12/d13/l3c 0 2026-03-10T06:22:47.467 INFO:tasks.workunit.client.0.vm04.stdout:0/161: chown d0/f16 192026040 1 2026-03-10T06:22:47.468 INFO:tasks.workunit.client.0.vm04.stdout:8/199: getdents df/d15/d2b 0 2026-03-10T06:22:47.474 
INFO:tasks.workunit.client.0.vm04.stdout:4/137: dwrite d2/d16/d2b/f26 [0,4194304] 0 2026-03-10T06:22:47.474 INFO:tasks.workunit.client.0.vm04.stdout:3/201: dread d4/da/df/d13/f16 [0,4194304] 0 2026-03-10T06:22:47.477 INFO:tasks.workunit.client.0.vm04.stdout:4/138: write d2/f14 [1136349,40150] 0 2026-03-10T06:22:47.487 INFO:tasks.workunit.client.0.vm04.stdout:2/201: dwrite d1/db/d20/f2e [0,4194304] 0 2026-03-10T06:22:47.489 INFO:tasks.workunit.client.0.vm04.stdout:7/159: dwrite d4/df/d12/f20 [0,4194304] 0 2026-03-10T06:22:47.495 INFO:tasks.workunit.client.0.vm04.stdout:5/138: dwrite d4/d11/f1f [0,4194304] 0 2026-03-10T06:22:47.496 INFO:tasks.workunit.client.0.vm04.stdout:7/160: write d4/df/f29 [319082,124613] 0 2026-03-10T06:22:47.499 INFO:tasks.workunit.client.0.vm04.stdout:5/139: dread - d4/d11/f32 zero size 2026-03-10T06:22:47.500 INFO:tasks.workunit.client.0.vm04.stdout:4/139: dread d2/d16/d2b/f15 [0,4194304] 0 2026-03-10T06:22:47.500 INFO:tasks.workunit.client.0.vm04.stdout:4/140: chown d2/d16/d2b/f18 12857902 1 2026-03-10T06:22:47.501 INFO:tasks.workunit.client.0.vm04.stdout:4/141: fdatasync d2/d16/d2b/f28 0 2026-03-10T06:22:47.501 INFO:tasks.workunit.client.0.vm04.stdout:4/142: write d2/d8/f1f [4556143,115148] 0 2026-03-10T06:22:47.508 INFO:tasks.workunit.client.0.vm04.stdout:8/200: dwrite df/d15/d2b/f33 [0,4194304] 0 2026-03-10T06:22:47.511 INFO:tasks.workunit.client.0.vm04.stdout:9/198: truncate d2/f17 155986 0 2026-03-10T06:22:47.513 INFO:tasks.workunit.client.0.vm04.stdout:1/201: write d0/f23 [2920020,49009] 0 2026-03-10T06:22:47.513 INFO:tasks.workunit.client.0.vm04.stdout:9/199: chown d2/d3 18 1 2026-03-10T06:22:47.515 INFO:tasks.workunit.client.0.vm04.stdout:9/200: stat d2/f1c 0 2026-03-10T06:22:47.519 INFO:tasks.workunit.client.0.vm04.stdout:1/202: read - d0/f1a zero size 2026-03-10T06:22:47.519 INFO:tasks.workunit.client.0.vm04.stdout:8/201: write df/d20/f22 [1930120,33239] 0 2026-03-10T06:22:47.519 INFO:tasks.workunit.client.0.vm04.stdout:7/161: 
dwrite d4/df/d12/d13/f1e [0,4194304] 0 2026-03-10T06:22:47.529 INFO:tasks.workunit.client.0.vm04.stdout:8/202: dwrite df/f11 [0,4194304] 0 2026-03-10T06:22:47.540 INFO:tasks.workunit.client.0.vm04.stdout:3/202: rmdir d4/da/df/d13/d21 39 2026-03-10T06:22:47.540 INFO:tasks.workunit.client.0.vm04.stdout:8/203: stat df/d15/d29/f3c 0 2026-03-10T06:22:47.554 INFO:tasks.workunit.client.0.vm04.stdout:4/143: sync 2026-03-10T06:22:47.555 INFO:tasks.workunit.client.0.vm04.stdout:4/144: chown d2/l9 885632 1 2026-03-10T06:22:47.556 INFO:tasks.workunit.client.0.vm04.stdout:4/145: readlink d2/l9 0 2026-03-10T06:22:47.556 INFO:tasks.workunit.client.0.vm04.stdout:8/204: dread df/d20/d25/f39 [0,4194304] 0 2026-03-10T06:22:47.562 INFO:tasks.workunit.client.0.vm04.stdout:4/146: write d2/d16/d2b/f21 [766245,38482] 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:4/147: write d2/d8/fa [872494,1776] 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:3/203: creat d4/d6/dc/f41 x:0 0 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:9/201: rename d2/d23/d24/f26 to d2/d3/f43 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:2/202: creat d1/df/d2c/f3d x:0 0 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:8/205: dread df/d15/d2b/f2f [4194304,4194304] 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:4/148: rmdir d2/d2a 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:9/202: mknod d2/d8/d14/c44 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:2/203: unlink d1/df/d11/d14/c3b 0 2026-03-10T06:22:47.577 INFO:tasks.workunit.client.0.vm04.stdout:7/162: rename d4/df/d12/l22 to d4/df/d12/d13/d25/l3d 0 2026-03-10T06:22:47.580 INFO:tasks.workunit.client.0.vm04.stdout:9/203: symlink d2/d3/l45 0 2026-03-10T06:22:47.580 INFO:tasks.workunit.client.0.vm04.stdout:9/204: chown d2/d3/f2a 997125 1 2026-03-10T06:22:47.583 INFO:tasks.workunit.client.0.vm04.stdout:9/205: mkdir 
d2/d3/d18/d39/d46 0 2026-03-10T06:22:47.585 INFO:tasks.workunit.client.0.vm04.stdout:9/206: creat d2/d3/d18/d34/f47 x:0 0 0 2026-03-10T06:22:47.598 INFO:tasks.workunit.client.0.vm04.stdout:3/204: sync 2026-03-10T06:22:47.598 INFO:tasks.workunit.client.0.vm04.stdout:7/163: sync 2026-03-10T06:22:47.599 INFO:tasks.workunit.client.0.vm04.stdout:3/205: stat f0 0 2026-03-10T06:22:47.602 INFO:tasks.workunit.client.0.vm04.stdout:7/164: stat d4/df/d12/d13/d25 0 2026-03-10T06:22:47.607 INFO:tasks.workunit.client.0.vm04.stdout:3/206: rmdir d4/da/df/d11 39 2026-03-10T06:22:47.607 INFO:tasks.workunit.client.0.vm04.stdout:3/207: chown f0 7100 1 2026-03-10T06:22:47.609 INFO:tasks.workunit.client.0.vm04.stdout:8/206: dread df/f1f [0,4194304] 0 2026-03-10T06:22:47.612 INFO:tasks.workunit.client.0.vm04.stdout:7/165: dread d4/df/f29 [0,4194304] 0 2026-03-10T06:22:47.613 INFO:tasks.workunit.client.0.vm04.stdout:6/192: dread d2/d4/f31 [0,4194304] 0 2026-03-10T06:22:47.620 INFO:tasks.workunit.client.0.vm04.stdout:6/193: dwrite d2/d4/fa [0,4194304] 0 2026-03-10T06:22:47.622 INFO:tasks.workunit.client.0.vm04.stdout:1/203: fsync d0/d3/f19 0 2026-03-10T06:22:47.622 INFO:tasks.workunit.client.0.vm04.stdout:1/204: stat d0/f4 0 2026-03-10T06:22:47.630 INFO:tasks.workunit.client.0.vm04.stdout:6/194: dwrite d2/f10 [0,4194304] 0 2026-03-10T06:22:47.630 INFO:tasks.workunit.client.0.vm04.stdout:3/208: creat d4/f42 x:0 0 0 2026-03-10T06:22:47.640 INFO:tasks.workunit.client.0.vm04.stdout:7/166: getdents d4/df/d12/d13/d25/d28/d3a 0 2026-03-10T06:22:47.641 INFO:tasks.workunit.client.0.vm04.stdout:7/167: read - d4/fb zero size 2026-03-10T06:22:47.641 INFO:tasks.workunit.client.0.vm04.stdout:6/195: dwrite d2/d4/fa [0,4194304] 0 2026-03-10T06:22:47.645 INFO:tasks.workunit.client.0.vm04.stdout:1/205: write d0/d3/f37 [4741846,98643] 0 2026-03-10T06:22:47.653 INFO:tasks.workunit.client.0.vm04.stdout:6/196: creat d2/d4/d2d/d30/d1f/f3f x:0 0 0 2026-03-10T06:22:47.655 
INFO:tasks.workunit.client.0.vm04.stdout:1/206: write d0/d3/f34 [1841056,25743] 0 2026-03-10T06:22:47.657 INFO:tasks.workunit.client.0.vm04.stdout:1/207: chown d0/d8 7830307 1 2026-03-10T06:22:47.657 INFO:tasks.workunit.client.0.vm04.stdout:7/168: symlink d4/df/d12/l3e 0 2026-03-10T06:22:47.657 INFO:tasks.workunit.client.0.vm04.stdout:3/209: creat d4/da/df/d13/d21/d32/d39/f43 x:0 0 0 2026-03-10T06:22:47.658 INFO:tasks.workunit.client.0.vm04.stdout:1/208: fsync d0/d8/f38 0 2026-03-10T06:22:47.662 INFO:tasks.workunit.client.0.vm04.stdout:6/197: dwrite d2/f28 [0,4194304] 0 2026-03-10T06:22:47.683 INFO:tasks.workunit.client.0.vm04.stdout:0/162: dwrite d0/f1b [0,4194304] 0 2026-03-10T06:22:47.687 INFO:tasks.workunit.client.0.vm04.stdout:3/210: mknod d4/da/df/d13/d21/d2c/c44 0 2026-03-10T06:22:47.690 INFO:tasks.workunit.client.0.vm04.stdout:1/209: mknod d0/c4a 0 2026-03-10T06:22:47.694 INFO:tasks.workunit.client.0.vm04.stdout:7/169: rmdir d4/df/d12/d13/d25/d28/d3b 0 2026-03-10T06:22:47.698 INFO:tasks.workunit.client.0.vm04.stdout:7/170: mknod d4/df/d12/d21/c3f 0 2026-03-10T06:22:47.699 INFO:tasks.workunit.client.0.vm04.stdout:3/211: dwrite d4/da/df/d13/f16 [0,4194304] 0 2026-03-10T06:22:47.701 INFO:tasks.workunit.client.0.vm04.stdout:3/212: chown d4/f2d 484 1 2026-03-10T06:22:47.708 INFO:tasks.workunit.client.0.vm04.stdout:3/213: write d4/d6/f12 [3861862,44982] 0 2026-03-10T06:22:47.708 INFO:tasks.workunit.client.0.vm04.stdout:0/163: truncate d0/d5/d25/f3c 71322 0 2026-03-10T06:22:47.708 INFO:tasks.workunit.client.0.vm04.stdout:1/210: mkdir d0/d3/d41/d4b 0 2026-03-10T06:22:47.708 INFO:tasks.workunit.client.0.vm04.stdout:7/171: dread d4/df/d12/d13/d25/f2e [4194304,4194304] 0 2026-03-10T06:22:47.709 INFO:tasks.workunit.client.0.vm04.stdout:1/211: write d0/f23 [2085699,91931] 0 2026-03-10T06:22:47.710 INFO:tasks.workunit.client.0.vm04.stdout:0/164: write d0/d5/d25/dd/d1d/f34 [106596,69508] 0 2026-03-10T06:22:47.712 INFO:tasks.workunit.client.0.vm04.stdout:0/165: chown 
d0/d1a/f2f 2280616 1 2026-03-10T06:22:47.716 INFO:tasks.workunit.client.0.vm04.stdout:7/172: dread d4/df/f29 [0,4194304] 0 2026-03-10T06:22:47.724 INFO:tasks.workunit.client.0.vm04.stdout:0/166: chown d0/d1a/f3b 205701506 1 2026-03-10T06:22:47.728 INFO:tasks.workunit.client.0.vm04.stdout:7/173: mkdir d4/df/d12/d13/d25/d30/d40 0 2026-03-10T06:22:47.729 INFO:tasks.workunit.client.0.vm04.stdout:3/214: creat d4/da/df/d13/f45 x:0 0 0 2026-03-10T06:22:47.731 INFO:tasks.workunit.client.0.vm04.stdout:1/212: link d0/d8/c3a d0/d3/d41/c4c 0 2026-03-10T06:22:47.735 INFO:tasks.workunit.client.0.vm04.stdout:0/167: mkdir d0/d1a/d20/d3d 0 2026-03-10T06:22:47.735 INFO:tasks.workunit.client.0.vm04.stdout:1/213: fsync d0/d8/f21 0 2026-03-10T06:22:47.735 INFO:tasks.workunit.client.0.vm04.stdout:0/168: creat d0/d5/f3e x:0 0 0 2026-03-10T06:22:47.736 INFO:tasks.workunit.client.0.vm04.stdout:0/169: write d0/d5/d25/dd/d1d/f30 [4199197,80421] 0 2026-03-10T06:22:47.737 INFO:tasks.workunit.client.0.vm04.stdout:5/140: dwrite d4/d11/f34 [0,4194304] 0 2026-03-10T06:22:47.742 INFO:tasks.workunit.client.0.vm04.stdout:5/141: fsync d4/d6/f23 0 2026-03-10T06:22:47.746 INFO:tasks.workunit.client.0.vm04.stdout:5/142: chown d4/d11/d2a/c2d 98926 1 2026-03-10T06:22:47.749 INFO:tasks.workunit.client.0.vm04.stdout:3/215: mknod d4/c46 0 2026-03-10T06:22:47.751 INFO:tasks.workunit.client.0.vm04.stdout:0/170: dread d0/d5/d25/dd/d1d/f30 [0,4194304] 0 2026-03-10T06:22:47.752 INFO:tasks.workunit.client.0.vm04.stdout:7/174: creat d4/df/d12/d13/d25/d28/d36/f41 x:0 0 0 2026-03-10T06:22:47.758 INFO:tasks.workunit.client.0.vm04.stdout:1/214: creat d0/d8/d46/f4d x:0 0 0 2026-03-10T06:22:47.760 INFO:tasks.workunit.client.0.vm04.stdout:3/216: dwrite d4/d6/f12 [4194304,4194304] 0 2026-03-10T06:22:47.762 INFO:tasks.workunit.client.0.vm04.stdout:3/217: readlink d4/l40 0 2026-03-10T06:22:47.773 INFO:tasks.workunit.client.0.vm04.stdout:8/207: truncate f9 986864 0 2026-03-10T06:22:47.777 
INFO:tasks.workunit.client.0.vm04.stdout:5/143: rename d4/d6/f1e to d4/d6/d37/f39 0 2026-03-10T06:22:47.779 INFO:tasks.workunit.client.0.vm04.stdout:0/171: dread d0/d5/fb [0,4194304] 0 2026-03-10T06:22:47.785 INFO:tasks.workunit.client.0.vm04.stdout:2/204: truncate d1/df/d11/f15 3682576 0 2026-03-10T06:22:47.788 INFO:tasks.workunit.client.0.vm04.stdout:5/144: dread d4/d11/f1f [0,4194304] 0 2026-03-10T06:22:47.789 INFO:tasks.workunit.client.0.vm04.stdout:0/172: dread d0/f1b [0,4194304] 0 2026-03-10T06:22:47.792 INFO:tasks.workunit.client.0.vm04.stdout:2/205: dwrite d1/df/d2c/f3d [0,4194304] 0 2026-03-10T06:22:47.805 INFO:tasks.workunit.client.0.vm04.stdout:2/206: dwrite d1/db/f27 [0,4194304] 0 2026-03-10T06:22:47.825 INFO:tasks.workunit.client.0.vm04.stdout:7/175: dread d4/df/d12/f14 [0,4194304] 0 2026-03-10T06:22:47.839 INFO:tasks.workunit.client.0.vm04.stdout:8/208: rename df/f27 to df/d20/f42 0 2026-03-10T06:22:47.839 INFO:tasks.workunit.client.0.vm04.stdout:8/209: readlink df/d15/l1c 0 2026-03-10T06:22:47.839 INFO:tasks.workunit.client.0.vm04.stdout:5/145: rename d4/d6 to d4/d6/d3a 22 2026-03-10T06:22:47.859 INFO:tasks.workunit.client.0.vm04.stdout:6/198: write d2/d4/f31 [1695463,108884] 0 2026-03-10T06:22:47.861 INFO:tasks.workunit.client.0.vm04.stdout:0/173: mkdir d0/d1a/d3f 0 2026-03-10T06:22:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:47 vm06.local ceph-mon[58974]: pgmap v22: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 39 MiB/s rd, 121 MiB/s wr, 343 op/s 2026-03-10T06:22:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:47 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:47 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:47 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 
2026-03-10T06:22:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:47 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.871 INFO:tasks.workunit.client.0.vm04.stdout:9/207: dread d2/d3/f4 [0,4194304] 0 2026-03-10T06:22:47.892 INFO:tasks.workunit.client.0.vm04.stdout:7/176: symlink d4/df/d12/d21/l42 0 2026-03-10T06:22:47.898 INFO:tasks.workunit.client.0.vm04.stdout:7/177: dwrite d4/df/d12/d13/f1e [4194304,4194304] 0 2026-03-10T06:22:47.899 INFO:tasks.workunit.client.0.vm04.stdout:8/210: creat df/d15/f43 x:0 0 0 2026-03-10T06:22:47.916 INFO:tasks.workunit.client.0.vm04.stdout:6/199: creat d2/d3a/f40 x:0 0 0 2026-03-10T06:22:47.919 INFO:tasks.workunit.client.0.vm04.stdout:2/207: dwrite d1/df/d11/d18/f25 [0,4194304] 0 2026-03-10T06:22:47.920 INFO:tasks.workunit.client.0.vm04.stdout:2/208: write d1/db/f1e [8408822,28280] 0 2026-03-10T06:22:47.924 INFO:tasks.workunit.client.0.vm04.stdout:1/215: creat d0/d3/f4e x:0 0 0 2026-03-10T06:22:47.924 INFO:tasks.workunit.client.0.vm04.stdout:3/218: creat d4/da/df/d13/f47 x:0 0 0 2026-03-10T06:22:47.925 INFO:tasks.workunit.client.0.vm04.stdout:7/178: creat d4/df/f43 x:0 0 0 2026-03-10T06:22:47.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:47 vm04.local ceph-mon[51058]: pgmap v22: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 39 MiB/s rd, 121 MiB/s wr, 343 op/s 2026-03-10T06:22:47.939 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:47 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.940 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:47 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.940 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:47 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.940 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:47 vm04.local ceph-mon[51058]: from='mgr.14632 ' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:22:47.940 INFO:tasks.workunit.client.0.vm04.stdout:6/200: symlink d2/d37/l41 0 2026-03-10T06:22:47.940 INFO:tasks.workunit.client.0.vm04.stdout:2/209: creat d1/df/d11/d18/f3e x:0 0 0 2026-03-10T06:22:47.940 INFO:tasks.workunit.client.0.vm04.stdout:2/210: chown d1/df/d11/d18/f25 2831580 1 2026-03-10T06:22:47.940 INFO:tasks.workunit.client.0.vm04.stdout:1/216: stat d0/d3/c45 0 2026-03-10T06:22:47.941 INFO:tasks.workunit.client.0.vm04.stdout:8/211: sync 2026-03-10T06:22:47.943 INFO:tasks.workunit.client.0.vm04.stdout:5/146: write d4/d6/d37/f39 [2012123,51563] 0 2026-03-10T06:22:47.948 INFO:tasks.workunit.client.0.vm04.stdout:7/179: mknod d4/df/d12/d21/c44 0 2026-03-10T06:22:47.955 INFO:tasks.workunit.client.0.vm04.stdout:9/208: creat d2/f48 x:0 0 0 2026-03-10T06:22:47.960 INFO:tasks.workunit.client.0.vm04.stdout:2/211: dread d1/db/f36 [0,4194304] 0 2026-03-10T06:22:47.963 INFO:tasks.workunit.client.0.vm04.stdout:2/212: dwrite d1/f5 [0,4194304] 0 2026-03-10T06:22:47.967 INFO:tasks.workunit.client.0.vm04.stdout:2/213: stat d1 0 2026-03-10T06:22:47.968 INFO:tasks.workunit.client.0.vm04.stdout:8/212: creat df/d20/d25/f44 x:0 0 0 2026-03-10T06:22:47.968 INFO:tasks.workunit.client.0.vm04.stdout:8/213: fdatasync df/d20/d25/f2a 0 2026-03-10T06:22:47.971 INFO:tasks.workunit.client.0.vm04.stdout:3/219: creat d4/da/df/d11/f48 x:0 0 0 2026-03-10T06:22:47.971 INFO:tasks.workunit.client.0.vm04.stdout:3/220: truncate d4/d6/dc/f1f 197224 0 2026-03-10T06:22:47.976 INFO:tasks.workunit.client.0.vm04.stdout:7/180: symlink d4/df/d12/d13/d25/d30/l45 0 2026-03-10T06:22:47.977 INFO:tasks.workunit.client.0.vm04.stdout:4/149: link d2/d16/d2b/f15 d2/f30 0 2026-03-10T06:22:47.979 INFO:tasks.workunit.client.0.vm04.stdout:0/174: getdents d0/d5/d25 0 2026-03-10T06:22:47.984 INFO:tasks.workunit.client.0.vm04.stdout:1/217: mknod d0/d3/d41/d4b/c4f 0 2026-03-10T06:22:47.986 INFO:tasks.workunit.client.0.vm04.stdout:8/214: creat df/d15/f45 x:0 0 0 
2026-03-10T06:22:47.987 INFO:tasks.workunit.client.0.vm04.stdout:5/147: mkdir d4/d3b 0 2026-03-10T06:22:47.988 INFO:tasks.workunit.client.0.vm04.stdout:5/148: write d4/f21 [3347426,128525] 0 2026-03-10T06:22:47.994 INFO:tasks.workunit.client.0.vm04.stdout:6/201: rename d2/f7 to d2/d4/d2d/f42 0 2026-03-10T06:22:47.996 INFO:tasks.workunit.client.0.vm04.stdout:0/175: write d0/f14 [1899762,32189] 0 2026-03-10T06:22:48.014 INFO:tasks.workunit.client.0.vm04.stdout:7/181: creat d4/df/d12/d34/f46 x:0 0 0 2026-03-10T06:22:48.015 INFO:tasks.workunit.client.0.vm04.stdout:7/182: write d4/df/d12/d34/f46 [644408,44941] 0 2026-03-10T06:22:48.018 INFO:tasks.workunit.client.0.vm04.stdout:4/150: unlink d2/d16/d2b/f21 0 2026-03-10T06:22:48.023 INFO:tasks.workunit.client.0.vm04.stdout:6/202: dread d2/d4/d2d/d30/f2b [0,4194304] 0 2026-03-10T06:22:48.025 INFO:tasks.workunit.client.0.vm04.stdout:2/214: link d1/df/d11/l34 d1/df/d11/d14/l3f 0 2026-03-10T06:22:48.028 INFO:tasks.workunit.client.0.vm04.stdout:3/221: creat d4/f49 x:0 0 0 2026-03-10T06:22:48.037 INFO:tasks.workunit.client.0.vm04.stdout:7/183: rmdir d4/df/d12/d13/d25/d28/d36 39 2026-03-10T06:22:48.037 INFO:tasks.workunit.client.0.vm04.stdout:7/184: dread - d4/df/f43 zero size 2026-03-10T06:22:48.037 INFO:tasks.workunit.client.0.vm04.stdout:4/151: mkdir d2/d16/d31 0 2026-03-10T06:22:48.038 INFO:tasks.workunit.client.0.vm04.stdout:9/209: link d2/d23/d24/f2b d2/f49 0 2026-03-10T06:22:48.040 INFO:tasks.workunit.client.0.vm04.stdout:1/218: creat d0/d3/f50 x:0 0 0 2026-03-10T06:22:48.042 INFO:tasks.workunit.client.0.vm04.stdout:8/215: creat df/f46 x:0 0 0 2026-03-10T06:22:48.043 INFO:tasks.workunit.client.0.vm04.stdout:8/216: dread df/d20/d25/f39 [0,4194304] 0 2026-03-10T06:22:48.043 INFO:tasks.workunit.client.0.vm04.stdout:8/217: dread - df/d15/f45 zero size 2026-03-10T06:22:48.044 INFO:tasks.workunit.client.0.vm04.stdout:8/218: truncate df/d15/d29/f3a 570480 0 2026-03-10T06:22:48.044 INFO:tasks.workunit.client.0.vm04.stdout:3/222: 
rmdir d4/da/df/d13/d21 39 2026-03-10T06:22:48.047 INFO:tasks.workunit.client.0.vm04.stdout:3/223: dread d4/f29 [0,4194304] 0 2026-03-10T06:22:48.047 INFO:tasks.workunit.client.0.vm04.stdout:3/224: chown d4/d6/f30 10525 1 2026-03-10T06:22:48.048 INFO:tasks.workunit.client.0.vm04.stdout:5/149: creat d4/f3c x:0 0 0 2026-03-10T06:22:48.049 INFO:tasks.workunit.client.0.vm04.stdout:5/150: write d4/f3c [583032,63375] 0 2026-03-10T06:22:48.049 INFO:tasks.workunit.client.0.vm04.stdout:5/151: truncate d4/f26 280689 0 2026-03-10T06:22:48.052 INFO:tasks.workunit.client.0.vm04.stdout:4/152: stat d2/d8/f23 0 2026-03-10T06:22:48.053 INFO:tasks.workunit.client.0.vm04.stdout:4/153: chown d2/d16/d2b/f18 3 1 2026-03-10T06:22:48.053 INFO:tasks.workunit.client.0.vm04.stdout:4/154: chown d2/d8/f1a 31490 1 2026-03-10T06:22:48.055 INFO:tasks.workunit.client.0.vm04.stdout:0/176: link d0/d5/d25/ce d0/d5/c40 0 2026-03-10T06:22:48.057 INFO:tasks.workunit.client.0.vm04.stdout:9/210: rmdir d2/d3/d18/d39 39 2026-03-10T06:22:48.059 INFO:tasks.workunit.client.0.vm04.stdout:0/177: dwrite d0/d5/d25/dd/d1d/f26 [0,4194304] 0 2026-03-10T06:22:48.067 INFO:tasks.workunit.client.0.vm04.stdout:1/219: symlink d0/d3/d41/l51 0 2026-03-10T06:22:48.067 INFO:tasks.workunit.client.0.vm04.stdout:8/219: symlink df/d20/d25/d30/l47 0 2026-03-10T06:22:48.067 INFO:tasks.workunit.client.0.vm04.stdout:0/178: write d0/d5/d25/dd/d1d/f26 [1467890,75885] 0 2026-03-10T06:22:48.074 INFO:tasks.workunit.client.0.vm04.stdout:3/225: mkdir d4/da/df/d11/d4a 0 2026-03-10T06:22:48.074 INFO:tasks.workunit.client.0.vm04.stdout:3/226: stat d4/d6 0 2026-03-10T06:22:48.075 INFO:tasks.workunit.client.0.vm04.stdout:5/152: creat d4/d11/d2a/f3d x:0 0 0 2026-03-10T06:22:48.078 INFO:tasks.workunit.client.0.vm04.stdout:7/185: truncate d4/df/d12/d13/d25/d28/f39 4815613 0 2026-03-10T06:22:48.089 INFO:tasks.workunit.client.0.vm04.stdout:7/186: dread - d4/df/d12/d21/f26 zero size 2026-03-10T06:22:48.090 INFO:tasks.workunit.client.0.vm04.stdout:5/153: 
dread d4/d6/f20 [0,4194304] 0 2026-03-10T06:22:48.090 INFO:tasks.workunit.client.0.vm04.stdout:5/154: write d4/d11/d2a/f36 [744830,126581] 0 2026-03-10T06:22:48.090 INFO:tasks.workunit.client.0.vm04.stdout:1/220: mkdir d0/d8/d52 0 2026-03-10T06:22:48.090 INFO:tasks.workunit.client.0.vm04.stdout:0/179: rename d0/d5/d25/dd/d1d/f34 to d0/d5/f41 0 2026-03-10T06:22:48.090 INFO:tasks.workunit.client.0.vm04.stdout:0/180: readlink d0/l1 0 2026-03-10T06:22:48.097 INFO:tasks.workunit.client.0.vm04.stdout:4/155: mkdir d2/d32 0 2026-03-10T06:22:48.098 INFO:tasks.workunit.client.0.vm04.stdout:7/187: write d4/df/f29 [4437659,68274] 0 2026-03-10T06:22:48.104 INFO:tasks.workunit.client.0.vm04.stdout:2/215: sync 2026-03-10T06:22:48.104 INFO:tasks.workunit.client.0.vm04.stdout:6/203: rename d2/d4 to d2/d43 0 2026-03-10T06:22:48.108 INFO:tasks.workunit.client.0.vm04.stdout:7/188: sync 2026-03-10T06:22:48.110 INFO:tasks.workunit.client.0.vm04.stdout:7/189: sync 2026-03-10T06:22:48.121 INFO:tasks.workunit.client.0.vm04.stdout:9/211: creat d2/d8/f4a x:0 0 0 2026-03-10T06:22:48.124 INFO:tasks.workunit.client.0.vm04.stdout:5/155: creat d4/d11/d2a/d38/f3e x:0 0 0 2026-03-10T06:22:48.124 INFO:tasks.workunit.client.0.vm04.stdout:2/216: rmdir d1/db/d20 39 2026-03-10T06:22:48.129 INFO:tasks.workunit.client.0.vm04.stdout:2/217: dread d1/f10 [0,4194304] 0 2026-03-10T06:22:48.130 INFO:tasks.workunit.client.0.vm04.stdout:2/218: write d1/f5 [3162884,13974] 0 2026-03-10T06:22:48.132 INFO:tasks.workunit.client.0.vm04.stdout:6/204: mknod d2/d43/d2d/d30/c44 0 2026-03-10T06:22:48.133 INFO:tasks.workunit.client.0.vm04.stdout:0/181: getdents d0/d5/d25/dd/d3a 0 2026-03-10T06:22:48.135 INFO:tasks.workunit.client.0.vm04.stdout:3/227: creat d4/da/df/d13/f4b x:0 0 0 2026-03-10T06:22:48.135 INFO:tasks.workunit.client.0.vm04.stdout:3/228: fdatasync d4/d6/f30 0 2026-03-10T06:22:48.138 INFO:tasks.workunit.client.0.vm04.stdout:8/220: getdents df/d20 0 2026-03-10T06:22:48.139 
INFO:tasks.workunit.client.0.vm04.stdout:9/212: chown d2/d8/d14/f27 7123 1 2026-03-10T06:22:48.142 INFO:tasks.workunit.client.0.vm04.stdout:1/221: creat d0/f53 x:0 0 0 2026-03-10T06:22:48.143 INFO:tasks.workunit.client.0.vm04.stdout:4/156: symlink d2/d16/d31/l33 0 2026-03-10T06:22:48.144 INFO:tasks.workunit.client.0.vm04.stdout:4/157: chown d2/d16 941520938 1 2026-03-10T06:22:48.145 INFO:tasks.workunit.client.0.vm04.stdout:9/213: sync 2026-03-10T06:22:48.148 INFO:tasks.workunit.client.0.vm04.stdout:5/156: creat d4/d11/f3f x:0 0 0 2026-03-10T06:22:48.151 INFO:tasks.workunit.client.0.vm04.stdout:5/157: dwrite d4/f21 [0,4194304] 0 2026-03-10T06:22:48.167 INFO:tasks.workunit.client.0.vm04.stdout:7/190: mknod d4/df/d12/c47 0 2026-03-10T06:22:48.169 INFO:tasks.workunit.client.0.vm04.stdout:0/182: symlink d0/d1a/d20/d38/l42 0 2026-03-10T06:22:48.169 INFO:tasks.workunit.client.0.vm04.stdout:8/221: symlink df/d15/l48 0 2026-03-10T06:22:48.169 INFO:tasks.workunit.client.0.vm04.stdout:1/222: mknod d0/d3/d41/d4b/c54 0 2026-03-10T06:22:48.170 INFO:tasks.workunit.client.0.vm04.stdout:4/158: fdatasync d2/ff 0 2026-03-10T06:22:48.172 INFO:tasks.workunit.client.0.vm04.stdout:4/159: truncate d2/d8/f25 865505 0 2026-03-10T06:22:48.173 INFO:tasks.workunit.client.0.vm04.stdout:1/223: sync 2026-03-10T06:22:48.174 INFO:tasks.workunit.client.0.vm04.stdout:5/158: symlink d4/d11/d2a/d38/l40 0 2026-03-10T06:22:48.176 INFO:tasks.workunit.client.0.vm04.stdout:8/222: unlink lc 0 2026-03-10T06:22:48.176 INFO:tasks.workunit.client.0.vm04.stdout:5/159: sync 2026-03-10T06:22:48.177 INFO:tasks.workunit.client.0.vm04.stdout:8/223: write df/d15/f1b [398117,100257] 0 2026-03-10T06:22:48.177 INFO:tasks.workunit.client.0.vm04.stdout:2/219: dwrite d1/f2b [0,4194304] 0 2026-03-10T06:22:48.183 INFO:tasks.workunit.client.0.vm04.stdout:7/191: creat d4/df/d12/d13/d25/f48 x:0 0 0 2026-03-10T06:22:48.183 INFO:tasks.workunit.client.0.vm04.stdout:7/192: read d4/df/d12/f20 [1116582,37548] 0 2026-03-10T06:22:48.184 
INFO:tasks.workunit.client.0.vm04.stdout:4/160: truncate d2/f14 4421259 0 2026-03-10T06:22:48.188 INFO:tasks.workunit.client.0.vm04.stdout:4/161: dwrite d2/d8/fa [4194304,4194304] 0 2026-03-10T06:22:48.189 INFO:tasks.workunit.client.0.vm04.stdout:4/162: fdatasync d2/d16/d2c/f2e 0 2026-03-10T06:22:48.189 INFO:tasks.workunit.client.0.vm04.stdout:0/183: rmdir d0/d1a/d20/d38 39 2026-03-10T06:22:48.192 INFO:tasks.workunit.client.0.vm04.stdout:0/184: fdatasync d0/f17 0 2026-03-10T06:22:48.193 INFO:tasks.workunit.client.0.vm04.stdout:1/224: write d0/d3/f20 [22401,107765] 0 2026-03-10T06:22:48.200 INFO:tasks.workunit.client.0.vm04.stdout:6/205: creat d2/d43/d2d/f45 x:0 0 0 2026-03-10T06:22:48.201 INFO:tasks.workunit.client.0.vm04.stdout:6/206: write d2/f14 [126078,100736] 0 2026-03-10T06:22:48.201 INFO:tasks.workunit.client.0.vm04.stdout:6/207: stat d2/d43/f24 0 2026-03-10T06:22:48.204 INFO:tasks.workunit.client.0.vm04.stdout:2/220: fsync d1/db/d20/f31 0 2026-03-10T06:22:48.215 INFO:tasks.workunit.client.0.vm04.stdout:9/214: creat d2/d3/d18/d39/d46/f4b x:0 0 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:0/185: readlink d0/d1a/d20/d38/l42 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:4/163: rename d2/d8/f25 to d2/d16/d2c/f34 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:3/229: getdents d4/d6/dc 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:3/230: write d4/da/df/d13/f47 [862020,129553] 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:6/208: mknod d2/d43/d2d/d30/d34/c46 0 2026-03-10T06:22:48.216 INFO:tasks.workunit.client.0.vm04.stdout:5/160: creat d4/d3b/f41 x:0 0 0 2026-03-10T06:22:48.218 INFO:tasks.workunit.client.0.vm04.stdout:1/225: symlink d0/d3/l55 0 2026-03-10T06:22:48.220 INFO:tasks.workunit.client.0.vm04.stdout:4/164: rename d2/d8/f1f to d2/d8/f35 0 2026-03-10T06:22:48.221 INFO:tasks.workunit.client.0.vm04.stdout:0/186: sync 2026-03-10T06:22:48.224 
INFO:tasks.workunit.client.0.vm04.stdout:0/187: dread d0/f1b [0,4194304] 0 2026-03-10T06:22:48.226 INFO:tasks.workunit.client.0.vm04.stdout:3/231: rmdir d4 39 2026-03-10T06:22:48.226 INFO:tasks.workunit.client.0.vm04.stdout:1/226: dread d0/f2e [0,4194304] 0 2026-03-10T06:22:48.226 INFO:tasks.workunit.client.0.vm04.stdout:1/227: fdatasync d0/d8/f14 0 2026-03-10T06:22:48.235 INFO:tasks.workunit.client.0.vm04.stdout:6/209: unlink d2/d43/l5 0 2026-03-10T06:22:48.246 INFO:tasks.workunit.client.0.vm04.stdout:2/221: mkdir d1/df/d2c/d37/d40 0 2026-03-10T06:22:48.246 INFO:tasks.workunit.client.0.vm04.stdout:2/222: stat l0 0 2026-03-10T06:22:48.249 INFO:tasks.workunit.client.0.vm04.stdout:9/215: mknod d2/d8/d3a/c4c 0 2026-03-10T06:22:48.249 INFO:tasks.workunit.client.0.vm04.stdout:5/161: dread d4/ff [0,4194304] 0 2026-03-10T06:22:48.251 INFO:tasks.workunit.client.0.vm04.stdout:2/223: sync 2026-03-10T06:22:48.259 INFO:tasks.workunit.client.0.vm04.stdout:1/228: rename d0/c2d to d0/d8/d46/c56 0 2026-03-10T06:22:48.260 INFO:tasks.workunit.client.0.vm04.stdout:8/224: getdents df/d15 0 2026-03-10T06:22:48.262 INFO:tasks.workunit.client.0.vm04.stdout:1/229: sync 2026-03-10T06:22:48.265 INFO:tasks.workunit.client.0.vm04.stdout:5/162: symlink d4/d11/l42 0 2026-03-10T06:22:48.268 INFO:tasks.workunit.client.0.vm04.stdout:2/224: symlink d1/l41 0 2026-03-10T06:22:48.269 INFO:tasks.workunit.client.0.vm04.stdout:2/225: chown d1/db/fe 12835382 1 2026-03-10T06:22:48.271 INFO:tasks.workunit.client.0.vm04.stdout:4/165: symlink d2/d32/l36 0 2026-03-10T06:22:48.272 INFO:tasks.workunit.client.0.vm04.stdout:4/166: chown d2/f14 36 1 2026-03-10T06:22:48.274 INFO:tasks.workunit.client.0.vm04.stdout:6/210: rename d2/d43/c22 to d2/d43/d2d/d30/c47 0 2026-03-10T06:22:48.277 INFO:tasks.workunit.client.0.vm04.stdout:6/211: write d2/d43/d2d/d30/d1f/d3c/f27 [62868,112739] 0 2026-03-10T06:22:48.284 INFO:tasks.workunit.client.0.vm04.stdout:7/193: dwrite d4/df/d12/f14 [0,4194304] 0 2026-03-10T06:22:48.284 
INFO:tasks.workunit.client.0.vm04.stdout:5/163: fsync d4/d3b/f41 0 2026-03-10T06:22:48.286 INFO:tasks.workunit.client.0.vm04.stdout:5/164: dread - d4/d11/f2f zero size 2026-03-10T06:22:48.293 INFO:tasks.workunit.client.0.vm04.stdout:1/230: chown d0/d8/d46/c56 712771 1 2026-03-10T06:22:48.294 INFO:tasks.workunit.client.0.vm04.stdout:5/165: truncate d4/d11/f2f 1010216 0 2026-03-10T06:22:48.294 INFO:tasks.workunit.client.0.vm04.stdout:3/232: write d4/f7 [805648,82246] 0 2026-03-10T06:22:48.295 INFO:tasks.workunit.client.0.vm04.stdout:1/231: dread - d0/f29 zero size 2026-03-10T06:22:48.295 INFO:tasks.workunit.client.0.vm04.stdout:5/166: fsync d4/d11/f2f 0 2026-03-10T06:22:48.296 INFO:tasks.workunit.client.0.vm04.stdout:5/167: chown d4/f3c 127 1 2026-03-10T06:22:48.299 INFO:tasks.workunit.client.0.vm04.stdout:7/194: rmdir d4/df/d12 39 2026-03-10T06:22:48.302 INFO:tasks.workunit.client.0.vm04.stdout:2/226: creat d1/df/d2c/d37/d40/f42 x:0 0 0 2026-03-10T06:22:48.303 INFO:tasks.workunit.client.0.vm04.stdout:0/188: link d0/d1a/f27 d0/d5/d25/dd/f43 0 2026-03-10T06:22:48.305 INFO:tasks.workunit.client.0.vm04.stdout:8/225: link df/d20/c2d df/d20/c49 0 2026-03-10T06:22:48.310 INFO:tasks.workunit.client.0.vm04.stdout:2/227: symlink d1/df/d2c/l43 0 2026-03-10T06:22:48.311 INFO:tasks.workunit.client.0.vm04.stdout:0/189: creat d0/f44 x:0 0 0 2026-03-10T06:22:48.313 INFO:tasks.workunit.client.0.vm04.stdout:3/233: mknod d4/da/df/d13/d21/c4c 0 2026-03-10T06:22:48.317 INFO:tasks.workunit.client.0.vm04.stdout:1/232: sync 2026-03-10T06:22:48.317 INFO:tasks.workunit.client.0.vm04.stdout:2/228: sync 2026-03-10T06:22:48.317 INFO:tasks.workunit.client.0.vm04.stdout:7/195: mknod d4/df/d12/d13/c49 0 2026-03-10T06:22:48.318 INFO:tasks.workunit.client.0.vm04.stdout:7/196: read d4/df/d12/d13/d25/f2e [360986,30307] 0 2026-03-10T06:22:48.319 INFO:tasks.workunit.client.0.vm04.stdout:9/216: link d2/l38 d2/d8/l4d 0 2026-03-10T06:22:48.321 INFO:tasks.workunit.client.0.vm04.stdout:2/229: dread 
d1/df/d11/d18/f25 [0,4194304] 0 2026-03-10T06:22:48.323 INFO:tasks.workunit.client.0.vm04.stdout:6/212: truncate d2/d43/f2c 3228456 0 2026-03-10T06:22:48.324 INFO:tasks.workunit.client.0.vm04.stdout:6/213: fdatasync d2/d43/f3b 0 2026-03-10T06:22:48.325 INFO:tasks.workunit.client.0.vm04.stdout:2/230: dwrite d1/db/fe [0,4194304] 0 2026-03-10T06:22:48.326 INFO:tasks.workunit.client.0.vm04.stdout:2/231: readlink d1/l41 0 2026-03-10T06:22:48.330 INFO:tasks.workunit.client.0.vm04.stdout:3/234: symlink d4/d6/dc/l4d 0 2026-03-10T06:22:48.330 INFO:tasks.workunit.client.0.vm04.stdout:0/190: rename d0/l6 to d0/d5/d25/dd/d3a/l45 0 2026-03-10T06:22:48.332 INFO:tasks.workunit.client.0.vm04.stdout:4/167: link d2/d8/c1e d2/d16/d2b/c37 0 2026-03-10T06:22:48.335 INFO:tasks.workunit.client.0.vm04.stdout:4/168: chown d2/d32/l36 207 1 2026-03-10T06:22:48.348 INFO:tasks.workunit.client.0.vm04.stdout:8/226: link f6 df/d15/d2b/f4a 0 2026-03-10T06:22:48.348 INFO:tasks.workunit.client.0.vm04.stdout:8/227: chown df/d15/l1c 7861 1 2026-03-10T06:22:48.349 INFO:tasks.workunit.client.0.vm04.stdout:3/235: mkdir d4/da/df/d13/d21/d32/d4e 0 2026-03-10T06:22:48.351 INFO:tasks.workunit.client.0.vm04.stdout:4/169: dwrite d2/ff [0,4194304] 0 2026-03-10T06:22:48.352 INFO:tasks.workunit.client.0.vm04.stdout:4/170: fsync d2/d16/f20 0 2026-03-10T06:22:48.358 INFO:tasks.workunit.client.0.vm04.stdout:1/233: rename d0/d8/f25 to d0/d8/d46/f57 0 2026-03-10T06:22:48.361 INFO:tasks.workunit.client.0.vm04.stdout:5/168: getdents d4/d6 0 2026-03-10T06:22:48.361 INFO:tasks.workunit.client.0.vm04.stdout:0/191: dwrite d0/d5/f41 [0,4194304] 0 2026-03-10T06:22:48.362 INFO:tasks.workunit.client.0.vm04.stdout:5/169: write d4/d11/f34 [1705300,67669] 0 2026-03-10T06:22:48.372 INFO:tasks.workunit.client.0.vm04.stdout:0/192: dread d0/d5/d25/dd/d1d/f30 [0,4194304] 0 2026-03-10T06:22:48.372 INFO:tasks.workunit.client.0.vm04.stdout:0/193: dread - d0/d5/f3e zero size 2026-03-10T06:22:48.377 
INFO:tasks.workunit.client.0.vm04.stdout:7/197: fdatasync d4/f5 0 2026-03-10T06:22:48.390 INFO:tasks.workunit.client.0.vm04.stdout:3/236: unlink d4/da/df/f3b 0 2026-03-10T06:22:48.400 INFO:tasks.workunit.client.0.vm04.stdout:3/237: sync 2026-03-10T06:22:48.402 INFO:tasks.workunit.client.0.vm04.stdout:3/238: truncate d4/da/df/d11/f48 1043683 0 2026-03-10T06:22:48.404 INFO:tasks.workunit.client.0.vm04.stdout:9/217: rename d2/d8/d3a/c4c to d2/d3/d18/d39/d46/c4e 0 2026-03-10T06:22:48.409 INFO:tasks.workunit.client.0.vm04.stdout:3/239: dread d4/d6/f12 [4194304,4194304] 0 2026-03-10T06:22:48.418 INFO:tasks.workunit.client.0.vm04.stdout:1/234: dwrite d0/d3/f3b [0,4194304] 0 2026-03-10T06:22:48.426 INFO:tasks.workunit.client.0.vm04.stdout:1/235: dwrite d0/d8/f32 [0,4194304] 0 2026-03-10T06:22:48.438 INFO:tasks.workunit.client.0.vm04.stdout:1/236: dwrite d0/f29 [0,4194304] 0 2026-03-10T06:22:48.455 INFO:tasks.workunit.client.0.vm04.stdout:7/198: write d4/df/d12/d13/d25/d28/d36/f41 [100025,99225] 0 2026-03-10T06:22:48.457 INFO:tasks.workunit.client.0.vm04.stdout:4/171: symlink d2/l38 0 2026-03-10T06:22:48.458 INFO:tasks.workunit.client.0.vm04.stdout:4/172: read - d2/d16/f20 zero size 2026-03-10T06:22:48.459 INFO:tasks.workunit.client.0.vm04.stdout:4/173: write d2/d8/fa [4321038,18237] 0 2026-03-10T06:22:48.467 INFO:tasks.workunit.client.0.vm04.stdout:4/174: dread d2/d8/fa [4194304,4194304] 0 2026-03-10T06:22:48.471 INFO:tasks.workunit.client.0.vm04.stdout:2/232: rename d1/db/d20/f2f to d1/df/d2c/f44 0 2026-03-10T06:22:48.472 INFO:tasks.workunit.client.0.vm04.stdout:9/218: write d2/f1e [310190,124631] 0 2026-03-10T06:22:48.474 INFO:tasks.workunit.client.0.vm04.stdout:9/219: readlink d2/d3/l45 0 2026-03-10T06:22:48.475 INFO:tasks.workunit.client.0.vm04.stdout:9/220: chown d2/d3/d18/d39/d11/f25 21 1 2026-03-10T06:22:48.477 INFO:tasks.workunit.client.0.vm04.stdout:3/240: mknod d4/da/df/d13/d21/d2c/c4f 0 2026-03-10T06:22:48.478 INFO:tasks.workunit.client.0.vm04.stdout:3/241: 
write d4/d6/f30 [662890,47651] 0 2026-03-10T06:22:48.481 INFO:tasks.workunit.client.0.vm04.stdout:8/228: truncate df/d20/f22 1732644 0 2026-03-10T06:22:48.481 INFO:tasks.workunit.client.0.vm04.stdout:3/242: dread d4/f2d [0,4194304] 0 2026-03-10T06:22:48.482 INFO:tasks.workunit.client.0.vm04.stdout:3/243: read - d4/f49 zero size 2026-03-10T06:22:48.483 INFO:tasks.workunit.client.0.vm04.stdout:3/244: fdatasync d4/da/df/d13/f45 0 2026-03-10T06:22:48.483 INFO:tasks.workunit.client.0.vm04.stdout:8/229: dread df/d20/d25/f39 [0,4194304] 0 2026-03-10T06:22:48.485 INFO:tasks.workunit.client.0.vm04.stdout:3/245: dread - d4/f49 zero size 2026-03-10T06:22:48.486 INFO:tasks.workunit.client.0.vm04.stdout:3/246: write d4/f42 [201527,33923] 0 2026-03-10T06:22:48.491 INFO:tasks.workunit.client.0.vm04.stdout:8/230: sync 2026-03-10T06:22:48.493 INFO:tasks.workunit.client.0.vm04.stdout:5/170: link d4/ff d4/d11/f43 0 2026-03-10T06:22:48.494 INFO:tasks.workunit.client.0.vm04.stdout:3/247: dwrite d4/d6/dc/f41 [0,4194304] 0 2026-03-10T06:22:48.495 INFO:tasks.workunit.client.0.vm04.stdout:3/248: stat d4/d6/dc/l4d 0 2026-03-10T06:22:48.498 INFO:tasks.workunit.client.0.vm04.stdout:3/249: truncate d4/da/df/d13/f2b 133887 0 2026-03-10T06:22:48.512 INFO:tasks.workunit.client.0.vm04.stdout:0/194: mknod d0/d1a/d3f/c46 0 2026-03-10T06:22:48.521 INFO:tasks.workunit.client.0.vm04.stdout:6/214: fdatasync d2/d43/d2d/f42 0 2026-03-10T06:22:48.522 INFO:tasks.workunit.client.0.vm04.stdout:6/215: chown d2/d43/d2d/d30/d1f/c25 3 1 2026-03-10T06:22:48.525 INFO:tasks.workunit.client.0.vm04.stdout:7/199: creat d4/df/d12/d13/f4a x:0 0 0 2026-03-10T06:22:48.526 INFO:tasks.workunit.client.0.vm04.stdout:7/200: chown d4/df/d12/d13/d25/d30/d40 87186004 1 2026-03-10T06:22:48.527 INFO:tasks.workunit.client.0.vm04.stdout:7/201: write d4/df/d12/d13/d25/f2f [3952352,45645] 0 2026-03-10T06:22:48.547 INFO:tasks.workunit.client.0.vm04.stdout:4/175: dread d2/f30 [0,4194304] 0 2026-03-10T06:22:48.552 
INFO:tasks.workunit.client.0.vm04.stdout:4/176: dread - d2/d8/f1a zero size 2026-03-10T06:22:48.552 INFO:tasks.workunit.client.0.vm04.stdout:5/171: creat d4/d11/d2a/f44 x:0 0 0 2026-03-10T06:22:48.555 INFO:tasks.workunit.client.0.vm04.stdout:3/250: mkdir d4/da/df/d11/d50 0 2026-03-10T06:22:48.556 INFO:tasks.workunit.client.0.vm04.stdout:3/251: write d4/da/df/d13/f23 [2802497,38429] 0 2026-03-10T06:22:48.563 INFO:tasks.workunit.client.0.vm04.stdout:6/216: rmdir d2/d43/d2d/d30 39 2026-03-10T06:22:48.563 INFO:tasks.workunit.client.0.vm04.stdout:5/172: dwrite d4/d11/d2a/f3d [0,4194304] 0 2026-03-10T06:22:48.564 INFO:tasks.workunit.client.0.vm04.stdout:5/173: chown l1 8629 1 2026-03-10T06:22:48.580 INFO:tasks.workunit.client.0.vm04.stdout:9/221: mkdir d2/d8/d22/d4f 0 2026-03-10T06:22:48.585 INFO:tasks.workunit.client.0.vm04.stdout:4/177: symlink d2/d16/d2c/l39 0 2026-03-10T06:22:48.591 INFO:tasks.workunit.client.0.vm04.stdout:0/195: mkdir d0/d1a/d20/d38/d31/d47 0 2026-03-10T06:22:48.592 INFO:tasks.workunit.client.0.vm04.stdout:4/178: dread - d2/d8/f1a zero size 2026-03-10T06:22:48.592 INFO:tasks.workunit.client.0.vm04.stdout:0/196: fsync d0/f17 0 2026-03-10T06:22:48.597 INFO:tasks.workunit.client.0.vm04.stdout:3/252: mknod d4/da/df/d11/c51 0 2026-03-10T06:22:48.602 INFO:tasks.workunit.client.0.vm04.stdout:1/237: rmdir d0/d8/d52 0 2026-03-10T06:22:48.602 INFO:tasks.workunit.client.0.vm04.stdout:5/174: unlink d4/d11/d2a/c2b 0 2026-03-10T06:22:48.604 INFO:tasks.workunit.client.0.vm04.stdout:0/197: dwrite d0/d5/d25/dd/d1d/f26 [0,4194304] 0 2026-03-10T06:22:48.614 INFO:tasks.workunit.client.0.vm04.stdout:5/175: dwrite f0 [0,4194304] 0 2026-03-10T06:22:48.621 INFO:tasks.workunit.client.0.vm04.stdout:6/217: symlink d2/d8/l48 0 2026-03-10T06:22:48.653 INFO:tasks.workunit.client.0.vm04.stdout:9/222: symlink d2/d8/d3a/l50 0 2026-03-10T06:22:48.654 INFO:tasks.workunit.client.0.vm04.stdout:9/223: chown d2/d23/d24/f37 25 1 2026-03-10T06:22:48.654 
INFO:tasks.workunit.client.0.vm04.stdout:9/224: write d2/f48 [905393,41380] 0 2026-03-10T06:22:48.659 INFO:tasks.workunit.client.0.vm04.stdout:4/179: unlink d2/d8/f11 0 2026-03-10T06:22:48.685 INFO:tasks.workunit.client.0.vm04.stdout:2/233: getdents d1/df/d2c/d37/d40 0 2026-03-10T06:22:48.688 INFO:tasks.workunit.client.0.vm04.stdout:9/225: truncate d2/d3/f4 3732406 0 2026-03-10T06:22:48.692 INFO:tasks.workunit.client.0.vm04.stdout:8/231: getdents df/d20 0 2026-03-10T06:22:48.699 INFO:tasks.workunit.client.0.vm04.stdout:8/232: dread df/d15/f1b [0,4194304] 0 2026-03-10T06:22:48.709 INFO:tasks.workunit.client.0.vm04.stdout:3/253: symlink d4/da/df/d11/d4a/l52 0 2026-03-10T06:22:48.711 INFO:tasks.workunit.client.0.vm04.stdout:3/254: chown d4/da/df/d13/d21/c4c 1518629 1 2026-03-10T06:22:48.711 INFO:tasks.workunit.client.0.vm04.stdout:0/198: creat d0/d1a/d20/d3d/f48 x:0 0 0 2026-03-10T06:22:48.712 INFO:tasks.workunit.client.0.vm04.stdout:3/255: fdatasync d4/d6/dc/f37 0 2026-03-10T06:22:48.717 INFO:tasks.workunit.client.0.vm04.stdout:3/256: readlink d4/da/df/d11/d4a/l52 0 2026-03-10T06:22:48.721 INFO:tasks.workunit.client.0.vm04.stdout:6/218: mkdir d2/d43/d2d/d30/d1f/d3c/d49 0 2026-03-10T06:22:48.727 INFO:tasks.workunit.client.0.vm04.stdout:0/199: dwrite d0/d5/d25/dd/f13 [4194304,4194304] 0 2026-03-10T06:22:48.727 INFO:tasks.workunit.client.0.vm04.stdout:3/257: dread d4/d6/dc/f41 [0,4194304] 0 2026-03-10T06:22:48.735 INFO:tasks.workunit.client.0.vm04.stdout:7/202: getdents d4/df/d12/d13 0 2026-03-10T06:22:48.746 INFO:tasks.workunit.client.0.vm04.stdout:2/234: creat d1/df/d11/d14/f45 x:0 0 0 2026-03-10T06:22:48.746 INFO:tasks.workunit.client.0.vm04.stdout:7/203: dwrite d4/df/f43 [0,4194304] 0 2026-03-10T06:22:48.760 INFO:tasks.workunit.client.0.vm04.stdout:8/233: mknod df/d20/d25/c4b 0 2026-03-10T06:22:48.761 INFO:tasks.workunit.client.0.vm04.stdout:8/234: dread - df/d15/f45 zero size 2026-03-10T06:22:48.761 INFO:tasks.workunit.client.0.vm04.stdout:8/235: stat df/d20 0 
2026-03-10T06:22:48.775 INFO:tasks.workunit.client.0.vm04.stdout:2/235: dread d1/df/d11/d14/f1d [0,4194304] 0 2026-03-10T06:22:48.778 INFO:tasks.workunit.client.0.vm04.stdout:2/236: chown d1/df/d2c 538019930 1 2026-03-10T06:22:48.794 INFO:tasks.workunit.client.0.vm04.stdout:8/236: dread fe [0,4194304] 0 2026-03-10T06:22:48.804 INFO:tasks.workunit.client.0.vm04.stdout:1/238: creat d0/d3/f58 x:0 0 0 2026-03-10T06:22:48.806 INFO:tasks.workunit.client.0.vm04.stdout:3/258: rename d4/da/df/f1b to d4/d6/d38/f53 0 2026-03-10T06:22:48.806 INFO:tasks.workunit.client.0.vm04.stdout:7/204: rename d4/df/d12/d13/d25/f48 to d4/df/d12/d13/d25/f4b 0 2026-03-10T06:22:48.806 INFO:tasks.workunit.client.0.vm04.stdout:9/226: fdatasync d2/f49 0 2026-03-10T06:22:48.816 INFO:tasks.workunit.client.0.vm04.stdout:8/237: sync 2026-03-10T06:22:48.819 INFO:tasks.workunit.client.0.vm04.stdout:7/205: dread d4/df/d12/d13/f1e [0,4194304] 0 2026-03-10T06:22:48.822 INFO:tasks.workunit.client.0.vm04.stdout:1/239: dwrite d0/f29 [0,4194304] 0 2026-03-10T06:22:48.827 INFO:tasks.workunit.client.0.vm04.stdout:2/237: symlink d1/l46 0 2026-03-10T06:22:48.843 INFO:tasks.workunit.client.0.vm04.stdout:5/176: getdents d4/d6/d37 0 2026-03-10T06:22:48.862 INFO:tasks.workunit.client.0.vm04.stdout:3/259: dwrite f0 [4194304,4194304] 0 2026-03-10T06:22:48.862 INFO:tasks.workunit.client.0.vm04.stdout:3/260: chown d4/da/df/d11 597 1 2026-03-10T06:22:48.879 INFO:tasks.workunit.client.0.vm04.stdout:9/227: creat d2/d8/d3a/f51 x:0 0 0 2026-03-10T06:22:48.879 INFO:tasks.workunit.client.0.vm04.stdout:9/228: dread - d2/d3/d18/d39/d11/f35 zero size 2026-03-10T06:22:48.880 INFO:tasks.workunit.client.0.vm04.stdout:9/229: write d2/d8/d14/f28 [5192592,116502] 0 2026-03-10T06:22:48.882 INFO:tasks.workunit.client.0.vm04.stdout:8/238: creat df/d15/d2b/f4c x:0 0 0 2026-03-10T06:22:48.884 INFO:tasks.workunit.client.0.vm04.stdout:8/239: chown df/d15/d29/l3b 60241405 1 2026-03-10T06:22:48.888 INFO:tasks.workunit.client.0.vm04.stdout:4/180: 
getdents d2/d16/d2b 0 2026-03-10T06:22:48.903 INFO:tasks.workunit.client.0.vm04.stdout:6/219: truncate d2/f28 3814301 0 2026-03-10T06:22:48.905 INFO:tasks.workunit.client.0.vm04.stdout:6/220: fdatasync d2/d43/d2d/f45 0 2026-03-10T06:22:48.906 INFO:tasks.workunit.client.0.vm04.stdout:6/221: write d2/f10 [2225946,23173] 0 2026-03-10T06:22:48.910 INFO:tasks.workunit.client.0.vm04.stdout:0/200: dwrite d0/d5/d25/dd/f43 [0,4194304] 0 2026-03-10T06:22:48.915 INFO:tasks.workunit.client.0.vm04.stdout:6/222: sync 2026-03-10T06:22:48.922 INFO:tasks.workunit.client.0.vm04.stdout:2/238: symlink d1/l47 0 2026-03-10T06:22:48.927 INFO:tasks.workunit.client.0.vm04.stdout:5/177: symlink d4/d3b/l45 0 2026-03-10T06:22:48.937 INFO:tasks.workunit.client.0.vm04.stdout:9/230: rmdir d2/d8/d22 39 2026-03-10T06:22:48.938 INFO:tasks.workunit.client.0.vm04.stdout:3/261: dwrite d4/d6/d38/f53 [0,4194304] 0 2026-03-10T06:22:48.939 INFO:tasks.workunit.client.0.vm04.stdout:3/262: chown d4/da/df/d13/d21/c4c 899 1 2026-03-10T06:22:48.939 INFO:tasks.workunit.client.0.vm04.stdout:9/231: readlink d2/d3/l45 0 2026-03-10T06:22:48.942 INFO:tasks.workunit.client.0.vm04.stdout:8/240: creat df/d15/d2b/f4d x:0 0 0 2026-03-10T06:22:48.967 INFO:tasks.workunit.client.0.vm04.stdout:0/201: mknod d0/d1a/d20/d3d/c49 0 2026-03-10T06:22:48.975 INFO:tasks.workunit.client.0.vm04.stdout:6/223: creat d2/d43/d2d/d30/f4a x:0 0 0 2026-03-10T06:22:48.980 INFO:tasks.workunit.client.0.vm04.stdout:9/232: dread d2/f1c [0,4194304] 0 2026-03-10T06:22:48.986 INFO:tasks.workunit.client.0.vm04.stdout:3/263: mkdir d4/d6/d54 0 2026-03-10T06:22:48.987 INFO:tasks.workunit.client.0.vm04.stdout:3/264: fdatasync d4/d6/d38/f53 0 2026-03-10T06:22:48.989 INFO:tasks.workunit.client.0.vm04.stdout:4/181: getdents d2/d16/d2b 0 2026-03-10T06:22:48.991 INFO:tasks.workunit.client.0.vm04.stdout:4/182: truncate d2/d8/f23 66580 0 2026-03-10T06:22:48.993 INFO:tasks.workunit.client.0.vm04.stdout:0/202: mknod d0/d5/d25/dd/d1d/c4a 0 2026-03-10T06:22:49.009 
INFO:tasks.workunit.client.0.vm04.stdout:9/233: symlink d2/d3/d18/d39/l52 0 2026-03-10T06:22:49.010 INFO:tasks.workunit.client.0.vm04.stdout:9/234: chown d2/d23/d24 1 1 2026-03-10T06:22:49.011 INFO:tasks.workunit.client.0.vm04.stdout:2/239: dwrite d1/df/d11/f16 [8388608,4194304] 0 2026-03-10T06:22:49.011 INFO:tasks.workunit.client.0.vm04.stdout:9/235: dread - d2/d3/d18/d39/d11/f35 zero size 2026-03-10T06:22:49.030 INFO:tasks.workunit.client.0.vm04.stdout:3/265: rename d4/da/df/d13/d21/c4c to d4/d6/c55 0 2026-03-10T06:22:49.030 INFO:tasks.workunit.client.0.vm04.stdout:1/240: link d0/d3/f33 d0/f59 0 2026-03-10T06:22:49.030 INFO:tasks.workunit.client.0.vm04.stdout:7/206: getdents d4/df/d12/d34 0 2026-03-10T06:22:49.032 INFO:tasks.workunit.client.0.vm04.stdout:1/241: chown d0/d8/d46 0 1 2026-03-10T06:22:49.033 INFO:tasks.workunit.client.0.vm04.stdout:1/242: dread - d0/d3/f4e zero size 2026-03-10T06:22:49.034 INFO:tasks.workunit.client.0.vm04.stdout:2/240: dread d1/f5 [0,4194304] 0 2026-03-10T06:22:49.037 INFO:tasks.workunit.client.0.vm04.stdout:8/241: getdents df/d20 0 2026-03-10T06:22:49.045 INFO:tasks.workunit.client.0.vm04.stdout:3/266: mknod d4/da/df/d11/c56 0 2026-03-10T06:22:49.045 INFO:tasks.workunit.client.0.vm04.stdout:4/183: rename d2/ff to d2/d16/f3a 0 2026-03-10T06:22:49.047 INFO:tasks.workunit.client.0.vm04.stdout:5/178: getdents d4 0 2026-03-10T06:22:49.053 INFO:tasks.workunit.client.0.vm04.stdout:9/236: mkdir d2/d8/d53 0 2026-03-10T06:22:49.054 INFO:tasks.workunit.client.0.vm04.stdout:3/267: dwrite d4/d6/f30 [0,4194304] 0 2026-03-10T06:22:49.056 INFO:tasks.workunit.client.0.vm04.stdout:3/268: fdatasync d4/f7 0 2026-03-10T06:22:49.081 INFO:tasks.workunit.client.0.vm04.stdout:1/243: mknod d0/d8/d46/c5a 0 2026-03-10T06:22:49.085 INFO:tasks.workunit.client.0.vm04.stdout:2/241: mkdir d1/df/d11/d18/d48 0 2026-03-10T06:22:49.087 INFO:tasks.workunit.client.0.vm04.stdout:2/242: dwrite d1/db/f12 [0,4194304] 0 2026-03-10T06:22:49.091 
INFO:tasks.workunit.client.0.vm04.stdout:6/224: getdents d2/d43/d2d/d30 0 2026-03-10T06:22:49.108 INFO:tasks.workunit.client.0.vm04.stdout:0/203: creat d0/d1a/d20/f4b x:0 0 0 2026-03-10T06:22:49.118 INFO:tasks.workunit.client.0.vm04.stdout:3/269: creat d4/da/df/d11/f57 x:0 0 0 2026-03-10T06:22:49.118 INFO:tasks.workunit.client.0.vm04.stdout:1/244: mkdir d0/d3/d41/d4b/d5b 0 2026-03-10T06:22:49.118 INFO:tasks.workunit.client.0.vm04.stdout:1/245: truncate d0/d8/f43 515985 0 2026-03-10T06:22:49.119 INFO:tasks.workunit.client.0.vm04.stdout:7/207: creat d4/df/d12/f4c x:0 0 0 2026-03-10T06:22:49.124 INFO:tasks.workunit.client.0.vm04.stdout:4/184: symlink d2/l3b 0 2026-03-10T06:22:49.125 INFO:tasks.workunit.client.0.vm04.stdout:6/225: creat d2/d43/f4b x:0 0 0 2026-03-10T06:22:49.126 INFO:tasks.workunit.client.0.vm04.stdout:0/204: symlink d0/d1a/d3f/l4c 0 2026-03-10T06:22:49.126 INFO:tasks.workunit.client.0.vm04.stdout:0/205: fdatasync d0/d5/f1f 0 2026-03-10T06:22:49.127 INFO:tasks.workunit.client.0.vm04.stdout:0/206: chown d0/d1a/d20/d3d/c49 7 1 2026-03-10T06:22:49.128 INFO:tasks.workunit.client.0.vm04.stdout:0/207: read d0/f16 [1107352,72827] 0 2026-03-10T06:22:49.128 INFO:tasks.workunit.client.0.vm04.stdout:0/208: dread - d0/d1a/d20/d3d/f48 zero size 2026-03-10T06:22:49.131 INFO:tasks.workunit.client.0.vm04.stdout:9/237: symlink d2/l54 0 2026-03-10T06:22:49.131 INFO:tasks.workunit.client.0.vm04.stdout:3/270: sync 2026-03-10T06:22:49.134 INFO:tasks.workunit.client.0.vm04.stdout:3/271: fdatasync d4/f49 0 2026-03-10T06:22:49.135 INFO:tasks.workunit.client.0.vm04.stdout:0/209: dwrite d0/d5/f3e [0,4194304] 0 2026-03-10T06:22:49.137 INFO:tasks.workunit.client.0.vm04.stdout:1/246: truncate d0/d3/f24 861380 0 2026-03-10T06:22:49.138 INFO:tasks.workunit.client.0.vm04.stdout:2/243: unlink d1/df/d11/f15 0 2026-03-10T06:22:49.139 INFO:tasks.workunit.client.0.vm04.stdout:7/208: creat d4/df/d12/d13/d25/d28/d36/f4d x:0 0 0 2026-03-10T06:22:49.140 
INFO:tasks.workunit.client.0.vm04.stdout:8/242: getdents df/d20 0 2026-03-10T06:22:49.147 INFO:tasks.workunit.client.0.vm04.stdout:8/243: chown df/d15/d29/l3b 0 1 2026-03-10T06:22:49.150 INFO:tasks.workunit.client.0.vm04.stdout:7/209: dwrite d4/df/d12/d13/d25/d28/d36/f4d [0,4194304] 0 2026-03-10T06:22:49.150 INFO:tasks.workunit.client.0.vm04.stdout:4/185: chown d2/c22 188 1 2026-03-10T06:22:49.150 INFO:tasks.workunit.client.0.vm04.stdout:4/186: fsync d2/d16/f20 0 2026-03-10T06:22:49.153 INFO:tasks.workunit.client.0.vm04.stdout:2/244: dread d1/df/f24 [0,4194304] 0 2026-03-10T06:22:49.153 INFO:tasks.workunit.client.0.vm04.stdout:8/244: write df/f3f [247066,48859] 0 2026-03-10T06:22:49.155 INFO:tasks.workunit.client.0.vm04.stdout:7/210: write d4/df/d12/d21/f26 [883783,120719] 0 2026-03-10T06:22:49.162 INFO:tasks.workunit.client.0.vm04.stdout:6/226: symlink d2/d43/d2d/d30/d1f/d3c/l4c 0 2026-03-10T06:22:49.162 INFO:tasks.workunit.client.0.vm04.stdout:5/179: link d4/d11/c1d d4/c46 0 2026-03-10T06:22:49.163 INFO:tasks.workunit.client.0.vm04.stdout:9/238: mkdir d2/d3/d18/d39/d46/d55 0 2026-03-10T06:22:49.164 INFO:tasks.workunit.client.0.vm04.stdout:9/239: chown d2/d8/d14/f27 452145 1 2026-03-10T06:22:49.172 INFO:tasks.workunit.client.0.vm04.stdout:8/245: dread df/d15/d2b/f4a [0,4194304] 0 2026-03-10T06:22:49.173 INFO:tasks.workunit.client.0.vm04.stdout:8/246: chown df/f1d 103593195 1 2026-03-10T06:22:49.185 INFO:tasks.workunit.client.0.vm04.stdout:4/187: read - d2/f12 zero size 2026-03-10T06:22:49.194 INFO:tasks.workunit.client.0.vm04.stdout:0/210: dwrite d0/f16 [0,4194304] 0 2026-03-10T06:22:49.194 INFO:tasks.workunit.client.0.vm04.stdout:3/272: dwrite d4/d6/f30 [4194304,4194304] 0 2026-03-10T06:22:49.195 INFO:tasks.workunit.client.0.vm04.stdout:0/211: chown d0/f16 738681189 1 2026-03-10T06:22:49.196 INFO:tasks.workunit.client.0.vm04.stdout:3/273: stat d4/da/df/d11/c26 0 2026-03-10T06:22:49.205 INFO:tasks.workunit.client.0.vm04.stdout:2/245: creat d1/db/d20/f49 x:0 0 0 
2026-03-10T06:22:49.209 INFO:tasks.workunit.client.0.vm04.stdout:7/211: rename d4/df/f43 to d4/df/d12/d13/d25/d28/d3a/f4e 0 2026-03-10T06:22:49.217 INFO:tasks.workunit.client.0.vm04.stdout:1/247: link d0/d8/d46/f57 d0/d3/d41/d4b/d5b/f5c 0 2026-03-10T06:22:49.217 INFO:tasks.workunit.client.0.vm04.stdout:6/227: creat d2/d43/d2d/d30/d34/f4d x:0 0 0 2026-03-10T06:22:49.218 INFO:tasks.workunit.client.0.vm04.stdout:9/240: truncate d2/d3/d18/d39/d11/f2d 3524980 0 2026-03-10T06:22:49.218 INFO:tasks.workunit.client.0.vm04.stdout:6/228: chown d2/d43/f4b 26 1 2026-03-10T06:22:49.219 INFO:tasks.workunit.client.0.vm04.stdout:6/229: chown d2/d43/f3b 329 1 2026-03-10T06:22:49.219 INFO:tasks.workunit.client.0.vm04.stdout:9/241: stat d2/f1c 0 2026-03-10T06:22:49.219 INFO:tasks.workunit.client.0.vm04.stdout:7/212: truncate d4/df/d12/d13/d25/d28/d36/f41 1015962 0 2026-03-10T06:22:49.225 INFO:tasks.workunit.client.0.vm04.stdout:7/213: read d4/df/d12/d21/f26 [195235,72415] 0 2026-03-10T06:22:49.232 INFO:tasks.workunit.client.0.vm04.stdout:3/274: dread d4/da/df/d13/d21/f3a [0,4194304] 0 2026-03-10T06:22:49.232 INFO:tasks.workunit.client.0.vm04.stdout:1/248: dwrite d0/d3/f24 [0,4194304] 0 2026-03-10T06:22:49.233 INFO:tasks.workunit.client.0.vm04.stdout:9/242: dwrite d2/d3/d18/d34/f47 [0,4194304] 0 2026-03-10T06:22:49.237 INFO:tasks.workunit.client.0.vm04.stdout:3/275: readlink d4/da/df/d11/d4a/l52 0 2026-03-10T06:22:49.238 INFO:tasks.workunit.client.0.vm04.stdout:9/243: write d2/d8/d14/f28 [2919043,19685] 0 2026-03-10T06:22:49.253 INFO:tasks.workunit.client.0.vm04.stdout:3/276: dread d4/d6/dc/f22 [0,4194304] 0 2026-03-10T06:22:49.253 INFO:tasks.workunit.client.0.vm04.stdout:2/246: creat d1/df/d2c/f4a x:0 0 0 2026-03-10T06:22:49.257 INFO:tasks.workunit.client.0.vm04.stdout:2/247: chown d1/df 189 1 2026-03-10T06:22:49.258 INFO:tasks.workunit.client.0.vm04.stdout:5/180: write d4/d11/d2a/f3d [2352489,17184] 0 2026-03-10T06:22:49.262 INFO:tasks.workunit.client.0.vm04.stdout:7/214: dread 
d4/df/d12/f18 [0,4194304] 0 2026-03-10T06:22:49.269 INFO:tasks.workunit.client.0.vm04.stdout:1/249: readlink d0/d3/ld 0 2026-03-10T06:22:49.283 INFO:tasks.workunit.client.0.vm04.stdout:1/250: dread d0/d3/f37 [0,4194304] 0 2026-03-10T06:22:49.291 INFO:tasks.workunit.client.0.vm04.stdout:3/277: creat d4/da/df/d13/d21/d32/f58 x:0 0 0 2026-03-10T06:22:49.294 INFO:tasks.workunit.client.0.vm04.stdout:2/248: mknod d1/df/d11/d14/c4b 0 2026-03-10T06:22:49.310 INFO:tasks.workunit.client.0.vm04.stdout:8/247: write df/d20/d25/f39 [311481,35530] 0 2026-03-10T06:22:49.311 INFO:tasks.workunit.client.0.vm04.stdout:8/248: write df/f11 [3675975,106290] 0 2026-03-10T06:22:49.312 INFO:tasks.workunit.client.0.vm04.stdout:8/249: write df/d20/f28 [4339954,81550] 0 2026-03-10T06:22:49.312 INFO:tasks.workunit.client.0.vm04.stdout:8/250: chown df/f1f 8657 1 2026-03-10T06:22:49.318 INFO:tasks.workunit.client.0.vm04.stdout:6/230: symlink d2/d43/d2d/d30/d1f/d3c/d49/l4e 0 2026-03-10T06:22:49.324 INFO:tasks.workunit.client.0.vm04.stdout:4/188: truncate d2/d16/d2c/f34 807967 0 2026-03-10T06:22:49.324 INFO:tasks.workunit.client.0.vm04.stdout:4/189: write d2/d16/d2b/f28 [532451,109828] 0 2026-03-10T06:22:49.329 INFO:tasks.workunit.client.0.vm04.stdout:0/212: truncate d0/f16 1123776 0 2026-03-10T06:22:49.335 INFO:tasks.workunit.client.0.vm04.stdout:3/278: unlink d4/c14 0 2026-03-10T06:22:49.347 INFO:tasks.workunit.client.0.vm04.stdout:5/181: creat d4/d6/f47 x:0 0 0 2026-03-10T06:22:49.347 INFO:tasks.workunit.client.0.vm04.stdout:8/251: creat df/d20/d25/d30/f4e x:0 0 0 2026-03-10T06:22:49.350 INFO:tasks.workunit.client.0.vm04.stdout:6/231: unlink d2/d43/d2d/f45 0 2026-03-10T06:22:49.351 INFO:tasks.workunit.client.0.vm04.stdout:8/252: dwrite df/f17 [0,4194304] 0 2026-03-10T06:22:49.351 INFO:tasks.workunit.client.0.vm04.stdout:0/213: mkdir d0/d1a/d4d 0 2026-03-10T06:22:49.351 INFO:tasks.workunit.client.0.vm04.stdout:1/251: mknod d0/d3/c5d 0 2026-03-10T06:22:49.351 
INFO:tasks.workunit.client.0.vm04.stdout:0/214: readlink d0/d5/d25/dd/d1d/l32 0 2026-03-10T06:22:49.352 INFO:tasks.workunit.client.0.vm04.stdout:8/253: write df/f3f [176115,119303] 0 2026-03-10T06:22:49.361 INFO:tasks.workunit.client.0.vm04.stdout:3/279: dwrite d4/da/df/d13/d21/f3a [0,4194304] 0 2026-03-10T06:22:49.363 INFO:tasks.workunit.client.0.vm04.stdout:4/190: dwrite d2/d8/fa [0,4194304] 0 2026-03-10T06:22:49.393 INFO:tasks.workunit.client.0.vm04.stdout:3/280: dread d4/da/df/d13/f16 [0,4194304] 0 2026-03-10T06:22:49.399 INFO:tasks.workunit.client.0.vm04.stdout:9/244: write d2/d3/d18/d39/fd [285393,62671] 0 2026-03-10T06:22:49.401 INFO:tasks.workunit.client.0.vm04.stdout:5/182: fsync d4/d3b/f41 0 2026-03-10T06:22:49.403 INFO:tasks.workunit.client.0.vm04.stdout:9/245: dwrite d2/d23/d24/f29 [4194304,4194304] 0 2026-03-10T06:22:49.417 INFO:tasks.workunit.client.0.vm04.stdout:0/215: creat d0/d5/f4e x:0 0 0 2026-03-10T06:22:49.418 INFO:tasks.workunit.client.0.vm04.stdout:0/216: read d0/d5/f41 [3721986,110533] 0 2026-03-10T06:22:49.418 INFO:tasks.workunit.client.0.vm04.stdout:7/215: link d4/df/c1f d4/df/c4f 0 2026-03-10T06:22:49.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:49 vm04.local ceph-mon[51058]: pgmap v23: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 19 MiB/s rd, 69 MiB/s wr, 214 op/s 2026-03-10T06:22:49.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:49 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:49.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:49 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:49.439 INFO:tasks.workunit.client.0.vm04.stdout:5/183: dread - d4/f35 zero size 2026-03-10T06:22:49.440 INFO:tasks.workunit.client.0.vm04.stdout:6/232: symlink d2/d43/d2d/l4f 0 2026-03-10T06:22:49.440 INFO:tasks.workunit.client.0.vm04.stdout:6/233: dread - d2/d43/f35 zero size 2026-03-10T06:22:49.449 
INFO:tasks.workunit.client.0.vm04.stdout:3/281: symlink d4/da/df/d13/d21/d32/d4e/l59 0 2026-03-10T06:22:49.454 INFO:tasks.workunit.client.0.vm04.stdout:0/217: dread d0/f14 [0,4194304] 0 2026-03-10T06:22:49.455 INFO:tasks.workunit.client.0.vm04.stdout:5/184: mkdir d4/d6/d48 0 2026-03-10T06:22:49.455 INFO:tasks.workunit.client.0.vm04.stdout:6/234: creat d2/d3a/f50 x:0 0 0 2026-03-10T06:22:49.462 INFO:tasks.workunit.client.0.vm04.stdout:6/235: dread d2/d8/f9 [0,4194304] 0 2026-03-10T06:22:49.463 INFO:tasks.workunit.client.0.vm04.stdout:6/236: write d2/d43/d2d/d30/d34/f4d [485557,83226] 0 2026-03-10T06:22:49.471 INFO:tasks.workunit.client.0.vm04.stdout:2/249: dwrite d1/df/f24 [0,4194304] 0 2026-03-10T06:22:49.473 INFO:tasks.workunit.client.0.vm04.stdout:2/250: chown d1/db/fe 67291844 1 2026-03-10T06:22:49.479 INFO:tasks.workunit.client.0.vm04.stdout:8/254: creat df/f4f x:0 0 0 2026-03-10T06:22:49.479 INFO:tasks.workunit.client.0.vm04.stdout:8/255: read - df/d15/f45 zero size 2026-03-10T06:22:49.480 INFO:tasks.workunit.client.0.vm04.stdout:7/216: mkdir d4/df/d12/d13/d25/d30/d40/d50 0 2026-03-10T06:22:49.482 INFO:tasks.workunit.client.0.vm04.stdout:3/282: mkdir d4/da/df/d11/d5a 0 2026-03-10T06:22:49.486 INFO:tasks.workunit.client.0.vm04.stdout:5/185: symlink d4/d11/d2a/l49 0 2026-03-10T06:22:49.489 INFO:tasks.workunit.client.0.vm04.stdout:1/252: link d0/c4a d0/c5e 0 2026-03-10T06:22:49.494 INFO:tasks.workunit.client.0.vm04.stdout:4/191: rename d2/d16/d2c/f34 to d2/d16/d2b/f3c 0 2026-03-10T06:22:49.497 INFO:tasks.workunit.client.0.vm04.stdout:4/192: chown d2/f14 0 1 2026-03-10T06:22:49.500 INFO:tasks.workunit.client.0.vm04.stdout:2/251: unlink d1/df/d2c/f32 0 2026-03-10T06:22:49.505 INFO:tasks.workunit.client.0.vm04.stdout:8/256: write df/d20/f42 [2967682,118027] 0 2026-03-10T06:22:49.510 INFO:tasks.workunit.client.0.vm04.stdout:3/283: fdatasync d4/da/df/d13/f16 0 2026-03-10T06:22:49.511 INFO:tasks.workunit.client.0.vm04.stdout:3/284: dread d4/f2d [0,4194304] 0 
2026-03-10T06:22:49.518 INFO:tasks.workunit.client.0.vm04.stdout:1/253: unlink d0/f3c 0 2026-03-10T06:22:49.518 INFO:tasks.workunit.client.0.vm04.stdout:9/246: rename d2/d3/f36 to d2/d3/d18/d39/d11/f56 0 2026-03-10T06:22:49.519 INFO:tasks.workunit.client.0.vm04.stdout:1/254: read d0/d8/f32 [2887407,123782] 0 2026-03-10T06:22:49.520 INFO:tasks.workunit.client.0.vm04.stdout:4/193: creat d2/d16/d2b/f3d x:0 0 0 2026-03-10T06:22:49.520 INFO:tasks.workunit.client.0.vm04.stdout:2/252: creat d1/df/d2c/f4c x:0 0 0 2026-03-10T06:22:49.521 INFO:tasks.workunit.client.0.vm04.stdout:4/194: read - d2/d16/d2b/f3d zero size 2026-03-10T06:22:49.521 INFO:tasks.workunit.client.0.vm04.stdout:2/253: fsync d1/df/d11/d14/f1d 0 2026-03-10T06:22:49.523 INFO:tasks.workunit.client.0.vm04.stdout:5/186: sync 2026-03-10T06:22:49.539 INFO:tasks.workunit.client.0.vm04.stdout:9/247: creat d2/d3/f57 x:0 0 0 2026-03-10T06:22:49.540 INFO:tasks.workunit.client.0.vm04.stdout:9/248: readlink d2/d8/d14/l3d 0 2026-03-10T06:22:49.544 INFO:tasks.workunit.client.0.vm04.stdout:0/218: dwrite d0/d5/d25/f3c [0,4194304] 0 2026-03-10T06:22:49.544 INFO:tasks.workunit.client.0.vm04.stdout:0/219: fdatasync d0/f14 0 2026-03-10T06:22:49.546 INFO:tasks.workunit.client.0.vm04.stdout:0/220: dread d0/d5/d25/dd/d1d/f30 [0,4194304] 0 2026-03-10T06:22:49.549 INFO:tasks.workunit.client.0.vm04.stdout:0/221: dread d0/d5/d25/dd/d1d/f26 [0,4194304] 0 2026-03-10T06:22:49.553 INFO:tasks.workunit.client.0.vm04.stdout:1/255: chown d0/d3/d41/c4c 1813994 1 2026-03-10T06:22:49.553 INFO:tasks.workunit.client.0.vm04.stdout:1/256: chown d0/d3/d41/f47 79405 1 2026-03-10T06:22:49.558 INFO:tasks.workunit.client.0.vm04.stdout:7/217: link d4/df/d12/d13/f27 d4/f51 0 2026-03-10T06:22:49.558 INFO:tasks.workunit.client.0.vm04.stdout:7/218: fsync d4/df/d12/d34/f46 0 2026-03-10T06:22:49.561 INFO:tasks.workunit.client.0.vm04.stdout:3/285: unlink d4/da/df/c3f 0 2026-03-10T06:22:49.566 INFO:tasks.workunit.client.0.vm04.stdout:4/195: rename d2/d16/d2c/l39 
to d2/d16/d2c/l3e 0 2026-03-10T06:22:49.566 INFO:tasks.workunit.client.0.vm04.stdout:2/254: readlink d1/df/d11/l34 0 2026-03-10T06:22:49.568 INFO:tasks.workunit.client.0.vm04.stdout:5/187: write d4/f26 [266731,93316] 0 2026-03-10T06:22:49.582 INFO:tasks.workunit.client.0.vm04.stdout:4/196: dread d2/d16/d2b/f26 [0,4194304] 0 2026-03-10T06:22:49.582 INFO:tasks.workunit.client.0.vm04.stdout:4/197: chown d2/l3b 944896637 1 2026-03-10T06:22:49.586 INFO:tasks.workunit.client.0.vm04.stdout:8/257: link df/d20/d25/l41 df/d15/d29/l50 0 2026-03-10T06:22:49.593 INFO:tasks.workunit.client.0.vm04.stdout:1/257: unlink d0/ff 0 2026-03-10T06:22:49.594 INFO:tasks.workunit.client.0.vm04.stdout:1/258: dread d0/d8/f43 [0,4194304] 0 2026-03-10T06:22:49.598 INFO:tasks.workunit.client.0.vm04.stdout:0/222: dread d0/d5/fb [0,4194304] 0 2026-03-10T06:22:49.599 INFO:tasks.workunit.client.0.vm04.stdout:8/258: sync 2026-03-10T06:22:49.599 INFO:tasks.workunit.client.0.vm04.stdout:0/223: fsync d0/d5/f1f 0 2026-03-10T06:22:49.602 INFO:tasks.workunit.client.0.vm04.stdout:0/224: dwrite d0/d1a/f2f [0,4194304] 0 2026-03-10T06:22:49.608 INFO:tasks.workunit.client.0.vm04.stdout:0/225: dread d0/d5/d25/f3c [0,4194304] 0 2026-03-10T06:22:49.609 INFO:tasks.workunit.client.0.vm04.stdout:0/226: write d0/d1a/f27 [2225039,32736] 0 2026-03-10T06:22:49.612 INFO:tasks.workunit.client.0.vm04.stdout:0/227: dread d0/f17 [0,4194304] 0 2026-03-10T06:22:49.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:49 vm06.local ceph-mon[58974]: pgmap v23: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 112 GiB / 120 GiB avail; 19 MiB/s rd, 69 MiB/s wr, 214 op/s 2026-03-10T06:22:49.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:49 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:49.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:49 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:49.618 
INFO:tasks.workunit.client.0.vm04.stdout:0/228: dwrite d0/d1a/d20/d3d/f48 [0,4194304] 0 2026-03-10T06:22:49.619 INFO:tasks.workunit.client.0.vm04.stdout:0/229: read d0/d1a/d20/d3d/f48 [4022869,30820] 0 2026-03-10T06:22:49.620 INFO:tasks.workunit.client.0.vm04.stdout:0/230: truncate d0/d5/d25/dd/d1d/f30 4843621 0 2026-03-10T06:22:49.623 INFO:tasks.workunit.client.0.vm04.stdout:7/219: creat d4/df/d12/d13/d25/d30/d40/f52 x:0 0 0 2026-03-10T06:22:49.626 INFO:tasks.workunit.client.0.vm04.stdout:7/220: dwrite d4/df/f29 [0,4194304] 0 2026-03-10T06:22:49.627 INFO:tasks.workunit.client.0.vm04.stdout:0/231: dread d0/d5/f41 [0,4194304] 0 2026-03-10T06:22:49.645 INFO:tasks.workunit.client.0.vm04.stdout:6/237: rename d2/d43/d2d/d30/d1f/d3c/d49 to d2/d43/d2d/d51 0 2026-03-10T06:22:49.653 INFO:tasks.workunit.client.0.vm04.stdout:2/255: unlink d1/db/d20/f31 0 2026-03-10T06:22:49.654 INFO:tasks.workunit.client.0.vm04.stdout:5/188: symlink d4/d6/l4a 0 2026-03-10T06:22:49.660 INFO:tasks.workunit.client.0.vm04.stdout:1/259: mknod d0/d3/d41/c5f 0 2026-03-10T06:22:49.660 INFO:tasks.workunit.client.0.vm04.stdout:1/260: chown d0/d8/c48 72104 1 2026-03-10T06:22:49.661 INFO:tasks.workunit.client.0.vm04.stdout:8/259: creat df/d20/d25/d30/f51 x:0 0 0 2026-03-10T06:22:49.679 INFO:tasks.workunit.client.0.vm04.stdout:4/198: dwrite d2/d16/d2b/f15 [0,4194304] 0 2026-03-10T06:22:49.680 INFO:tasks.workunit.client.0.vm04.stdout:4/199: chown d2/d32/l36 1 1 2026-03-10T06:22:49.684 INFO:tasks.workunit.client.0.vm04.stdout:0/232: creat d0/d1a/d3f/f4f x:0 0 0 2026-03-10T06:22:49.687 INFO:tasks.workunit.client.0.vm04.stdout:3/286: mkdir d4/da/df/d11/d5a/d5b 0 2026-03-10T06:22:49.688 INFO:tasks.workunit.client.0.vm04.stdout:6/238: creat d2/d43/d2d/d30/d34/f52 x:0 0 0 2026-03-10T06:22:49.688 INFO:tasks.workunit.client.0.vm04.stdout:6/239: readlink d2/d8/l48 0 2026-03-10T06:22:49.694 INFO:tasks.workunit.client.0.vm04.stdout:9/249: symlink d2/d8/d22/d4f/l58 0 2026-03-10T06:22:49.695 
INFO:tasks.workunit.client.0.vm04.stdout:9/250: write d2/d3/f57 [519340,118215] 0 2026-03-10T06:22:49.698 INFO:tasks.workunit.client.0.vm04.stdout:9/251: dwrite d2/d3/f12 [0,4194304] 0 2026-03-10T06:22:49.709 INFO:tasks.workunit.client.0.vm04.stdout:8/260: truncate df/d15/f1b 765400 0 2026-03-10T06:22:49.711 INFO:tasks.workunit.client.0.vm04.stdout:5/189: dread d4/f13 [0,4194304] 0 2026-03-10T06:22:49.713 INFO:tasks.workunit.client.0.vm04.stdout:8/261: dwrite df/d20/d25/d30/f4e [0,4194304] 0 2026-03-10T06:22:49.715 INFO:tasks.workunit.client.0.vm04.stdout:8/262: write df/f11 [475891,57663] 0 2026-03-10T06:22:49.731 INFO:tasks.workunit.client.0.vm04.stdout:4/200: mkdir d2/d16/d31/d3f 0 2026-03-10T06:22:49.734 INFO:tasks.workunit.client.0.vm04.stdout:4/201: dread d2/f30 [0,4194304] 0 2026-03-10T06:22:49.741 INFO:tasks.workunit.client.0.vm04.stdout:0/233: creat d0/d5/d25/dd/d3a/f50 x:0 0 0 2026-03-10T06:22:49.741 INFO:tasks.workunit.client.0.vm04.stdout:3/287: creat d4/d6/dc/f5c x:0 0 0 2026-03-10T06:22:49.744 INFO:tasks.workunit.client.0.vm04.stdout:6/240: rename d2/d43/d2d/l2e to d2/d43/d2d/d30/d1f/l53 0 2026-03-10T06:22:49.750 INFO:tasks.workunit.client.0.vm04.stdout:6/241: dwrite d2/d43/f35 [0,4194304] 0 2026-03-10T06:22:49.750 INFO:tasks.workunit.client.0.vm04.stdout:6/242: fsync d2/d43/f4b 0 2026-03-10T06:22:49.751 INFO:tasks.workunit.client.0.vm04.stdout:6/243: chown d2/l2a 416565671 1 2026-03-10T06:22:49.754 INFO:tasks.workunit.client.0.vm04.stdout:6/244: dread d2/d43/d2d/d30/f32 [0,4194304] 0 2026-03-10T06:22:49.758 INFO:tasks.workunit.client.0.vm04.stdout:6/245: dwrite d2/d43/f31 [0,4194304] 0 2026-03-10T06:22:49.761 INFO:tasks.workunit.client.0.vm04.stdout:6/246: write d2/d8/f11 [669707,81021] 0 2026-03-10T06:22:49.770 INFO:tasks.workunit.client.0.vm04.stdout:2/256: mknod d1/df/d11/c4d 0 2026-03-10T06:22:49.780 INFO:tasks.workunit.client.0.vm04.stdout:9/252: mknod d2/d23/c59 0 2026-03-10T06:22:49.780 INFO:tasks.workunit.client.0.vm04.stdout:5/190: chown 
d4/c24 62992951 1 2026-03-10T06:22:49.780 INFO:tasks.workunit.client.0.vm04.stdout:5/191: truncate d4/d11/d2a/f2c 815681 0 2026-03-10T06:22:49.780 INFO:tasks.workunit.client.0.vm04.stdout:5/192: fsync d4/d11/f18 0 2026-03-10T06:22:49.783 INFO:tasks.workunit.client.0.vm04.stdout:7/221: truncate d4/df/d12/f18 1401256 0 2026-03-10T06:22:49.784 INFO:tasks.workunit.client.0.vm04.stdout:7/222: chown d4/df/d12/d13/d25/d28/l31 7 1 2026-03-10T06:22:49.785 INFO:tasks.workunit.client.0.vm04.stdout:7/223: chown d4/df/d12/f14 1063361129 1 2026-03-10T06:22:49.793 INFO:tasks.workunit.client.0.vm04.stdout:0/234: write d0/d1a/f3b [697718,129538] 0 2026-03-10T06:22:49.810 INFO:tasks.workunit.client.0.vm04.stdout:0/235: chown d0/f44 23 1 2026-03-10T06:22:49.810 INFO:tasks.workunit.client.0.vm04.stdout:3/288: symlink d4/da/df/d13/d21/d32/d39/l5d 0 2026-03-10T06:22:49.810 INFO:tasks.workunit.client.0.vm04.stdout:6/247: rmdir d2/d43/d2d/d30/d1f 39 2026-03-10T06:22:49.810 INFO:tasks.workunit.client.0.vm04.stdout:2/257: mkdir d1/df/d11/d14/d4e 0 2026-03-10T06:22:49.810 INFO:tasks.workunit.client.0.vm04.stdout:2/258: stat d1/l3c 0 2026-03-10T06:22:49.814 INFO:tasks.workunit.client.0.vm04.stdout:5/193: sync 2026-03-10T06:22:49.814 INFO:tasks.workunit.client.0.vm04.stdout:3/289: sync 2026-03-10T06:22:49.814 INFO:tasks.workunit.client.0.vm04.stdout:5/194: chown d4/d11/f18 239 1 2026-03-10T06:22:49.815 INFO:tasks.workunit.client.0.vm04.stdout:5/195: fdatasync d4/d3b/f41 0 2026-03-10T06:22:49.815 INFO:tasks.workunit.client.0.vm04.stdout:3/290: truncate d4/da/df/d11/f57 349918 0 2026-03-10T06:22:49.821 INFO:tasks.workunit.client.0.vm04.stdout:3/291: dread d4/da/df/d13/d21/f3a [0,4194304] 0 2026-03-10T06:22:49.829 INFO:tasks.workunit.client.0.vm04.stdout:1/261: write d0/d3/f33 [2240372,79644] 0 2026-03-10T06:22:49.840 INFO:tasks.workunit.client.0.vm04.stdout:8/263: fsync df/d15/f1b 0 2026-03-10T06:22:49.842 INFO:tasks.workunit.client.0.vm04.stdout:0/236: symlink d0/d1a/d20/d38/d31/l51 0 
2026-03-10T06:22:49.852 INFO:tasks.workunit.client.0.vm04.stdout:0/237: dwrite d0/d5/f1f [0,4194304] 0 2026-03-10T06:22:49.886 INFO:tasks.workunit.client.0.vm04.stdout:8/264: mknod df/d15/d2b/c52 0 2026-03-10T06:22:49.893 INFO:tasks.workunit.client.0.vm04.stdout:9/253: truncate d2/d3/f57 390000 0 2026-03-10T06:22:49.893 INFO:tasks.workunit.client.0.vm04.stdout:8/265: dwrite df/d15/f45 [0,4194304] 0 2026-03-10T06:22:49.896 INFO:tasks.workunit.client.0.vm04.stdout:8/266: chown df/c14 3893992 1 2026-03-10T06:22:49.898 INFO:tasks.workunit.client.0.vm04.stdout:7/224: dwrite d4/df/d12/f18 [0,4194304] 0 2026-03-10T06:22:49.903 INFO:tasks.workunit.client.0.vm04.stdout:7/225: chown d4/df/d12/d13/d25/f2e 8 1 2026-03-10T06:22:49.905 INFO:tasks.workunit.client.0.vm04.stdout:7/226: write d4/df/d12/d21/f2a [470761,110939] 0 2026-03-10T06:22:49.924 INFO:tasks.workunit.client.0.vm04.stdout:9/254: mkdir d2/d23/d24/d5a 0 2026-03-10T06:22:49.929 INFO:tasks.workunit.client.0.vm04.stdout:4/202: getdents d2/d16/d2c 0 2026-03-10T06:22:49.938 INFO:tasks.workunit.client.0.vm04.stdout:7/227: symlink d4/df/d12/d21/l53 0 2026-03-10T06:22:49.943 INFO:tasks.workunit.client.0.vm04.stdout:3/292: creat d4/da/df/f5e x:0 0 0 2026-03-10T06:22:49.944 INFO:tasks.workunit.client.0.vm04.stdout:3/293: truncate d4/da/df/d13/f47 1134598 0 2026-03-10T06:22:49.961 INFO:tasks.workunit.client.0.vm04.stdout:5/196: dread d4/f21 [0,4194304] 0 2026-03-10T06:22:49.963 INFO:tasks.workunit.client.0.vm04.stdout:6/248: rename d2/d43/d2d/d30/c44 to d2/d43/d2d/c54 0 2026-03-10T06:22:49.964 INFO:tasks.workunit.client.0.vm04.stdout:0/238: truncate d0/d1a/d20/d3d/f48 901788 0 2026-03-10T06:22:49.965 INFO:tasks.workunit.client.0.vm04.stdout:9/255: creat d2/d3/d18/d39/d11/d42/f5b x:0 0 0 2026-03-10T06:22:49.966 INFO:tasks.workunit.client.0.vm04.stdout:9/256: write d2/f1e [824427,53102] 0 2026-03-10T06:22:49.966 INFO:tasks.workunit.client.0.vm04.stdout:9/257: chown d2/d8/d14/f40 0 1 2026-03-10T06:22:49.972 
INFO:tasks.workunit.client.0.vm04.stdout:8/267: mkdir df/d53 0 2026-03-10T06:22:49.972 INFO:tasks.workunit.client.0.vm04.stdout:4/203: fdatasync d2/f14 0 2026-03-10T06:22:49.973 INFO:tasks.workunit.client.0.vm04.stdout:3/294: stat d4/da/df/c33 0 2026-03-10T06:22:49.973 INFO:tasks.workunit.client.0.vm04.stdout:1/262: getdents d0 0 2026-03-10T06:22:49.976 INFO:tasks.workunit.client.0.vm04.stdout:3/295: write d4/da/df/d13/f4b [509277,782] 0 2026-03-10T06:22:49.977 INFO:tasks.workunit.client.0.vm04.stdout:4/204: dwrite d2/f14 [0,4194304] 0 2026-03-10T06:22:49.977 INFO:tasks.workunit.client.0.vm04.stdout:3/296: write d4/da/df/f5e [141647,90991] 0 2026-03-10T06:22:49.981 INFO:tasks.workunit.client.0.vm04.stdout:3/297: truncate d4/d6/dc/f41 4520275 0 2026-03-10T06:22:49.985 INFO:tasks.workunit.client.0.vm04.stdout:6/249: symlink d2/d37/l55 0 2026-03-10T06:22:49.995 INFO:tasks.workunit.client.0.vm04.stdout:4/205: mkdir d2/d8/d40 0 2026-03-10T06:22:49.995 INFO:tasks.workunit.client.0.vm04.stdout:8/268: creat df/d20/d25/f54 x:0 0 0 2026-03-10T06:22:49.996 INFO:tasks.workunit.client.0.vm04.stdout:4/206: chown d2/d8/f35 4117902 1 2026-03-10T06:22:49.996 INFO:tasks.workunit.client.0.vm04.stdout:5/197: mknod d4/c4b 0 2026-03-10T06:22:49.997 INFO:tasks.workunit.client.0.vm04.stdout:5/198: fsync d4/d11/d2a/d38/f3e 0 2026-03-10T06:22:50.001 INFO:tasks.workunit.client.0.vm04.stdout:0/239: link d0/d1a/f3b d0/d1a/d20/d38/f52 0 2026-03-10T06:22:50.002 INFO:tasks.workunit.client.0.vm04.stdout:9/258: mknod d2/d8/c5c 0 2026-03-10T06:22:50.002 INFO:tasks.workunit.client.0.vm04.stdout:2/259: rename d1/db/ld to d1/df/d11/l4f 0 2026-03-10T06:22:50.003 INFO:tasks.workunit.client.0.vm04.stdout:8/269: write df/f1f [3397829,125663] 0 2026-03-10T06:22:50.013 INFO:tasks.workunit.client.0.vm04.stdout:8/270: dwrite df/f11 [4194304,4194304] 0 2026-03-10T06:22:50.021 INFO:tasks.workunit.client.0.vm04.stdout:0/240: sync 2026-03-10T06:22:50.021 INFO:tasks.workunit.client.0.vm04.stdout:4/207: dread d2/f4 
[0,4194304] 0 2026-03-10T06:22:50.021 INFO:tasks.workunit.client.0.vm04.stdout:0/241: stat d0/d5/d25/dd/d1d/f26 0 2026-03-10T06:22:50.022 INFO:tasks.workunit.client.0.vm04.stdout:4/208: stat d2/d16/d2b/f18 0 2026-03-10T06:22:50.023 INFO:tasks.workunit.client.0.vm04.stdout:8/271: dread df/d15/d2b/f4a [0,4194304] 0 2026-03-10T06:22:50.025 INFO:tasks.workunit.client.0.vm04.stdout:4/209: truncate d2/d16/f20 149811 0 2026-03-10T06:22:50.026 INFO:tasks.workunit.client.0.vm04.stdout:3/298: link d4/da/df/d11/c51 d4/d6/d54/c5f 0 2026-03-10T06:22:50.027 INFO:tasks.workunit.client.0.vm04.stdout:4/210: chown d2/d16/d31/l33 4 1 2026-03-10T06:22:50.029 INFO:tasks.workunit.client.0.vm04.stdout:4/211: chown d2/d8/fa 3862 1 2026-03-10T06:22:50.030 INFO:tasks.workunit.client.0.vm04.stdout:3/299: dread d4/d6/dc/f1f [0,4194304] 0 2026-03-10T06:22:50.030 INFO:tasks.workunit.client.0.vm04.stdout:1/263: rename d0/d3/c3f to d0/d8/d46/c60 0 2026-03-10T06:22:50.030 INFO:tasks.workunit.client.0.vm04.stdout:5/199: dwrite d4/f19 [0,4194304] 0 2026-03-10T06:22:50.036 INFO:tasks.workunit.client.0.vm04.stdout:9/259: mkdir d2/d8/d5d 0 2026-03-10T06:22:50.037 INFO:tasks.workunit.client.0.vm04.stdout:7/228: write d4/df/d12/d13/f27 [420325,97578] 0 2026-03-10T06:22:50.037 INFO:tasks.workunit.client.0.vm04.stdout:7/229: chown d4/df/d12/c47 13 1 2026-03-10T06:22:50.038 INFO:tasks.workunit.client.0.vm04.stdout:7/230: stat d4/c1d 0 2026-03-10T06:22:50.042 INFO:tasks.workunit.client.0.vm04.stdout:2/260: symlink d1/df/d11/d18/d48/l50 0 2026-03-10T06:22:50.042 INFO:tasks.workunit.client.0.vm04.stdout:7/231: read d4/df/d12/d13/d25/f2e [4919005,66441] 0 2026-03-10T06:22:50.043 INFO:tasks.workunit.client.0.vm04.stdout:7/232: stat d4/df/d12/d13/f27 0 2026-03-10T06:22:50.044 INFO:tasks.workunit.client.0.vm04.stdout:0/242: write d0/d5/f41 [2225641,25633] 0 2026-03-10T06:22:50.045 INFO:tasks.workunit.client.0.vm04.stdout:5/200: dwrite d4/d11/f3f [0,4194304] 0 2026-03-10T06:22:50.046 
INFO:tasks.workunit.client.0.vm04.stdout:0/243: write d0/f4 [4728190,24557] 0 2026-03-10T06:22:50.047 INFO:tasks.workunit.client.0.vm04.stdout:9/260: sync 2026-03-10T06:22:50.063 INFO:tasks.workunit.client.0.vm04.stdout:7/233: dread d4/df/d12/d13/d25/d28/f39 [0,4194304] 0 2026-03-10T06:22:50.069 INFO:tasks.workunit.client.0.vm04.stdout:8/272: mkdir df/d20/d25/d30/d55 0 2026-03-10T06:22:50.078 INFO:tasks.workunit.client.0.vm04.stdout:3/300: unlink d4/l36 0 2026-03-10T06:22:50.078 INFO:tasks.workunit.client.0.vm04.stdout:8/273: write df/d15/f43 [910797,113780] 0 2026-03-10T06:22:50.078 INFO:tasks.workunit.client.0.vm04.stdout:8/274: dwrite df/f46 [0,4194304] 0 2026-03-10T06:22:50.081 INFO:tasks.workunit.client.0.vm04.stdout:5/201: dread d4/d6/f7 [0,4194304] 0 2026-03-10T06:22:50.085 INFO:tasks.workunit.client.0.vm04.stdout:9/261: creat d2/d3/d18/d39/d11/d42/f5e x:0 0 0 2026-03-10T06:22:50.086 INFO:tasks.workunit.client.0.vm04.stdout:9/262: chown d2/d3/f57 35056165 1 2026-03-10T06:22:50.088 INFO:tasks.workunit.client.0.vm04.stdout:5/202: sync 2026-03-10T06:22:50.089 INFO:tasks.workunit.client.0.vm04.stdout:5/203: chown d4/d11/f34 1363 1 2026-03-10T06:22:50.092 INFO:tasks.workunit.client.0.vm04.stdout:7/234: symlink d4/df/d12/d21/l54 0 2026-03-10T06:22:50.095 INFO:tasks.workunit.client.0.vm04.stdout:4/212: creat d2/d8/d40/f41 x:0 0 0 2026-03-10T06:22:50.097 INFO:tasks.workunit.client.0.vm04.stdout:5/204: dwrite d4/d11/d2a/f36 [0,4194304] 0 2026-03-10T06:22:50.099 INFO:tasks.workunit.client.0.vm04.stdout:5/205: read - d4/d11/d2a/d38/f3e zero size 2026-03-10T06:22:50.104 INFO:tasks.workunit.client.0.vm04.stdout:5/206: fdatasync d4/d11/f3f 0 2026-03-10T06:22:50.110 INFO:tasks.workunit.client.0.vm04.stdout:6/250: truncate d2/d43/d2d/d30/d1f/d3c/f27 616270 0 2026-03-10T06:22:50.121 INFO:tasks.workunit.client.0.vm04.stdout:6/251: truncate d2/d43/d2d/d30/d34/f4d 1294617 0 2026-03-10T06:22:50.136 INFO:tasks.workunit.client.0.vm04.stdout:7/235: mknod d4/df/d12/d13/d25/c55 0 
2026-03-10T06:22:50.288 INFO:tasks.workunit.client.0.vm04.stdout:3/301: symlink d4/da/df/d13/l60 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:4/213: write d2/f30 [952188,42707] 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:5/207: rmdir d4/d3b 39 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:4/214: write d2/d16/d2b/f26 [3613869,13324] 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:4/215: write d2/f12 [352507,67314] 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:7/236: dread d4/df/d12/d13/d25/d28/f39 [4194304,4194304] 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:6/252: rmdir d2/d3a 39 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:1/264: rename d0/d8/f14 to d0/d3/f61 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:7/237: creat d4/df/f56 x:0 0 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:1/265: link d0/f29 d0/d3/f62 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:7/238: creat d4/df/d12/d34/f57 x:0 0 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:7/239: mkdir d4/df/d12/d13/d25/d28/d3a/d58 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:1/266: stat d0/d3/c17 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:1/267: write d0/d3/f61 [428294,118942] 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:1/268: fdatasync d0/d3/f20 0 2026-03-10T06:22:50.289 INFO:tasks.workunit.client.0.vm04.stdout:7/240: getdents d4/df/d12/d13/d25/d28 0 2026-03-10T06:22:50.290 INFO:tasks.workunit.client.0.vm04.stdout:3/302: dread d4/da/df/d13/f2e [4194304,4194304] 0 2026-03-10T06:22:50.290 INFO:tasks.workunit.client.0.vm04.stdout:3/303: stat d4/da/df/d11/d4a 0 2026-03-10T06:22:50.304 INFO:tasks.workunit.client.0.vm04.stdout:3/304: link d4/da/df/d11/c51 d4/da/df/d11/d4a/c61 0 2026-03-10T06:22:50.309 
INFO:tasks.workunit.client.0.vm04.stdout:5/208: dread d4/d11/f43 [0,4194304] 0 2026-03-10T06:22:50.311 INFO:tasks.workunit.client.0.vm04.stdout:3/305: mkdir d4/da/df/d11/d62 0 2026-03-10T06:22:50.326 INFO:tasks.workunit.client.0.vm04.stdout:5/209: rmdir d4/d11/d2a 39 2026-03-10T06:22:50.326 INFO:tasks.workunit.client.0.vm04.stdout:3/306: mknod d4/da/df/d11/d50/c63 0 2026-03-10T06:22:50.327 INFO:tasks.workunit.client.0.vm04.stdout:5/210: chown d4/d11/d2a/f30 121583753 1 2026-03-10T06:22:50.327 INFO:tasks.workunit.client.0.vm04.stdout:5/211: mkdir d4/d6/d48/d4c 0 2026-03-10T06:22:50.327 INFO:tasks.workunit.client.0.vm04.stdout:5/212: creat d4/d3b/f4d x:0 0 0 2026-03-10T06:22:50.327 INFO:tasks.workunit.client.0.vm04.stdout:5/213: write d4/d11/f2f [1483796,128831] 0 2026-03-10T06:22:50.337 INFO:tasks.workunit.client.0.vm04.stdout:5/214: dwrite d4/d11/d2a/f2c [0,4194304] 0 2026-03-10T06:22:50.350 INFO:tasks.workunit.client.0.vm04.stdout:5/215: dwrite d4/d3b/f41 [0,4194304] 0 2026-03-10T06:22:50.492 INFO:tasks.workunit.client.0.vm04.stdout:3/307: sync 2026-03-10T06:22:50.492 INFO:tasks.workunit.client.0.vm04.stdout:7/241: sync 2026-03-10T06:22:50.492 INFO:tasks.workunit.client.0.vm04.stdout:5/216: sync 2026-03-10T06:22:50.493 INFO:tasks.workunit.client.0.vm04.stdout:3/308: fsync d4/da/df/d13/f2b 0 2026-03-10T06:22:50.581 INFO:tasks.workunit.client.0.vm04.stdout:2/261: truncate d1/df/f24 459578 0 2026-03-10T06:22:50.581 INFO:tasks.workunit.client.0.vm04.stdout:0/244: truncate d0/d5/f41 3960196 0 2026-03-10T06:22:50.585 INFO:tasks.workunit.client.0.vm04.stdout:9/263: write d2/d3/f4 [2472010,45999] 0 2026-03-10T06:22:50.585 INFO:tasks.workunit.client.0.vm04.stdout:6/253: write d2/d43/d2d/d30/f2b [320751,63734] 0 2026-03-10T06:22:50.585 INFO:tasks.workunit.client.0.vm04.stdout:6/254: write d2/d8/f11 [1201486,45329] 0 2026-03-10T06:22:50.586 INFO:tasks.workunit.client.0.vm04.stdout:8/275: dwrite f6 [4194304,4194304] 0 2026-03-10T06:22:50.586 
INFO:tasks.workunit.client.0.vm04.stdout:8/276: readlink df/l23 0 2026-03-10T06:22:50.595 INFO:tasks.workunit.client.0.vm04.stdout:0/245: stat d0/d5/c40 0 2026-03-10T06:22:50.604 INFO:tasks.workunit.client.0.vm04.stdout:4/216: dwrite d2/d16/d2b/f3c [0,4194304] 0 2026-03-10T06:22:50.604 INFO:tasks.workunit.client.0.vm04.stdout:6/255: dread d2/d43/d2d/d30/d34/f4d [0,4194304] 0 2026-03-10T06:22:50.605 INFO:tasks.workunit.client.0.vm04.stdout:8/277: creat df/d15/d2b/f56 x:0 0 0 2026-03-10T06:22:50.606 INFO:tasks.workunit.client.0.vm04.stdout:0/246: creat d0/d1a/d3f/f53 x:0 0 0 2026-03-10T06:22:50.608 INFO:tasks.workunit.client.0.vm04.stdout:4/217: stat d2/d16/d2b/c2f 0 2026-03-10T06:22:50.610 INFO:tasks.workunit.client.0.vm04.stdout:5/217: sync 2026-03-10T06:22:50.610 INFO:tasks.workunit.client.0.vm04.stdout:4/218: chown d2/l3b 338565829 1 2026-03-10T06:22:50.610 INFO:tasks.workunit.client.0.vm04.stdout:5/218: fsync d4/d11/f2f 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:9/264: fsync d2/d23/f31 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:2/262: truncate d1/db/f1e 6649832 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:4/219: mkdir d2/d16/d31/d42 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:4/220: fdatasync d2/d16/d2c/f2e 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:6/256: creat d2/d3a/f56 x:0 0 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:8/278: creat df/d53/f57 x:0 0 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:0/247: creat d0/d1a/d20/d38/d31/d47/f54 x:0 0 0 2026-03-10T06:22:50.630 INFO:tasks.workunit.client.0.vm04.stdout:0/248: write d0/d5/d25/dd/f13 [6546486,110842] 0 2026-03-10T06:22:50.639 INFO:tasks.workunit.client.0.vm04.stdout:6/257: creat d2/d3a/f57 x:0 0 0 2026-03-10T06:22:50.645 INFO:tasks.workunit.client.0.vm04.stdout:4/221: getdents d2/d16/d2b 0 2026-03-10T06:22:50.671 
INFO:tasks.workunit.client.0.vm04.stdout:8/279: getdents df/d15 0 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:8/280: chown lb 14 1 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:4/222: creat d2/d16/d31/d3f/f43 x:0 0 0 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:4/223: chown d2/d8/f35 131740445 1 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:4/224: write d2/d16/d2b/f28 [1194036,96875] 0 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:4/225: symlink d2/d16/d31/d42/l44 0 2026-03-10T06:22:50.671 INFO:tasks.workunit.client.0.vm04.stdout:4/226: chown d2/f4 31647068 1 2026-03-10T06:22:50.687 INFO:tasks.workunit.client.0.vm04.stdout:9/265: read d2/f48 [442443,9434] 0 2026-03-10T06:22:50.687 INFO:tasks.workunit.client.0.vm04.stdout:2/263: sync 2026-03-10T06:22:50.695 INFO:tasks.workunit.client.0.vm04.stdout:9/266: link d2/d3/f4 d2/d3/d18/d34/f5f 0 2026-03-10T06:22:50.703 INFO:tasks.workunit.client.0.vm04.stdout:9/267: dread d2/d23/d24/f29 [0,4194304] 0 2026-03-10T06:22:50.717 INFO:tasks.workunit.client.0.vm04.stdout:9/268: getdents d2/d8/d53 0 2026-03-10T06:22:50.720 INFO:tasks.workunit.client.0.vm04.stdout:9/269: mkdir d2/d8/d3a/d60 0 2026-03-10T06:22:50.725 INFO:tasks.workunit.client.0.vm04.stdout:9/270: dwrite f0 [0,4194304] 0 2026-03-10T06:22:50.727 INFO:tasks.workunit.client.0.vm04.stdout:3/309: dwrite d4/f29 [0,4194304] 0 2026-03-10T06:22:50.733 INFO:tasks.workunit.client.0.vm04.stdout:3/310: fsync d4/da/df/d13/f23 0 2026-03-10T06:22:50.738 INFO:tasks.workunit.client.0.vm04.stdout:9/271: fdatasync d2/f48 0 2026-03-10T06:22:50.741 INFO:tasks.workunit.client.0.vm04.stdout:3/311: mkdir d4/da/df/d13/d21/d32/d39/d64 0 2026-03-10T06:22:50.743 INFO:tasks.workunit.client.0.vm04.stdout:3/312: unlink d4/d6/d54/c5f 0 2026-03-10T06:22:50.842 INFO:tasks.workunit.client.0.vm04.stdout:0/249: dwrite d0/d1a/f3b [0,4194304] 0 2026-03-10T06:22:50.874 
INFO:tasks.workunit.client.0.vm04.stdout:6/258: symlink d2/l58 0 2026-03-10T06:22:50.881 INFO:tasks.workunit.client.0.vm04.stdout:3/313: rmdir d4/da/df/d13/d21/d32 39 2026-03-10T06:22:50.881 INFO:tasks.workunit.client.0.vm04.stdout:3/314: write d4/f29 [1994652,54842] 0 2026-03-10T06:22:50.885 INFO:tasks.workunit.client.0.vm04.stdout:3/315: symlink d4/da/df/d11/d50/l65 0 2026-03-10T06:22:50.887 INFO:tasks.workunit.client.0.vm04.stdout:8/281: unlink df/f1f 0 2026-03-10T06:22:50.888 INFO:tasks.workunit.client.0.vm04.stdout:8/282: chown df/d15/d29 70876328 1 2026-03-10T06:22:50.888 INFO:tasks.workunit.client.0.vm04.stdout:8/283: chown df/d20/d25/d30 14 1 2026-03-10T06:22:50.888 INFO:tasks.workunit.client.0.vm04.stdout:8/284: chown df/d15/f43 24185803 1 2026-03-10T06:22:50.890 INFO:tasks.workunit.client.0.vm04.stdout:3/316: fsync d4/da/df/d13/f16 0 2026-03-10T06:22:50.894 INFO:tasks.workunit.client.0.vm04.stdout:3/317: symlink d4/d6/dc/l66 0 2026-03-10T06:22:50.903 INFO:tasks.workunit.client.0.vm04.stdout:5/219: unlink f0 0 2026-03-10T06:22:50.911 INFO:tasks.workunit.client.0.vm04.stdout:5/220: stat d4/d11/d2a/d38/l40 0 2026-03-10T06:22:50.911 INFO:tasks.workunit.client.0.vm04.stdout:5/221: dread - d4/d6/f47 zero size 2026-03-10T06:22:50.911 INFO:tasks.workunit.client.0.vm04.stdout:1/269: rename d0/c4a to d0/d3/d41/d4b/d5b/c63 0 2026-03-10T06:22:50.911 INFO:tasks.workunit.client.0.vm04.stdout:7/242: rename d4/df/d12/d21/f26 to d4/df/d12/d13/f59 0 2026-03-10T06:22:50.911 INFO:tasks.workunit.client.0.vm04.stdout:1/270: creat d0/f64 x:0 0 0 2026-03-10T06:22:50.912 INFO:tasks.workunit.client.0.vm04.stdout:4/227: rename d2/c10 to d2/d16/d31/c45 0 2026-03-10T06:22:50.914 INFO:tasks.workunit.client.0.vm04.stdout:1/271: dwrite d0/d3/f20 [0,4194304] 0 2026-03-10T06:22:50.915 INFO:tasks.workunit.client.0.vm04.stdout:7/243: creat d4/df/d12/d13/d25/d28/d3a/d58/f5a x:0 0 0 2026-03-10T06:22:50.925 INFO:tasks.workunit.client.0.vm04.stdout:2/264: rename d1/df/d11/c4d to d1/df/d2c/c51 0 
2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:9/272: rename d2/d3/d18/d39 to d2/d3/d18/d39/d61 22 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/244: unlink d4/df/l19 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:1/272: symlink d0/d8/l65 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:1/273: fdatasync d0/f23 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:2/265: creat d1/df/d2c/d37/f52 x:0 0 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:9/273: symlink d2/d3/d18/d39/d11/l62 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/245: write d4/df/d12/d13/d25/f4b [306925,75460] 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:4/228: rename d2/d16/d2b to d2/d46 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:9/274: rename d2/d3 to d2/d3/d18/d39/d11/d63 22 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:9/275: dread - d2/d3/d18/d39/d11/f56 zero size 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:9/276: write d2/d3/f12 [3199147,118569] 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/246: rename d4/df/d12/d13/d25/d28/d3a/f4e to d4/df/d12/d13/d25/d30/d40/d50/f5b 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/247: symlink d4/df/d12/d13/l5c 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:4/229: rename d2/d8/fa to d2/f47 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/248: rmdir d4/df/d12/d13/d25/d28/d3a 39 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:4/230: symlink d2/d16/d31/d3f/l48 0 2026-03-10T06:22:50.964 INFO:tasks.workunit.client.0.vm04.stdout:7/249: mknod d4/df/d12/d34/c5d 0 2026-03-10T06:22:50.970 INFO:tasks.workunit.client.0.vm04.stdout:1/274: dread d0/f5 [0,4194304] 0 2026-03-10T06:22:50.972 INFO:tasks.workunit.client.0.vm04.stdout:1/275: creat 
d0/d3/d41/d4b/d5b/f66 x:0 0 0 2026-03-10T06:22:50.972 INFO:tasks.workunit.client.0.vm04.stdout:1/276: write d0/d3/f19 [718669,105846] 0 2026-03-10T06:22:50.972 INFO:tasks.workunit.client.0.vm04.stdout:1/277: chown d0 2607 1 2026-03-10T06:22:50.977 INFO:tasks.workunit.client.0.vm04.stdout:1/278: creat d0/d8/f67 x:0 0 0 2026-03-10T06:22:50.978 INFO:tasks.workunit.client.0.vm04.stdout:1/279: dread d0/d8/f43 [0,4194304] 0 2026-03-10T06:22:50.981 INFO:tasks.workunit.client.0.vm04.stdout:1/280: dwrite d0/d8/f38 [0,4194304] 0 2026-03-10T06:22:50.984 INFO:tasks.workunit.client.0.vm04.stdout:1/281: mknod d0/d3/d41/c68 0 2026-03-10T06:22:50.993 INFO:tasks.workunit.client.0.vm04.stdout:1/282: creat d0/d8/f69 x:0 0 0 2026-03-10T06:22:50.993 INFO:tasks.workunit.client.0.vm04.stdout:1/283: chown d0/c18 1794043 1 2026-03-10T06:22:50.993 INFO:tasks.workunit.client.0.vm04.stdout:1/284: creat d0/f6a x:0 0 0 2026-03-10T06:22:51.081 INFO:tasks.workunit.client.0.vm04.stdout:0/250: truncate d0/d5/d25/dd/f43 3123334 0 2026-03-10T06:22:51.082 INFO:tasks.workunit.client.0.vm04.stdout:8/285: truncate df/d20/d25/d30/f4e 2268416 0 2026-03-10T06:22:51.083 INFO:tasks.workunit.client.0.vm04.stdout:6/259: dwrite d2/d43/d2d/d30/d34/f4d [0,4194304] 0 2026-03-10T06:22:51.084 INFO:tasks.workunit.client.0.vm04.stdout:0/251: read d0/f17 [3293847,10963] 0 2026-03-10T06:22:51.086 INFO:tasks.workunit.client.0.vm04.stdout:0/252: write d0/d5/d25/dd/d3a/f50 [352945,19366] 0 2026-03-10T06:22:51.088 INFO:tasks.workunit.client.0.vm04.stdout:6/260: creat d2/d43/d2d/d51/f59 x:0 0 0 2026-03-10T06:22:51.089 INFO:tasks.workunit.client.0.vm04.stdout:8/286: mknod df/c58 0 2026-03-10T06:22:51.089 INFO:tasks.workunit.client.0.vm04.stdout:6/261: fdatasync d2/d3a/f50 0 2026-03-10T06:22:51.089 INFO:tasks.workunit.client.0.vm04.stdout:0/253: mknod d0/d1a/c55 0 2026-03-10T06:22:51.093 INFO:tasks.workunit.client.0.vm04.stdout:6/262: dwrite d2/d43/f2c [0,4194304] 0 2026-03-10T06:22:51.094 
INFO:tasks.workunit.client.0.vm04.stdout:0/254: mkdir d0/d5/d25/dd/d3a/d56 0 2026-03-10T06:22:51.095 INFO:tasks.workunit.client.0.vm04.stdout:8/287: rename df/d20/c2d to df/d15/d2b/c59 0 2026-03-10T06:22:51.097 INFO:tasks.workunit.client.0.vm04.stdout:0/255: write d0/d5/d25/f23 [111161,54580] 0 2026-03-10T06:22:51.098 INFO:tasks.workunit.client.0.vm04.stdout:0/256: write d0/d1a/d20/f4b [731950,75551] 0 2026-03-10T06:22:51.105 INFO:tasks.workunit.client.0.vm04.stdout:8/288: symlink df/d20/d25/d30/d55/l5a 0 2026-03-10T06:22:51.107 INFO:tasks.workunit.client.0.vm04.stdout:6/263: rename d2/d43/f2c to d2/d43/d2d/d30/f5a 0 2026-03-10T06:22:51.111 INFO:tasks.workunit.client.0.vm04.stdout:6/264: dwrite d2/d43/d2d/f42 [0,4194304] 0 2026-03-10T06:22:51.145 INFO:tasks.workunit.client.0.vm04.stdout:6/265: read - d2/d43/d2d/d30/d34/f52 zero size 2026-03-10T06:22:51.145 INFO:tasks.workunit.client.0.vm04.stdout:8/289: link df/d15/f45 df/d20/d25/d30/d55/f5b 0 2026-03-10T06:22:51.145 INFO:tasks.workunit.client.0.vm04.stdout:8/290: fsync df/d15/f1b 0 2026-03-10T06:22:51.145 INFO:tasks.workunit.client.0.vm04.stdout:8/291: dread - df/d53/f57 zero size 2026-03-10T06:22:51.145 INFO:tasks.workunit.client.0.vm04.stdout:8/292: fdatasync df/d15/d2b/f4d 0 2026-03-10T06:22:51.146 INFO:tasks.workunit.client.0.vm04.stdout:6/266: rename d2/d43/d2d/d30/d1f/d3c/l4c to d2/d8/l5b 0 2026-03-10T06:22:51.146 INFO:tasks.workunit.client.0.vm04.stdout:8/293: creat df/d20/f5c x:0 0 0 2026-03-10T06:22:51.146 INFO:tasks.workunit.client.0.vm04.stdout:6/267: dwrite d2/d8/f11 [0,4194304] 0 2026-03-10T06:22:51.298 INFO:tasks.workunit.client.0.vm04.stdout:2/266: sync 2026-03-10T06:22:51.298 INFO:tasks.workunit.client.0.vm04.stdout:5/222: sync 2026-03-10T06:22:51.298 INFO:tasks.workunit.client.0.vm04.stdout:9/277: sync 2026-03-10T06:22:51.298 INFO:tasks.workunit.client.0.vm04.stdout:4/231: sync 2026-03-10T06:22:51.298 INFO:tasks.workunit.client.0.vm04.stdout:3/318: sync 2026-03-10T06:22:51.298 
INFO:tasks.workunit.client.0.vm04.stdout:6/268: sync 2026-03-10T06:22:51.299 INFO:tasks.workunit.client.0.vm04.stdout:4/232: stat d2/d16/d31/d42/l44 0 2026-03-10T06:22:51.302 INFO:tasks.workunit.client.0.vm04.stdout:5/223: dread d4/d3b/f41 [0,4194304] 0 2026-03-10T06:22:51.307 INFO:tasks.workunit.client.0.vm04.stdout:2/267: creat d1/df/d11/d18/f53 x:0 0 0 2026-03-10T06:22:51.307 INFO:tasks.workunit.client.0.vm04.stdout:2/268: chown d1/f10 6296 1 2026-03-10T06:22:51.307 INFO:tasks.workunit.client.0.vm04.stdout:4/233: mknod d2/d32/c49 0 2026-03-10T06:22:51.310 INFO:tasks.workunit.client.0.vm04.stdout:6/269: rmdir d2/d43/d2d/d30/d1f 39 2026-03-10T06:22:51.310 INFO:tasks.workunit.client.0.vm04.stdout:5/224: fsync d4/d6/f7 0 2026-03-10T06:22:51.311 INFO:tasks.workunit.client.0.vm04.stdout:9/278: mkdir d2/d8/d14/d1d/d64 0 2026-03-10T06:22:51.317 INFO:tasks.workunit.client.0.vm04.stdout:9/279: dwrite d2/d8/f4a [0,4194304] 0 2026-03-10T06:22:51.322 INFO:tasks.workunit.client.0.vm04.stdout:1/285: write d0/d3/f62 [2523150,99788] 0 2026-03-10T06:22:51.325 INFO:tasks.workunit.client.0.vm04.stdout:1/286: stat d0 0 2026-03-10T06:22:51.328 INFO:tasks.workunit.client.0.vm04.stdout:8/294: dwrite df/d20/d25/d30/f4e [0,4194304] 0 2026-03-10T06:22:51.330 INFO:tasks.workunit.client.0.vm04.stdout:8/295: chown df/c26 2 1 2026-03-10T06:22:51.330 INFO:tasks.workunit.client.0.vm04.stdout:8/296: readlink la 0 2026-03-10T06:22:51.337 INFO:tasks.workunit.client.0.vm04.stdout:0/257: write d0/d5/d25/dd/d1d/f26 [1399458,71623] 0 2026-03-10T06:22:51.342 INFO:tasks.workunit.client.0.vm04.stdout:3/319: creat d4/da/df/d13/d21/d32/d39/d64/f67 x:0 0 0 2026-03-10T06:22:51.348 INFO:tasks.workunit.client.0.vm04.stdout:2/269: mkdir d1/df/d11/d18/d35/d54 0 2026-03-10T06:22:51.349 INFO:tasks.workunit.client.0.vm04.stdout:5/225: symlink d4/l4e 0 2026-03-10T06:22:51.349 INFO:tasks.workunit.client.0.vm04.stdout:5/226: readlink d4/d11/d2a/l49 0 2026-03-10T06:22:51.349 
INFO:tasks.workunit.client.0.vm04.stdout:0/258: unlink d0/d1a/d20/f4b 0 2026-03-10T06:22:51.349 INFO:tasks.workunit.client.0.vm04.stdout:9/280: mknod d2/d23/d24/d5a/c65 0 2026-03-10T06:22:51.358 INFO:tasks.workunit.client.0.vm04.stdout:6/270: dread d2/f14 [4194304,4194304] 0 2026-03-10T06:22:51.359 INFO:tasks.workunit.client.0.vm04.stdout:3/320: creat d4/da/df/d11/d50/f68 x:0 0 0 2026-03-10T06:22:51.359 INFO:tasks.workunit.client.0.vm04.stdout:3/321: chown d4/da/df/d13/d21/d2c/c4f 75 1 2026-03-10T06:22:51.361 INFO:tasks.workunit.client.0.vm04.stdout:2/270: dread d1/df/f24 [0,4194304] 0 2026-03-10T06:22:51.362 INFO:tasks.workunit.client.0.vm04.stdout:1/287: sync 2026-03-10T06:22:51.365 INFO:tasks.workunit.client.0.vm04.stdout:2/271: write d1/df/d2c/f3d [1340186,75233] 0 2026-03-10T06:22:51.370 INFO:tasks.workunit.client.0.vm04.stdout:0/259: creat d0/d5/d25/dd/d3a/f57 x:0 0 0 2026-03-10T06:22:51.375 INFO:tasks.workunit.client.0.vm04.stdout:7/250: dwrite d4/df/d12/d13/f59 [0,4194304] 0 2026-03-10T06:22:51.378 INFO:tasks.workunit.client.0.vm04.stdout:7/251: truncate d4/df/d12/d13/f4a 456244 0 2026-03-10T06:22:51.389 INFO:tasks.workunit.client.0.vm04.stdout:5/227: mknod d4/d11/c4f 0 2026-03-10T06:22:51.389 INFO:tasks.workunit.client.0.vm04.stdout:0/260: chown d0/d1a/l2a 7373384 1 2026-03-10T06:22:51.390 INFO:tasks.workunit.client.0.vm04.stdout:5/228: read - d4/d11/d2a/d38/f3e zero size 2026-03-10T06:22:51.418 INFO:tasks.workunit.client.0.vm04.stdout:4/234: link d2/l2d d2/l4a 0 2026-03-10T06:22:51.419 INFO:tasks.workunit.client.0.vm04.stdout:8/297: getdents df/d20 0 2026-03-10T06:22:51.419 INFO:tasks.workunit.client.0.vm04.stdout:8/298: dread - df/d15/d29/f3c zero size 2026-03-10T06:22:51.425 INFO:tasks.workunit.client.0.vm04.stdout:6/271: mknod d2/d43/d2d/d30/d1f/d3c/c5c 0 2026-03-10T06:22:51.427 INFO:tasks.workunit.client.0.vm04.stdout:4/235: dwrite d2/f30 [0,4194304] 0 2026-03-10T06:22:51.429 INFO:tasks.workunit.client.0.vm04.stdout:4/236: fsync d2/d46/f3d 0 
2026-03-10T06:22:51.441 INFO:tasks.workunit.client.0.vm04.stdout:3/322: creat d4/da/df/d11/d62/f69 x:0 0 0 2026-03-10T06:22:51.443 INFO:tasks.workunit.client.0.vm04.stdout:3/323: dwrite d4/da/df/f5e [0,4194304] 0 2026-03-10T06:22:51.446 INFO:tasks.workunit.client.0.vm04.stdout:3/324: chown d4/da/df/d13/d21/c3c 49256 1 2026-03-10T06:22:51.450 INFO:tasks.workunit.client.0.vm04.stdout:1/288: link d0/f59 d0/d3/d41/d4b/f6b 0 2026-03-10T06:22:51.453 INFO:tasks.workunit.client.0.vm04.stdout:5/229: stat d4/c29 0 2026-03-10T06:22:51.459 INFO:tasks.workunit.client.0.vm04.stdout:8/299: rename df/d20/f21 to df/d15/f5d 0 2026-03-10T06:22:51.461 INFO:tasks.workunit.client.0.vm04.stdout:7/252: dread d4/df/d12/f1c [0,4194304] 0 2026-03-10T06:22:51.461 INFO:tasks.workunit.client.0.vm04.stdout:7/253: chown d4/df/d12/d13/f1e 1 1 2026-03-10T06:22:51.464 INFO:tasks.workunit.client.0.vm04.stdout:4/237: creat d2/d8/d40/f4b x:0 0 0 2026-03-10T06:22:51.466 INFO:tasks.workunit.client.0.vm04.stdout:4/238: truncate d2/d46/f3c 4527601 0 2026-03-10T06:22:51.468 INFO:tasks.workunit.client.0.vm04.stdout:9/281: creat d2/d8/f66 x:0 0 0 2026-03-10T06:22:51.471 INFO:tasks.workunit.client.0.vm04.stdout:7/254: dwrite d4/f51 [0,4194304] 0 2026-03-10T06:22:51.473 INFO:tasks.workunit.client.0.vm04.stdout:8/300: dread df/f11 [0,4194304] 0 2026-03-10T06:22:51.475 INFO:tasks.workunit.client.0.vm04.stdout:7/255: dwrite d4/df/d12/d13/f27 [0,4194304] 0 2026-03-10T06:22:51.491 INFO:tasks.workunit.client.0.vm04.stdout:3/325: mknod d4/d6/c6a 0 2026-03-10T06:22:51.493 INFO:tasks.workunit.client.0.vm04.stdout:2/272: link d1/df/l2d d1/df/d2c/d37/l55 0 2026-03-10T06:22:51.497 INFO:tasks.workunit.client.0.vm04.stdout:2/273: dwrite d1/df/d2c/f4c [0,4194304] 0 2026-03-10T06:22:51.498 INFO:tasks.workunit.client.0.vm04.stdout:2/274: chown d1/db/d20/f49 1337101 1 2026-03-10T06:22:51.503 INFO:tasks.workunit.client.0.vm04.stdout:5/230: mkdir d4/d6/d50 0 2026-03-10T06:22:51.506 INFO:tasks.workunit.client.0.vm04.stdout:3/326: 
read d4/d6/dc/f22 [896397,89083] 0 2026-03-10T06:22:51.506 INFO:tasks.workunit.client.0.vm04.stdout:3/327: chown d4/da/df/d11/d5a 215650659 1 2026-03-10T06:22:51.522 INFO:tasks.workunit.client.0.vm04.stdout:9/282: truncate d2/f49 183134 0 2026-03-10T06:22:51.527 INFO:tasks.workunit.client.0.vm04.stdout:7/256: symlink d4/df/d12/d13/l5e 0 2026-03-10T06:22:51.549 INFO:tasks.workunit.client.0.vm04.stdout:9/283: unlink d2/d3/d18/d39/c19 0 2026-03-10T06:22:51.553 INFO:tasks.workunit.client.0.vm04.stdout:2/275: mknod d1/df/d11/c56 0 2026-03-10T06:22:51.556 INFO:tasks.workunit.client.0.vm04.stdout:2/276: dwrite d1/db/f12 [4194304,4194304] 0 2026-03-10T06:22:51.563 INFO:tasks.workunit.client.0.vm04.stdout:0/261: getdents d0/d1a 0 2026-03-10T06:22:51.567 INFO:tasks.workunit.client.0.vm04.stdout:5/231: unlink d4/c27 0 2026-03-10T06:22:51.567 INFO:tasks.workunit.client.0.vm04.stdout:5/232: write d4/d11/d2a/f44 [261920,77164] 0 2026-03-10T06:22:51.568 INFO:tasks.workunit.client.0.vm04.stdout:5/233: stat d4/f35 0 2026-03-10T06:22:51.570 INFO:tasks.workunit.client.0.vm04.stdout:6/272: rename d2/c29 to d2/d43/d2d/d30/c5d 0 2026-03-10T06:22:51.574 INFO:tasks.workunit.client.0.vm04.stdout:3/328: link d4/da/df/d13/d21/d32/f58 d4/da/df/d13/d21/d2c/f6b 0 2026-03-10T06:22:51.579 INFO:tasks.workunit.client.0.vm04.stdout:9/284: symlink d2/d8/d14/l67 0 2026-03-10T06:22:51.579 INFO:tasks.workunit.client.0.vm04.stdout:8/301: creat df/d20/f5e x:0 0 0 2026-03-10T06:22:51.580 INFO:tasks.workunit.client.0.vm04.stdout:3/329: read d4/d6/d38/f53 [4656980,70972] 0 2026-03-10T06:22:51.581 INFO:tasks.workunit.client.0.vm04.stdout:3/330: read d4/da/df/d11/f57 [45432,109670] 0 2026-03-10T06:22:51.581 INFO:tasks.workunit.client.0.vm04.stdout:3/331: dread - d4/da/df/d11/d62/f69 zero size 2026-03-10T06:22:51.585 INFO:tasks.workunit.client.0.vm04.stdout:1/289: getdents d0/d3 0 2026-03-10T06:22:51.591 INFO:tasks.workunit.client.0.vm04.stdout:0/262: truncate d0/f14 3268016 0 2026-03-10T06:22:51.593 
INFO:tasks.workunit.client.0.vm04.stdout:5/234: mkdir d4/d11/d2a/d38/d51 0 2026-03-10T06:22:51.596 INFO:tasks.workunit.client.0.vm04.stdout:6/273: mkdir d2/d3a/d5e 0 2026-03-10T06:22:51.597 INFO:tasks.workunit.client.0.vm04.stdout:6/274: write d2/d3a/f50 [760836,84116] 0 2026-03-10T06:22:51.598 INFO:tasks.workunit.client.0.vm04.stdout:4/239: getdents d2/d32 0 2026-03-10T06:22:51.599 INFO:tasks.workunit.client.0.vm04.stdout:4/240: write d2/d16/d2c/f2e [149902,100650] 0 2026-03-10T06:22:51.599 INFO:tasks.workunit.client.0.vm04.stdout:9/285: sync 2026-03-10T06:22:51.610 INFO:tasks.workunit.client.0.vm04.stdout:3/332: dwrite d4/da/df/d13/f2e [4194304,4194304] 0 2026-03-10T06:22:51.616 INFO:tasks.workunit.client.0.vm04.stdout:3/333: dwrite d4/da/df/d13/d21/d32/d39/f43 [0,4194304] 0 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: pgmap v24: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 27 MiB/s rd, 113 MiB/s wr, 308 op/s 2026-03-10T06:22:51.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:51 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T06:22:51.619 INFO:tasks.workunit.client.0.vm04.stdout:3/334: truncate d4/d6/dc/f37 389660 0 2026-03-10T06:22:51.629 INFO:tasks.workunit.client.0.vm04.stdout:7/257: creat d4/df/d12/f5f x:0 0 0 2026-03-10T06:22:51.632 INFO:tasks.workunit.client.0.vm04.stdout:1/290: creat d0/d8/f6c x:0 0 0 2026-03-10T06:22:51.634 INFO:tasks.workunit.client.0.vm04.stdout:2/277: fdatasync d1/db/f1e 0 2026-03-10T06:22:51.644 INFO:tasks.workunit.client.0.vm04.stdout:9/286: symlink d2/d8/d14/l68 0 2026-03-10T06:22:51.657 INFO:tasks.workunit.client.0.vm04.stdout:8/302: rename la to df/d20/d25/d30/l5f 0 2026-03-10T06:22:51.658 INFO:tasks.workunit.client.0.vm04.stdout:8/303: write df/d15/d2b/f33 [45898,100690] 0 2026-03-10T06:22:51.658 INFO:tasks.workunit.client.0.vm04.stdout:2/278: creat d1/f57 x:0 0 0 2026-03-10T06:22:51.658 INFO:tasks.workunit.client.0.vm04.stdout:2/279: readlink d1/l46 0 2026-03-10T06:22:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:22:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:22:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: pgmap v24: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 
GiB used, 112 GiB / 120 GiB avail; 27 MiB/s rd, 113 MiB/s wr, 308 op/s 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:51 vm04.local ceph-mon[51058]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:22:51.679 INFO:tasks.workunit.client.0.vm04.stdout:5/235: write d4/d6/f20 [262565,36659] 0 2026-03-10T06:22:51.684 INFO:tasks.workunit.client.0.vm04.stdout:3/335: rename d4/da/df/d13 to d4/da/df/d11/d50/d6c 0 2026-03-10T06:22:51.692 INFO:tasks.workunit.client.0.vm04.stdout:2/280: write d1/db/f1e [3553012,77455] 0 2026-03-10T06:22:51.694 INFO:tasks.workunit.client.0.vm04.stdout:2/281: dread d1/df/d2c/f4c [0,4194304] 0 2026-03-10T06:22:51.695 INFO:tasks.workunit.client.0.vm04.stdout:3/336: sync 2026-03-10T06:22:51.697 INFO:tasks.workunit.client.0.vm04.stdout:2/282: dwrite d1/df/d11/d14/f1d [0,4194304] 0 2026-03-10T06:22:51.702 INFO:tasks.workunit.client.0.vm04.stdout:9/287: creat d2/d8/d5d/f69 x:0 0 0 2026-03-10T06:22:51.705 INFO:tasks.workunit.client.0.vm04.stdout:5/236: mkdir d4/d11/d2a/d52 0 2026-03-10T06:22:51.707 INFO:tasks.workunit.client.0.vm04.stdout:0/263: truncate d0/d5/f1f 716989 0 2026-03-10T06:22:51.707 INFO:tasks.workunit.client.0.vm04.stdout:0/264: stat d0/l1 0 2026-03-10T06:22:51.709 INFO:tasks.workunit.client.0.vm04.stdout:5/237: dwrite d4/d11/d2a/f36 [0,4194304] 0 2026-03-10T06:22:51.711 INFO:tasks.workunit.client.0.vm04.stdout:5/238: fsync d4/d11/d2a/f30 0 2026-03-10T06:22:51.713 INFO:tasks.workunit.client.0.vm04.stdout:0/265: dwrite d0/d5/d25/dd/d3a/f50 [0,4194304] 0 2026-03-10T06:22:51.723 INFO:tasks.workunit.client.0.vm04.stdout:1/291: rename d0/d3/d41/c5f to d0/d8/c6d 0 2026-03-10T06:22:51.727 INFO:tasks.workunit.client.0.vm04.stdout:5/239: dread d4/d11/f18 [0,4194304] 0 2026-03-10T06:22:51.729 INFO:tasks.workunit.client.0.vm04.stdout:5/240: dwrite d4/d11/d2a/f36 [0,4194304] 0 2026-03-10T06:22:51.730 INFO:tasks.workunit.client.0.vm04.stdout:5/241: read d4/d11/d2a/f2c [1737662,51536] 0 2026-03-10T06:22:51.741 INFO:tasks.workunit.client.0.vm04.stdout:5/242: write d4/d11/d2a/f36 [1442924,59719] 0 2026-03-10T06:22:51.741 
INFO:tasks.workunit.client.0.vm04.stdout:5/243: chown d4/d11/c4f 8 1 2026-03-10T06:22:51.741 INFO:tasks.workunit.client.0.vm04.stdout:8/304: link df/f17 df/d15/d2b/f60 0 2026-03-10T06:22:51.741 INFO:tasks.workunit.client.0.vm04.stdout:4/241: truncate d2/d46/f3c 1399080 0 2026-03-10T06:22:51.741 INFO:tasks.workunit.client.0.vm04.stdout:7/258: creat d4/df/f60 x:0 0 0 2026-03-10T06:22:51.741 INFO:tasks.workunit.client.0.vm04.stdout:4/242: write d2/d8/f23 [657341,106970] 0 2026-03-10T06:22:51.742 INFO:tasks.workunit.client.0.vm04.stdout:4/243: dwrite d2/d46/f15 [4194304,4194304] 0 2026-03-10T06:22:51.743 INFO:tasks.workunit.client.0.vm04.stdout:4/244: write d2/d16/d31/d3f/f43 [594138,16013] 0 2026-03-10T06:22:51.753 INFO:tasks.workunit.client.0.vm04.stdout:7/259: dread d4/df/d12/f14 [0,4194304] 0 2026-03-10T06:22:51.758 INFO:tasks.workunit.client.0.vm04.stdout:4/245: sync 2026-03-10T06:22:51.768 INFO:tasks.workunit.client.0.vm04.stdout:2/283: dwrite d1/f5 [0,4194304] 0 2026-03-10T06:22:51.772 INFO:tasks.workunit.client.0.vm04.stdout:2/284: dwrite d1/db/f12 [4194304,4194304] 0 2026-03-10T06:22:51.777 INFO:tasks.workunit.client.0.vm04.stdout:6/275: rename d2/f28 to d2/f5f 0 2026-03-10T06:22:51.779 INFO:tasks.workunit.client.0.vm04.stdout:6/276: chown d2/d3a/f50 583 1 2026-03-10T06:22:51.781 INFO:tasks.workunit.client.0.vm04.stdout:6/277: chown d2/d43/d2d/d30/d34 407 1 2026-03-10T06:22:51.782 INFO:tasks.workunit.client.0.vm04.stdout:6/278: dread - d2/d3a/f56 zero size 2026-03-10T06:22:51.783 INFO:tasks.workunit.client.0.vm04.stdout:6/279: chown d2/d43/d2d/d30/d1f/d3c 58520 1 2026-03-10T06:22:51.783 INFO:tasks.workunit.client.0.vm04.stdout:2/285: dwrite d1/df/d11/f16 [0,4194304] 0 2026-03-10T06:22:51.789 INFO:tasks.workunit.client.0.vm04.stdout:2/286: dwrite d1/db/f1e [0,4194304] 0 2026-03-10T06:22:51.806 INFO:tasks.workunit.client.0.vm04.stdout:9/288: creat d2/d8/d14/d1d/f6a x:0 0 0 2026-03-10T06:22:51.818 INFO:tasks.workunit.client.0.vm04.stdout:5/244: mknod 
d4/d11/d2a/c53 0 2026-03-10T06:22:51.823 INFO:tasks.workunit.client.0.vm04.stdout:9/289: sync 2026-03-10T06:22:51.828 INFO:tasks.workunit.client.0.vm04.stdout:2/287: fdatasync d1/df/d2c/f4c 0 2026-03-10T06:22:51.829 INFO:tasks.workunit.client.0.vm04.stdout:9/290: dwrite d2/d8/d14/d1d/f6a [0,4194304] 0 2026-03-10T06:22:51.853 INFO:tasks.workunit.client.0.vm04.stdout:7/260: mknod d4/df/c61 0 2026-03-10T06:22:51.869 INFO:tasks.workunit.client.0.vm04.stdout:1/292: link d0/d3/c45 d0/d3/d41/d4b/c6e 0 2026-03-10T06:22:51.873 INFO:tasks.workunit.client.0.vm04.stdout:3/337: getdents d4/da/df/d11/d50/d6c/d21/d32/d39/d64 0 2026-03-10T06:22:51.874 INFO:tasks.workunit.client.0.vm04.stdout:3/338: stat d4/da/df/d11/d50/d6c/f1d 0 2026-03-10T06:22:51.890 INFO:tasks.workunit.client.0.vm04.stdout:4/246: rename d2/d8/f23 to d2/f4c 0 2026-03-10T06:22:51.900 INFO:tasks.workunit.client.0.vm04.stdout:3/339: dread f1 [4194304,4194304] 0 2026-03-10T06:22:51.948 INFO:tasks.workunit.client.0.vm04.stdout:2/288: write d1/df/f24 [1063910,31039] 0 2026-03-10T06:22:51.956 INFO:tasks.workunit.client.0.vm04.stdout:6/280: getdents d2/d43/d2d/d30/d1f/d3c 0 2026-03-10T06:22:51.958 INFO:tasks.workunit.client.0.vm04.stdout:8/305: dread df/d15/d2b/f33 [0,4194304] 0 2026-03-10T06:22:51.959 INFO:tasks.workunit.client.0.vm04.stdout:8/306: dread - df/d15/d29/f3c zero size 2026-03-10T06:22:51.964 INFO:tasks.workunit.client.0.vm04.stdout:7/261: link d4/df/d12/f20 d4/df/d12/d13/d25/d30/d40/d50/f62 0 2026-03-10T06:22:51.975 INFO:tasks.workunit.client.0.vm04.stdout:0/266: dwrite d0/f14 [0,4194304] 0 2026-03-10T06:22:51.975 INFO:tasks.workunit.client.0.vm04.stdout:0/267: read d0/d1a/f2f [2187312,123526] 0 2026-03-10T06:22:51.990 INFO:tasks.workunit.client.0.vm04.stdout:2/289: creat d1/df/d2c/f58 x:0 0 0 2026-03-10T06:22:51.993 INFO:tasks.workunit.client.0.vm04.stdout:5/245: rename d4/f3c to d4/d3b/f54 0 2026-03-10T06:22:52.001 INFO:tasks.workunit.client.0.vm04.stdout:5/246: truncate d4/d11/d2a/d38/f3e 907473 0 
2026-03-10T06:22:52.004 INFO:tasks.workunit.client.0.vm04.stdout:5/247: write d4/d11/f2f [677587,24056] 0 2026-03-10T06:22:52.011 INFO:tasks.workunit.client.0.vm04.stdout:5/248: dwrite d4/d6/fa [0,4194304] 0 2026-03-10T06:22:52.011 INFO:tasks.workunit.client.0.vm04.stdout:5/249: chown l1 2560 1 2026-03-10T06:22:52.015 INFO:tasks.workunit.client.0.vm04.stdout:9/291: getdents d2/d23 0 2026-03-10T06:22:52.025 INFO:tasks.workunit.client.0.vm04.stdout:7/262: mkdir d4/df/d12/d34/d63 0 2026-03-10T06:22:52.025 INFO:tasks.workunit.client.0.vm04.stdout:7/263: chown d4 115182676 1 2026-03-10T06:22:52.025 INFO:tasks.workunit.client.0.vm04.stdout:8/307: dwrite df/d15/d2b/f60 [0,4194304] 0 2026-03-10T06:22:52.025 INFO:tasks.workunit.client.0.vm04.stdout:7/264: read d4/df/d12/d13/f4a [29802,33950] 0 2026-03-10T06:22:52.048 INFO:tasks.workunit.client.0.vm04.stdout:2/290: mkdir d1/df/d2c/d37/d59 0 2026-03-10T06:22:52.049 INFO:tasks.workunit.client.0.vm04.stdout:2/291: stat d1/df 0 2026-03-10T06:22:52.052 INFO:tasks.workunit.client.0.vm04.stdout:0/268: dwrite d0/f1b [4194304,4194304] 0 2026-03-10T06:22:52.053 INFO:tasks.workunit.client.0.vm04.stdout:0/269: truncate d0/d5/f3e 4837318 0 2026-03-10T06:22:52.068 INFO:tasks.workunit.client.0.vm04.stdout:3/340: creat d4/da/df/d11/d50/d6c/d21/f6d x:0 0 0 2026-03-10T06:22:52.069 INFO:tasks.workunit.client.0.vm04.stdout:0/270: dwrite d0/d1a/d3f/f4f [0,4194304] 0 2026-03-10T06:22:52.072 INFO:tasks.workunit.client.0.vm04.stdout:9/292: creat d2/d8/d3a/f6b x:0 0 0 2026-03-10T06:22:52.072 INFO:tasks.workunit.client.0.vm04.stdout:4/247: truncate d2/f1d 4295554 0 2026-03-10T06:22:52.073 INFO:tasks.workunit.client.0.vm04.stdout:8/308: mknod df/d20/d25/c61 0 2026-03-10T06:22:52.076 INFO:tasks.workunit.client.0.vm04.stdout:7/265: dwrite d4/df/d12/d13/d25/d30/d40/d50/f5b [0,4194304] 0 2026-03-10T06:22:52.088 INFO:tasks.workunit.client.0.vm04.stdout:1/293: rename d0/d3/f35 to d0/d3/d41/d4b/d5b/f6f 0 2026-03-10T06:22:52.089 
INFO:tasks.workunit.client.0.vm04.stdout:6/281: creat d2/d43/d2d/d30/f60 x:0 0 0 2026-03-10T06:22:52.091 INFO:tasks.workunit.client.0.vm04.stdout:5/250: dread d4/ff [0,4194304] 0 2026-03-10T06:22:52.102 INFO:tasks.workunit.client.0.vm04.stdout:5/251: readlink d4/d3b/l45 0 2026-03-10T06:22:52.102 INFO:tasks.workunit.client.0.vm04.stdout:3/341: chown d4/da/df/d11/d50/d6c/d21/c2a 37 1 2026-03-10T06:22:52.102 INFO:tasks.workunit.client.0.vm04.stdout:0/271: chown d0/d5/d25/dd/f43 11376 1 2026-03-10T06:22:52.109 INFO:tasks.workunit.client.0.vm04.stdout:9/293: dread d2/d8/d14/f28 [0,4194304] 0 2026-03-10T06:22:52.110 INFO:tasks.workunit.client.0.vm04.stdout:1/294: rmdir d0/d3/d41 39 2026-03-10T06:22:52.111 INFO:tasks.workunit.client.0.vm04.stdout:1/295: truncate d0/d3/f34 4449413 0 2026-03-10T06:22:52.116 INFO:tasks.workunit.client.0.vm04.stdout:0/272: dread d0/d5/d25/dd/f43 [0,4194304] 0 2026-03-10T06:22:52.118 INFO:tasks.workunit.client.0.vm04.stdout:5/252: mkdir d4/d6/d48/d55 0 2026-03-10T06:22:52.122 INFO:tasks.workunit.client.0.vm04.stdout:8/309: unlink df/d15/d2b/c59 0 2026-03-10T06:22:52.126 INFO:tasks.workunit.client.0.vm04.stdout:9/294: mkdir d2/d8/d14/d6c 0 2026-03-10T06:22:52.139 INFO:tasks.workunit.client.0.vm04.stdout:0/273: unlink d0/d1a/d20/d38/l2d 0 2026-03-10T06:22:52.140 INFO:tasks.workunit.client.0.vm04.stdout:4/248: dread d2/f1d [0,4194304] 0 2026-03-10T06:22:52.140 INFO:tasks.workunit.client.0.vm04.stdout:0/274: write d0/d5/f4e [291166,20878] 0 2026-03-10T06:22:52.140 INFO:tasks.workunit.client.0.vm04.stdout:0/275: stat d0/d1a/l2a 0 2026-03-10T06:22:52.147 INFO:tasks.workunit.client.0.vm04.stdout:7/266: dwrite d4/df/d12/f1c [4194304,4194304] 0 2026-03-10T06:22:52.147 INFO:tasks.workunit.client.0.vm04.stdout:0/276: dread d0/d1a/d20/d38/f52 [0,4194304] 0 2026-03-10T06:22:52.152 INFO:tasks.workunit.client.0.vm04.stdout:3/342: mknod d4/da/df/d11/d50/d6c/d21/c6e 0 2026-03-10T06:22:52.152 INFO:tasks.workunit.client.0.vm04.stdout:3/343: chown 
d4/da/df/d11/d50/d6c/f16 97486330 1 2026-03-10T06:22:52.154 INFO:tasks.workunit.client.0.vm04.stdout:2/292: rename d1/db/d20/f2e to d1/df/f5a 0 2026-03-10T06:22:52.155 INFO:tasks.workunit.client.0.vm04.stdout:2/293: stat d1/c3a 0 2026-03-10T06:22:52.158 INFO:tasks.workunit.client.0.vm04.stdout:2/294: dwrite d1/db/f12 [0,4194304] 0 2026-03-10T06:22:52.159 INFO:tasks.workunit.client.0.vm04.stdout:6/282: link d2/d43/d2d/d30/c47 d2/d3a/d5e/c61 0 2026-03-10T06:22:52.164 INFO:tasks.workunit.client.0.vm04.stdout:5/253: mknod d4/d11/d2a/d38/d51/c56 0 2026-03-10T06:22:52.175 INFO:tasks.workunit.client.0.vm04.stdout:1/296: rename d0/c2 to d0/d3/d41/d4b/d5b/c70 0 2026-03-10T06:22:52.178 INFO:tasks.workunit.client.0.vm04.stdout:9/295: mknod d2/d8/d3a/d60/c6d 0 2026-03-10T06:22:52.180 INFO:tasks.workunit.client.0.vm04.stdout:6/283: rmdir d2 39 2026-03-10T06:22:52.182 INFO:tasks.workunit.client.0.vm04.stdout:2/295: mknod d1/df/d2c/d37/d40/c5b 0 2026-03-10T06:22:52.183 INFO:tasks.workunit.client.0.vm04.stdout:2/296: write d1/df/d2c/f4c [4391682,79421] 0 2026-03-10T06:22:52.186 INFO:tasks.workunit.client.0.vm04.stdout:2/297: dread d1/df/d11/d14/f1d [0,4194304] 0 2026-03-10T06:22:52.188 INFO:tasks.workunit.client.0.vm04.stdout:5/254: symlink d4/d6/l57 0 2026-03-10T06:22:52.188 INFO:tasks.workunit.client.0.vm04.stdout:5/255: write d4/d6/f23 [2345881,98270] 0 2026-03-10T06:22:52.190 INFO:tasks.workunit.client.0.vm04.stdout:0/277: mknod d0/d5/d25/dd/d3a/d56/c58 0 2026-03-10T06:22:52.194 INFO:tasks.workunit.client.0.vm04.stdout:8/310: write df/f11 [1359309,14879] 0 2026-03-10T06:22:52.198 INFO:tasks.workunit.client.0.vm04.stdout:0/278: dread d0/d5/f41 [0,4194304] 0 2026-03-10T06:22:52.199 INFO:tasks.workunit.client.0.vm04.stdout:3/344: mknod d4/da/df/c6f 0 2026-03-10T06:22:52.200 INFO:tasks.workunit.client.0.vm04.stdout:3/345: write d4/da/df/d11/d50/d6c/f2e [2327486,51823] 0 2026-03-10T06:22:52.206 INFO:tasks.workunit.client.0.vm04.stdout:4/249: rename d2/f1d to d2/d16/d2c/f4d 0 
2026-03-10T06:22:52.209 INFO:tasks.workunit.client.0.vm04.stdout:1/297: symlink d0/d3/d41/d4b/l71 0 2026-03-10T06:22:52.212 INFO:tasks.workunit.client.0.vm04.stdout:1/298: dwrite d0/d3/f58 [0,4194304] 0 2026-03-10T06:22:52.214 INFO:tasks.workunit.client.0.vm04.stdout:6/284: chown d2/d3a/l3d 26156 1 2026-03-10T06:22:52.215 INFO:tasks.workunit.client.0.vm04.stdout:6/285: chown d2/d43/d2d/d30/d34 1640840 1 2026-03-10T06:22:52.215 INFO:tasks.workunit.client.0.vm04.stdout:6/286: dread - d2/d43/d2d/d51/f59 zero size 2026-03-10T06:22:52.219 INFO:tasks.workunit.client.0.vm04.stdout:2/298: rmdir d1/db 39 2026-03-10T06:22:52.220 INFO:tasks.workunit.client.0.vm04.stdout:2/299: readlink d1/l28 0 2026-03-10T06:22:52.224 INFO:tasks.workunit.client.0.vm04.stdout:7/267: creat d4/df/d12/d13/d25/d28/d36/f64 x:0 0 0 2026-03-10T06:22:52.231 INFO:tasks.workunit.client.0.vm04.stdout:0/279: rename d0/d1a/d20/d3d to d0/d5/d25/dd/d1d/d59 0 2026-03-10T06:22:52.242 INFO:tasks.workunit.client.0.vm04.stdout:1/299: unlink d0/f53 0 2026-03-10T06:22:52.242 INFO:tasks.workunit.client.0.vm04.stdout:9/296: mkdir d2/d8/d53/d6e 0 2026-03-10T06:22:52.242 INFO:tasks.workunit.client.0.vm04.stdout:6/287: rmdir d2/d3a 39 2026-03-10T06:22:52.242 INFO:tasks.workunit.client.0.vm04.stdout:7/268: dread d4/df/f29 [0,4194304] 0 2026-03-10T06:22:52.243 INFO:tasks.workunit.client.0.vm04.stdout:5/256: symlink d4/d11/d2a/d52/l58 0 2026-03-10T06:22:52.248 INFO:tasks.workunit.client.0.vm04.stdout:2/300: creat d1/df/d11/d14/d4e/f5c x:0 0 0 2026-03-10T06:22:52.248 INFO:tasks.workunit.client.0.vm04.stdout:0/280: dread d0/d5/f3e [0,4194304] 0 2026-03-10T06:22:52.250 INFO:tasks.workunit.client.0.vm04.stdout:7/269: symlink d4/df/d12/d34/d63/l65 0 2026-03-10T06:22:52.252 INFO:tasks.workunit.client.0.vm04.stdout:7/270: chown d4/df/d12/d13/d25/d28/d36/f41 5273946 1 2026-03-10T06:22:52.252 INFO:tasks.workunit.client.0.vm04.stdout:5/257: creat d4/d6/d50/f59 x:0 0 0 2026-03-10T06:22:52.253 
INFO:tasks.workunit.client.0.vm04.stdout:3/346: sync 2026-03-10T06:22:52.254 INFO:tasks.workunit.client.0.vm04.stdout:9/297: sync 2026-03-10T06:22:52.256 INFO:tasks.workunit.client.0.vm04.stdout:2/301: unlink d1/df/d11/c56 0 2026-03-10T06:22:52.257 INFO:tasks.workunit.client.0.vm04.stdout:5/258: dwrite d4/d6/f23 [0,4194304] 0 2026-03-10T06:22:52.257 INFO:tasks.workunit.client.0.vm04.stdout:0/281: creat d0/d5/d25/dd/d1d/f5a x:0 0 0 2026-03-10T06:22:52.277 INFO:tasks.workunit.client.0.vm04.stdout:4/250: write d2/f4c [1641106,79658] 0 2026-03-10T06:22:52.283 INFO:tasks.workunit.client.0.vm04.stdout:9/298: truncate d2/d8/d14/f3c 822694 0 2026-03-10T06:22:52.290 INFO:tasks.workunit.client.0.vm04.stdout:1/300: dwrite d0/f4 [0,4194304] 0 2026-03-10T06:22:52.290 INFO:tasks.workunit.client.0.vm04.stdout:1/301: readlink d0/d3/l1e 0 2026-03-10T06:22:52.302 INFO:tasks.workunit.client.0.vm04.stdout:0/282: symlink d0/d5/d25/l5b 0 2026-03-10T06:22:52.306 INFO:tasks.workunit.client.0.vm04.stdout:2/302: dwrite d1/df/d11/f16 [0,4194304] 0 2026-03-10T06:22:52.318 INFO:tasks.workunit.client.0.vm04.stdout:2/303: write d1/df/d11/f29 [22990,98469] 0 2026-03-10T06:22:52.318 INFO:tasks.workunit.client.0.vm04.stdout:4/251: creat d2/d16/d31/f4e x:0 0 0 2026-03-10T06:22:52.318 INFO:tasks.workunit.client.0.vm04.stdout:2/304: dwrite d1/df/f24 [0,4194304] 0 2026-03-10T06:22:52.318 INFO:tasks.workunit.client.0.vm04.stdout:0/283: dread d0/d5/d25/f3c [0,4194304] 0 2026-03-10T06:22:52.318 INFO:tasks.workunit.client.0.vm04.stdout:0/284: readlink d0/d1a/d3f/l4c 0 2026-03-10T06:22:52.325 INFO:tasks.workunit.client.0.vm04.stdout:5/259: creat d4/d6/d48/d55/f5a x:0 0 0 2026-03-10T06:22:52.325 INFO:tasks.workunit.client.0.vm04.stdout:7/271: link d4/df/d12/d13/d25/d30/f37 d4/df/d12/d13/d25/f66 0 2026-03-10T06:22:52.327 INFO:tasks.workunit.client.0.vm04.stdout:4/252: mkdir d2/d8/d40/d4f 0 2026-03-10T06:22:52.329 INFO:tasks.workunit.client.0.vm04.stdout:9/299: dwrite d2/d23/d24/f29 [4194304,4194304] 0 
2026-03-10T06:22:52.331 INFO:tasks.workunit.client.0.vm04.stdout:9/300: truncate d2/d3/d18/d39/d11/f35 130216 0 2026-03-10T06:22:52.335 INFO:tasks.workunit.client.0.vm04.stdout:0/285: mkdir d0/d5/d25/dd/d5c 0 2026-03-10T06:22:52.336 INFO:tasks.workunit.client.0.vm04.stdout:6/288: getdents d2/d3a/d5e 0 2026-03-10T06:22:52.336 INFO:tasks.workunit.client.0.vm04.stdout:0/286: chown d0/c39 2363714 1 2026-03-10T06:22:52.339 INFO:tasks.workunit.client.0.vm04.stdout:3/347: link d4/da/df/d11/d50/d6c/d21/d2c/c44 d4/d6/d38/c70 0 2026-03-10T06:22:52.340 INFO:tasks.workunit.client.0.vm04.stdout:3/348: dread - d4/da/df/d11/d50/d6c/d21/d32/f58 zero size 2026-03-10T06:22:52.340 INFO:tasks.workunit.client.0.vm04.stdout:7/272: chown d4/df/d12/d13/d25/d28/d3a/d58/f5a 11 1 2026-03-10T06:22:52.347 INFO:tasks.workunit.client.0.vm04.stdout:4/253: dread d2/d16/f3a [0,4194304] 0 2026-03-10T06:22:52.347 INFO:tasks.workunit.client.0.vm04.stdout:2/305: mkdir d1/df/d11/d18/d35/d54/d5d 0 2026-03-10T06:22:52.348 INFO:tasks.workunit.client.0.vm04.stdout:2/306: dread - d1/df/d11/d14/f45 zero size 2026-03-10T06:22:52.349 INFO:tasks.workunit.client.0.vm04.stdout:2/307: stat d1/df/d11/d18/d48 0 2026-03-10T06:22:52.350 INFO:tasks.workunit.client.0.vm04.stdout:0/287: symlink d0/d5/d25/l5d 0 2026-03-10T06:22:52.363 INFO:tasks.workunit.client.0.vm04.stdout:1/302: link d0/c42 d0/d3/d41/d4b/c72 0 2026-03-10T06:22:52.363 INFO:tasks.workunit.client.0.vm04.stdout:8/311: rename df/d20/c2c to df/d20/d25/d30/d55/c62 0 2026-03-10T06:22:52.363 INFO:tasks.workunit.client.0.vm04.stdout:8/312: write df/d15/d2b/f4a [6488386,82174] 0 2026-03-10T06:22:52.363 INFO:tasks.workunit.client.0.vm04.stdout:4/254: write d2/d16/f3a [3143888,116821] 0 2026-03-10T06:22:52.363 INFO:tasks.workunit.client.0.vm04.stdout:4/255: dread d2/d16/d2c/f2e [0,4194304] 0 2026-03-10T06:22:52.367 INFO:tasks.workunit.client.0.vm04.stdout:9/301: mknod d2/d8/d53/d6e/c6f 0 2026-03-10T06:22:52.368 INFO:tasks.workunit.client.0.vm04.stdout:2/308: write 
d1/db/fe [3434730,129619] 0 2026-03-10T06:22:52.369 INFO:tasks.workunit.client.0.vm04.stdout:2/309: dread - d1/df/d2c/f4a zero size 2026-03-10T06:22:52.371 INFO:tasks.workunit.client.0.vm04.stdout:1/303: symlink d0/l73 0 2026-03-10T06:22:52.372 INFO:tasks.workunit.client.0.vm04.stdout:1/304: chown d0/d8/c3d 615722 1 2026-03-10T06:22:52.376 INFO:tasks.workunit.client.0.vm04.stdout:6/289: sync 2026-03-10T06:22:52.376 INFO:tasks.workunit.client.0.vm04.stdout:8/313: creat df/d15/d2b/f63 x:0 0 0 2026-03-10T06:22:52.377 INFO:tasks.workunit.client.0.vm04.stdout:0/288: dwrite d0/d1a/f2f [0,4194304] 0 2026-03-10T06:22:52.389 INFO:tasks.workunit.client.0.vm04.stdout:6/290: read d2/d43/d2d/f42 [259147,49507] 0 2026-03-10T06:22:52.397 INFO:tasks.workunit.client.0.vm04.stdout:1/305: dread d0/f23 [0,4194304] 0 2026-03-10T06:22:52.398 INFO:tasks.workunit.client.0.vm04.stdout:8/314: fsync df/d15/d2b/f4a 0 2026-03-10T06:22:52.399 INFO:tasks.workunit.client.0.vm04.stdout:8/315: fdatasync df/f17 0 2026-03-10T06:22:52.403 INFO:tasks.workunit.client.0.vm04.stdout:1/306: dwrite d0/d3/f61 [0,4194304] 0 2026-03-10T06:22:52.405 INFO:tasks.workunit.client.0.vm04.stdout:3/349: mknod d4/da/df/d11/d50/d6c/c71 0 2026-03-10T06:22:52.405 INFO:tasks.workunit.client.0.vm04.stdout:5/260: link d4/ff d4/d11/d2a/f5b 0 2026-03-10T06:22:52.405 INFO:tasks.workunit.client.0.vm04.stdout:4/256: rename d2/d8/f1a to d2/d16/d31/f50 0 2026-03-10T06:22:52.428 INFO:tasks.workunit.client.0.vm04.stdout:6/291: mknod d2/d43/d2d/d30/d34/c62 0 2026-03-10T06:22:52.429 INFO:tasks.workunit.client.0.vm04.stdout:6/292: dread - d2/d43/d2d/d51/f59 zero size 2026-03-10T06:22:52.437 INFO:tasks.workunit.client.0.vm04.stdout:1/307: dwrite d0/d8/f32 [4194304,4194304] 0 2026-03-10T06:22:52.448 INFO:tasks.workunit.client.0.vm04.stdout:3/350: symlink d4/da/df/d11/d50/d6c/d21/d32/d39/l72 0 2026-03-10T06:22:52.449 INFO:tasks.workunit.client.0.vm04.stdout:3/351: read d4/d6/dc/f22 [775049,20264] 0 2026-03-10T06:22:52.450 
INFO:tasks.workunit.client.0.vm04.stdout:2/310: symlink d1/df/d11/d18/d35/d54/d5d/l5e 0
2026-03-10T06:22:52.451 INFO:tasks.workunit.client.0.vm04.stdout:2/311: chown d1/f10 108 1
2026-03-10T06:22:52.461 INFO:tasks.workunit.client.0.vm04.stdout:7/273: getdents d4/df/d12/d34 0
2026-03-10T06:22:52.464 INFO:tasks.workunit.client.0.vm04.stdout:5/261: chown d4/ff 1734296 1
2026-03-10T06:22:52.468 INFO:tasks.workunit.client.0.vm04.stdout:3/352: creat d4/da/df/d11/d50/d6c/d21/d32/d4e/f73 x:0 0 0
2026-03-10T06:22:52.469 INFO:tasks.workunit.client.0.vm04.stdout:9/302: link d2/d8/d14/l3d d2/d8/d5d/l70 0
2026-03-10T06:22:52.475 INFO:tasks.workunit.client.0.vm04.stdout:2/312: fsync d1/db/f36 0
2026-03-10T06:22:52.476 INFO:tasks.workunit.client.0.vm04.stdout:2/313: chown d1 0 1
2026-03-10T06:22:52.476 INFO:tasks.workunit.client.0.vm04.stdout:2/314: chown d1/df 0 1
2026-03-10T06:22:52.479 INFO:tasks.workunit.client.0.vm04.stdout:0/289: link d0/d1a/l1e d0/d5/l5e 0
2026-03-10T06:22:52.483 INFO:tasks.workunit.client.0.vm04.stdout:8/316: creat df/d20/f64 x:0 0 0
2026-03-10T06:22:52.484 INFO:tasks.workunit.client.0.vm04.stdout:8/317: chown df/d15/d29/c32 101953638 1
2026-03-10T06:22:52.484 INFO:tasks.workunit.client.0.vm04.stdout:8/318: dread - df/d20/d25/f2a zero size
2026-03-10T06:22:52.488 INFO:tasks.workunit.client.0.vm04.stdout:8/319: dwrite df/d20/d25/d30/f4e [0,4194304] 0
2026-03-10T06:22:52.492 INFO:tasks.workunit.client.0.vm04.stdout:6/293: dwrite d2/f5f [0,4194304] 0
2026-03-10T06:22:52.493 INFO:tasks.workunit.client.0.vm04.stdout:6/294: read - d2/d43/d2d/d30/d1f/f3f zero size
2026-03-10T06:22:52.500 INFO:tasks.workunit.client.0.vm04.stdout:5/262: chown d4/c46 428 1
2026-03-10T06:22:52.503 INFO:tasks.workunit.client.0.vm04.stdout:5/263: dwrite d4/d6/f20 [0,4194304] 0
2026-03-10T06:22:52.506 INFO:tasks.workunit.client.0.vm04.stdout:0/290: creat d0/d5/d25/f5f x:0 0 0
2026-03-10T06:22:52.511 INFO:tasks.workunit.client.0.vm04.stdout:8/320: dwrite df/d15/d2b/f33 [0,4194304] 0
2026-03-10T06:22:52.512 INFO:tasks.workunit.client.0.vm04.stdout:0/291: dread d0/d5/f41 [0,4194304] 0
2026-03-10T06:22:52.522 INFO:tasks.workunit.client.0.vm04.stdout:4/257: getdents d2/d8 0
2026-03-10T06:22:52.522 INFO:tasks.workunit.client.0.vm04.stdout:5/264: rename d4/c29 to d4/d6/d50/c5c 0
2026-03-10T06:22:52.523 INFO:tasks.workunit.client.0.vm04.stdout:4/258: fsync d2/f14 0
2026-03-10T06:22:52.523 INFO:tasks.workunit.client.0.vm04.stdout:1/308: getdents d0/d3/d41 0
2026-03-10T06:22:52.523 INFO:tasks.workunit.client.0.vm04.stdout:1/309: chown d0/d3/f4e 145929 1
2026-03-10T06:22:52.524 INFO:tasks.workunit.client.0.vm04.stdout:1/310: fsync d0/d8/f69 0
2026-03-10T06:22:52.526 INFO:tasks.workunit.client.0.vm04.stdout:0/292: creat d0/d5/d25/dd/d3a/f60 x:0 0 0
2026-03-10T06:22:52.528 INFO:tasks.workunit.client.0.vm04.stdout:8/321: mkdir df/d20/d25/d30/d65 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:7/274: link d4/f5 d4/df/d12/d13/d25/d28/d36/f67 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:8/322: write df/f17 [2266907,79931] 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:8/323: unlink df/d20/d25/l41 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:8/324: write df/d15/f1b [1081670,31941] 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:7/275: mkdir d4/df/d12/d13/d25/d28/d3a/d58/d68 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:6/295: creat d2/d43/d2d/d30/f63 x:0 0 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:3/353: getdents d4/d6 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:9/303: getdents d2 0
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:3/354: chown d4/da/df/d11/c56 1 1
2026-03-10T06:22:52.539 INFO:tasks.workunit.client.0.vm04.stdout:3/355: write d4/f49 [115884,104308] 0
2026-03-10T06:22:52.543 INFO:tasks.workunit.client.0.vm04.stdout:6/296: creat d2/d3a/d5e/f64 x:0 0 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:9/304: dwrite d2/d3/d18/d39/f2e [0,4194304] 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:3/356: rmdir d4/da/df/d11/d50/d6c 39
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:6/297: readlink d2/d43/d2d/d30/d1f/l53 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:9/305: rmdir d2/d8/d14 39
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:5/265: getdents d4/d11 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:9/306: readlink d2/d8/d3a/l50 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:5/266: read d4/d6/f20 [3674992,74908] 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:9/307: dwrite d2/f1e [0,4194304] 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:8/325: rename df/d53/f57 to df/f66 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:3/357: symlink d4/d6/l74 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:7/276: link d4/df/d12/d13/l17 d4/df/d12/d13/l69 0
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:3/358: chown d4/c46 3 1
2026-03-10T06:22:52.557 INFO:tasks.workunit.client.0.vm04.stdout:6/298: creat d2/d43/d2d/d30/d1f/d3c/f65 x:0 0 0
2026-03-10T06:22:52.559 INFO:tasks.workunit.client.0.vm04.stdout:5/267: creat d4/d6/d48/d55/f5d x:0 0 0
2026-03-10T06:22:52.561 INFO:tasks.workunit.client.0.vm04.stdout:5/268: readlink d4/d11/d2a/d52/l58 0
2026-03-10T06:22:52.561 INFO:tasks.workunit.client.0.vm04.stdout:8/326: dwrite df/d20/d25/d30/f4e [0,4194304] 0
2026-03-10T06:22:52.570 INFO:tasks.workunit.client.0.vm04.stdout:1/311: sync
2026-03-10T06:22:52.575 INFO:tasks.workunit.client.0.vm04.stdout:8/327: rmdir df/d15 39
2026-03-10T06:22:52.580 INFO:tasks.workunit.client.0.vm04.stdout:8/328: write df/d15/d29/f3c [809952,12613] 0
2026-03-10T06:22:52.629 INFO:tasks.workunit.client.0.vm04.stdout:6/299: dread d2/f14 [4194304,4194304] 0
2026-03-10T06:22:52.637 INFO:tasks.workunit.client.0.vm04.stdout:6/300: dread d2/d8/f11 [0,4194304] 0
2026-03-10T06:22:52.639 INFO:tasks.workunit.client.0.vm04.stdout:6/301: mknod d2/d43/d2d/d30/d1f/c66 0
2026-03-10T06:22:52.640 INFO:tasks.workunit.client.0.vm04.stdout:6/302: truncate d2/d43/d2d/d30/f2b 4306161 0
2026-03-10T06:22:52.642 INFO:tasks.workunit.client.0.vm04.stdout:6/303: mknod d2/d3a/c67 0
2026-03-10T06:22:52.645 INFO:tasks.workunit.client.0.vm04.stdout:6/304: symlink d2/l68 0
2026-03-10T06:22:52.648 INFO:tasks.workunit.client.0.vm04.stdout:6/305: dwrite d2/d43/d2d/d30/d34/f4d [0,4194304] 0
2026-03-10T06:22:52.649 INFO:tasks.workunit.client.0.vm04.stdout:6/306: write d2/d43/d2d/d30/d1f/d3c/f65 [681843,116491] 0
2026-03-10T06:22:52.658 INFO:tasks.workunit.client.0.vm04.stdout:6/307: creat d2/d43/f69 x:0 0 0
2026-03-10T06:22:52.659 INFO:tasks.workunit.client.0.vm04.stdout:6/308: write d2/d3a/f57 [840443,102772] 0
2026-03-10T06:22:52.667 INFO:tasks.workunit.client.0.vm04.stdout:2/315: truncate d1/db/f36 1232780 0
2026-03-10T06:22:52.667 INFO:tasks.workunit.client.0.vm04.stdout:2/316: stat d1/df/d2c/l43 0
2026-03-10T06:22:52.668 INFO:tasks.workunit.client.0.vm04.stdout:2/317: chown d1/l46 530811317 1
2026-03-10T06:22:52.669 INFO:tasks.workunit.client.0.vm04.stdout:6/309: unlink d2/d3a/d5e/c61 0
2026-03-10T06:22:52.669 INFO:tasks.workunit.client.0.vm04.stdout:6/310: chown d2/d43/f3b 1375 1
2026-03-10T06:22:52.672 INFO:tasks.workunit.client.0.vm04.stdout:2/318: creat d1/df/d2c/f5f x:0 0 0
2026-03-10T06:22:52.674 INFO:tasks.workunit.client.0.vm04.stdout:4/259: truncate d2/f14 4190964 0
2026-03-10T06:22:52.674 INFO:tasks.workunit.client.0.vm04.stdout:4/260: chown d2/d8/d40/d4f 166825 1
2026-03-10T06:22:52.676 INFO:tasks.workunit.client.0.vm04.stdout:6/311: rename d2/d3a/f40 to d2/d43/d2d/d30/d1f/d3c/f6a 0
2026-03-10T06:22:52.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:52 vm04.local ceph-mon[51058]: Upgrade: Updating prometheus.vm04
2026-03-10T06:22:52.680 INFO:tasks.workunit.client.0.vm04.stdout:6/312: dread d2/d8/f11 [0,4194304] 0
2026-03-10T06:22:52.680 INFO:tasks.workunit.client.0.vm04.stdout:4/261: dwrite d2/d16/f20 [0,4194304] 0
2026-03-10T06:22:52.691 INFO:tasks.workunit.client.0.vm04.stdout:0/293: dwrite d0/d1a/d20/d38/f52 [0,4194304] 0
2026-03-10T06:22:52.704 INFO:tasks.workunit.client.0.vm04.stdout:5/269: write d4/d11/d2a/f5b [3467448,78508] 0
2026-03-10T06:22:52.708 INFO:tasks.workunit.client.0.vm04.stdout:9/308: dwrite d2/d23/f31 [0,4194304] 0
2026-03-10T06:22:52.710 INFO:tasks.workunit.client.0.vm04.stdout:5/270: dread - d4/d6/d48/d55/f5a zero size
2026-03-10T06:22:52.714 INFO:tasks.workunit.client.0.vm04.stdout:8/329: dwrite df/d20/f22 [0,4194304] 0
2026-03-10T06:22:52.715 INFO:tasks.workunit.client.0.vm04.stdout:7/277: dwrite d4/f5 [0,4194304] 0
2026-03-10T06:22:52.721 INFO:tasks.workunit.client.0.vm04.stdout:1/312: dwrite d0/d3/f61 [4194304,4194304] 0
2026-03-10T06:22:52.729 INFO:tasks.workunit.client.0.vm04.stdout:2/319: rename d1/df/d2c/d37/d40/f42 to d1/df/d11/d14/d4e/f60 0
2026-03-10T06:22:52.729 INFO:tasks.workunit.client.0.vm04.stdout:0/294: creat d0/d1a/d3f/f61 x:0 0 0
2026-03-10T06:22:52.729 INFO:tasks.workunit.client.0.vm04.stdout:3/359: dwrite f1 [4194304,4194304] 0
2026-03-10T06:22:52.735 INFO:tasks.workunit.client.0.vm04.stdout:5/271: dread d4/d11/d2a/f44 [0,4194304] 0
2026-03-10T06:22:52.736 INFO:tasks.workunit.client.0.vm04.stdout:0/295: write d0/f1b [6998642,36852] 0
2026-03-10T06:22:52.751 INFO:tasks.workunit.client.0.vm04.stdout:9/309: dread d2/d3/f12 [0,4194304] 0
2026-03-10T06:22:52.753 INFO:tasks.workunit.client.0.vm04.stdout:4/262: mkdir d2/d8/d40/d4f/d51 0
2026-03-10T06:22:52.763 INFO:tasks.workunit.client.0.vm04.stdout:7/278: rename d4/df/c61 to d4/df/d12/d13/d25/d28/d3a/d58/c6a 0
2026-03-10T06:22:52.773 INFO:tasks.workunit.client.0.vm04.stdout:5/272: creat d4/d6/d48/f5e x:0 0 0
2026-03-10T06:22:52.775 INFO:tasks.workunit.client.0.vm04.stdout:0/296: mknod d0/d5/d25/dd/d3a/d56/c62 0
2026-03-10T06:22:52.780 INFO:tasks.workunit.client.0.vm04.stdout:9/310: dread d2/d23/d24/f2b [0,4194304] 0
2026-03-10T06:22:52.783 INFO:tasks.workunit.client.0.vm04.stdout:4/263: creat d2/d16/d31/d3f/f52 x:0 0 0
2026-03-10T06:22:52.785 INFO:tasks.workunit.client.0.vm04.stdout:5/273: mknod d4/d6/d48/c5f 0
2026-03-10T06:22:52.786 INFO:tasks.workunit.client.0.vm04.stdout:5/274: write d4/d11/f2f [2061260,20183] 0
2026-03-10T06:22:52.788 INFO:tasks.workunit.client.0.vm04.stdout:3/360: creat d4/da/df/d11/d50/d6c/d21/d32/d39/d64/f75 x:0 0 0
2026-03-10T06:22:52.789 INFO:tasks.workunit.client.0.vm04.stdout:0/297: mkdir d0/d5/d25/dd/d1d/d59/d63 0
2026-03-10T06:22:52.790 INFO:tasks.workunit.client.0.vm04.stdout:0/298: read d0/d5/d25/dd/d3a/f50 [1916362,119625] 0
2026-03-10T06:22:52.792 INFO:tasks.workunit.client.0.vm04.stdout:0/299: dread d0/f14 [0,4194304] 0
2026-03-10T06:22:52.792 INFO:tasks.workunit.client.0.vm04.stdout:0/300: stat d0/d1a/d4d 0
2026-03-10T06:22:52.793 INFO:tasks.workunit.client.0.vm04.stdout:0/301: write d0/d1a/d20/d38/f52 [1651744,107191] 0
2026-03-10T06:22:52.794 INFO:tasks.workunit.client.0.vm04.stdout:0/302: write d0/d1a/d3f/f53 [89994,99225] 0
2026-03-10T06:22:52.794 INFO:tasks.workunit.client.0.vm04.stdout:0/303: truncate d0/f16 1564008 0
2026-03-10T06:22:52.795 INFO:tasks.workunit.client.0.vm04.stdout:9/311: rename d2/d3/d18/d39/d11/f25 to d2/d3/d18/d39/d11/f71 0
2026-03-10T06:22:52.796 INFO:tasks.workunit.client.0.vm04.stdout:9/312: fdatasync d2/d3/d18/d39/d11/f35 0
2026-03-10T06:22:52.797 INFO:tasks.workunit.client.0.vm04.stdout:6/313: getdents d2/d43/d2d/d30/d1f 0
2026-03-10T06:22:52.798 INFO:tasks.workunit.client.0.vm04.stdout:0/304: dread d0/d5/d25/dd/f43 [0,4194304] 0
2026-03-10T06:22:52.798 INFO:tasks.workunit.client.0.vm04.stdout:0/305: readlink d0/d5/d25/dd/d1d/l37 0
2026-03-10T06:22:52.804 INFO:tasks.workunit.client.0.vm04.stdout:4/264: symlink d2/d16/d31/d3f/l53 0
2026-03-10T06:22:52.804 INFO:tasks.workunit.client.0.vm04.stdout:4/265: readlink d2/d32/l36 0
2026-03-10T06:22:52.808 INFO:tasks.workunit.client.0.vm04.stdout:4/266: dread d2/d16/f3a [0,4194304] 0
2026-03-10T06:22:52.814 INFO:tasks.workunit.client.0.vm04.stdout:5/275: dwrite d4/d11/f1f [4194304,4194304] 0
2026-03-10T06:22:52.814 INFO:tasks.workunit.client.0.vm04.stdout:5/276: stat d4/d3b 0
2026-03-10T06:22:52.815 INFO:tasks.workunit.client.0.vm04.stdout:3/361: creat d4/da/df/d11/d4a/f76 x:0 0 0
2026-03-10T06:22:52.815 INFO:tasks.workunit.client.0.vm04.stdout:7/279: sync
2026-03-10T06:22:52.819 INFO:tasks.workunit.client.0.vm04.stdout:8/330: getdents df/d15 0
2026-03-10T06:22:52.822 INFO:tasks.workunit.client.0.vm04.stdout:1/313: getdents d0/d8/d46 0
2026-03-10T06:22:52.824 INFO:tasks.workunit.client.0.vm04.stdout:4/267: creat d2/d8/f54 x:0 0 0
2026-03-10T06:22:52.825 INFO:tasks.workunit.client.0.vm04.stdout:5/277: symlink d4/d11/l60 0
2026-03-10T06:22:52.827 INFO:tasks.workunit.client.0.vm04.stdout:6/314: sync
2026-03-10T06:22:52.838 INFO:tasks.workunit.client.0.vm04.stdout:7/280: rmdir d4/df/d12/d13/d25/d28/d36 39
2026-03-10T06:22:52.839 INFO:tasks.workunit.client.0.vm04.stdout:8/331: mkdir df/d53/d67 0
2026-03-10T06:22:52.842 INFO:tasks.workunit.client.0.vm04.stdout:9/313: rename d2/d3/d18/d39/l52 to d2/d8/d14/l72 0
2026-03-10T06:22:52.842 INFO:tasks.workunit.client.0.vm04.stdout:1/314: sync
2026-03-10T06:22:52.845 INFO:tasks.workunit.client.0.vm04.stdout:9/314: dread d2/f1e [0,4194304] 0
2026-03-10T06:22:52.847 INFO:tasks.workunit.client.0.vm04.stdout:1/315: dwrite d0/f23 [0,4194304] 0
2026-03-10T06:22:52.854 INFO:tasks.workunit.client.0.vm04.stdout:4/268: rename d2/d46/f28 to d2/d16/d2c/f55 0
2026-03-10T06:22:52.860 INFO:tasks.workunit.client.0.vm04.stdout:4/269: stat d2/d16/d31/d3f/f52 0
2026-03-10T06:22:52.860 INFO:tasks.workunit.client.0.vm04.stdout:5/278: link d4/d6/f33 d4/d6/d50/f61 0
2026-03-10T06:22:52.860 INFO:tasks.workunit.client.0.vm04.stdout:3/362: creat d4/da/df/d11/d50/d6c/f77 x:0 0 0
2026-03-10T06:22:52.860 INFO:tasks.workunit.client.0.vm04.stdout:3/363: chown d4/da/df/d11/d62 3 1
2026-03-10T06:22:52.864 INFO:tasks.workunit.client.0.vm04.stdout:6/315: rename d2/d8/c19 to d2/d43/d2d/d30/d34/c6b 0
2026-03-10T06:22:52.866 INFO:tasks.workunit.client.0.vm04.stdout:1/316: rename d0/d8/c48 to d0/c74 0
2026-03-10T06:22:52.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:52 vm06.local ceph-mon[58974]: Upgrade: Updating prometheus.vm04
2026-03-10T06:22:52.867 INFO:tasks.workunit.client.0.vm04.stdout:6/316: dread d2/d43/d2d/d30/d1f/d3c/f65 [0,4194304] 0
2026-03-10T06:22:52.870 INFO:tasks.workunit.client.0.vm04.stdout:7/281: dread d4/df/d12/d13/d25/f4b [0,4194304] 0
2026-03-10T06:22:52.873 INFO:tasks.workunit.client.0.vm04.stdout:0/306: getdents d0/d1a 0
2026-03-10T06:22:52.874 INFO:tasks.workunit.client.0.vm04.stdout:7/282: dwrite d4/df/d12/d13/d25/f2f [0,4194304] 0
2026-03-10T06:22:52.877 INFO:tasks.workunit.client.0.vm04.stdout:5/279: dread d4/d3b/f54 [0,4194304] 0
2026-03-10T06:22:52.878 INFO:tasks.workunit.client.0.vm04.stdout:1/317: unlink d0/d3/f58 0
2026-03-10T06:22:52.879 INFO:tasks.workunit.client.0.vm04.stdout:1/318: read d0/d8/f43 [189186,117268] 0
2026-03-10T06:22:52.879 INFO:tasks.workunit.client.0.vm04.stdout:1/319: write d0/d3/f19 [2616947,73548] 0
2026-03-10T06:22:52.886 INFO:tasks.workunit.client.0.vm04.stdout:9/315: getdents d2/d3 0
2026-03-10T06:22:52.887 INFO:tasks.workunit.client.0.vm04.stdout:9/316: read d2/d3/d18/d34/f5f [3607326,125447] 0
2026-03-10T06:22:52.889 INFO:tasks.workunit.client.0.vm04.stdout:0/307: truncate d0/d1a/f27 2488888 0
2026-03-10T06:22:52.894 INFO:tasks.workunit.client.0.vm04.stdout:5/280: creat d4/d6/d37/f62 x:0 0 0
2026-03-10T06:22:52.896 INFO:tasks.workunit.client.0.vm04.stdout:1/320: rename d0/d8/d46/f4d to d0/d3/d41/f75 0
2026-03-10T06:22:52.898 INFO:tasks.workunit.client.0.vm04.stdout:0/308: read d0/d1a/f2f [3247060,52582] 0
2026-03-10T06:22:52.899 INFO:tasks.workunit.client.0.vm04.stdout:7/283: truncate d4/df/d12/f20 291125 0
2026-03-10T06:22:52.900 INFO:tasks.workunit.client.0.vm04.stdout:7/284: write d4/df/d12/d34/f57 [90710,6141] 0
2026-03-10T06:22:52.901 INFO:tasks.workunit.client.0.vm04.stdout:5/281: creat d4/d6/d50/f63 x:0 0 0
2026-03-10T06:22:52.904 INFO:tasks.workunit.client.0.vm04.stdout:9/317: mkdir d2/d8/d14/d1d/d64/d73 0
2026-03-10T06:22:52.905 INFO:tasks.workunit.client.0.vm04.stdout:9/318: readlink d2/d3/d18/d39/d11/l62 0
2026-03-10T06:22:52.916 INFO:tasks.workunit.client.0.vm04.stdout:7/285: unlink d4/df/d12/c47 0
2026-03-10T06:22:52.918 INFO:tasks.workunit.client.0.vm04.stdout:1/321: link d0/d3/f62 d0/d8/f76 0
2026-03-10T06:22:52.926 INFO:tasks.workunit.client.0.vm04.stdout:7/286: creat d4/df/d12/d21/f6b x:0 0 0
2026-03-10T06:22:52.926 INFO:tasks.workunit.client.0.vm04.stdout:7/287: chown d4/df/d12/f14 83629219 1
2026-03-10T06:22:52.928 INFO:tasks.workunit.client.0.vm04.stdout:7/288: dread d4/df/d12/f1c [4194304,4194304] 0
2026-03-10T06:22:52.936 INFO:tasks.workunit.client.0.vm04.stdout:7/289: truncate d4/df/d12/d13/d25/d28/d36/f64 348852 0
2026-03-10T06:22:52.938 INFO:tasks.workunit.client.0.vm04.stdout:9/319: getdents d2/d23/d24/d5a 0
2026-03-10T06:22:52.938 INFO:tasks.workunit.client.0.vm04.stdout:9/320: dread - d2/d8/d5d/f69 zero size
2026-03-10T06:22:52.939 INFO:tasks.workunit.client.0.vm04.stdout:9/321: write d2/d3/d18/d39/d46/f4b [767546,49364] 0
2026-03-10T06:22:52.946 INFO:tasks.workunit.client.0.vm04.stdout:7/290: symlink d4/df/d12/d13/d25/d30/d40/l6c 0
2026-03-10T06:22:52.946 INFO:tasks.workunit.client.0.vm04.stdout:7/291: write d4/df/d12/d13/f27 [334441,129089] 0
2026-03-10T06:22:52.951 INFO:tasks.workunit.client.0.vm04.stdout:9/322: symlink d2/d3/d18/l74 0
2026-03-10T06:22:52.956 INFO:tasks.workunit.client.0.vm04.stdout:9/323: creat d2/d8/d22/f75 x:0 0 0
2026-03-10T06:22:52.960 INFO:tasks.workunit.client.0.vm04.stdout:9/324: dwrite d2/d8/d5d/f69 [0,4194304] 0
2026-03-10T06:22:52.970 INFO:tasks.workunit.client.0.vm04.stdout:9/325: mknod d2/d3/d18/d39/d11/d42/c76 0
2026-03-10T06:22:52.974 INFO:tasks.workunit.client.0.vm04.stdout:9/326: dwrite d2/d8/f66 [0,4194304] 0
2026-03-10T06:22:52.982 INFO:tasks.workunit.client.0.vm04.stdout:2/320: getdents d1/df/d11/d14/d4e 0
2026-03-10T06:22:52.982 INFO:tasks.workunit.client.0.vm04.stdout:2/321: unlink d1/db/lc 0
2026-03-10T06:22:52.993 INFO:tasks.workunit.client.0.vm04.stdout:2/322: rename d1/df/d11/l4f to d1/df/d2c/d37/d40/l61 0
2026-03-10T06:22:52.994 INFO:tasks.workunit.client.0.vm04.stdout:2/323: write d1/df/d11/d18/f53 [819090,20690] 0
2026-03-10T06:22:52.995 INFO:tasks.workunit.client.0.vm04.stdout:2/324: chown d1/db/d20/f49 6137 1
2026-03-10T06:22:52.999 INFO:tasks.workunit.client.0.vm04.stdout:2/325: creat d1/df/d11/d18/d48/f62 x:0 0 0
2026-03-10T06:22:52.999 INFO:tasks.workunit.client.0.vm04.stdout:2/326: write d1/df/f24 [1417669,74349] 0
2026-03-10T06:22:53.000 INFO:tasks.workunit.client.0.vm04.stdout:2/327: truncate d1/df/d11/d18/f53 1204573 0
2026-03-10T06:22:53.015 INFO:tasks.workunit.client.0.vm04.stdout:2/328: dread d1/df/f5a [0,4194304] 0
2026-03-10T06:22:53.019 INFO:tasks.workunit.client.0.vm04.stdout:2/329: rename d1/df/d11/d18/f3e to d1/df/f63 0
2026-03-10T06:22:53.019 INFO:tasks.workunit.client.0.vm04.stdout:2/330: chown d1/df/d2c/d37 553651405 1
2026-03-10T06:22:53.023 INFO:tasks.workunit.client.0.vm04.stdout:2/331: dwrite d1/f57 [0,4194304] 0
2026-03-10T06:22:53.024 INFO:tasks.workunit.client.0.vm04.stdout:2/332: chown d1/df/f5a 2005368 1
2026-03-10T06:22:53.026 INFO:tasks.workunit.client.0.vm04.stdout:8/332: write df/d20/d25/d30/d55/f5b [256564,66136] 0
2026-03-10T06:22:53.028 INFO:tasks.workunit.client.0.vm04.stdout:8/333: write df/f1d [1354258,99037] 0
2026-03-10T06:22:53.038 INFO:tasks.workunit.client.0.vm04.stdout:2/333: creat d1/df/d2c/d37/d40/f64 x:0 0 0
2026-03-10T06:22:53.040 INFO:tasks.workunit.client.0.vm04.stdout:6/317: getdents d2/d8 0
2026-03-10T06:22:53.040 INFO:tasks.workunit.client.0.vm04.stdout:6/318: dread - d2/d43/f4b zero size
2026-03-10T06:22:53.044 INFO:tasks.workunit.client.0.vm04.stdout:6/319: dwrite d2/d43/d2d/d51/f59 [0,4194304] 0
2026-03-10T06:22:53.046 INFO:tasks.workunit.client.0.vm04.stdout:6/320: dread - d2/d43/d2d/d30/d1f/d3c/f6a zero size
2026-03-10T06:22:53.047 INFO:tasks.workunit.client.0.vm04.stdout:6/321: write d2/d43/d2d/d51/f59 [473700,119569] 0
2026-03-10T06:22:53.051 INFO:tasks.workunit.client.0.vm04.stdout:4/270: truncate d2/d16/d2c/f55 477127 0
2026-03-10T06:22:53.056 INFO:tasks.workunit.client.0.vm04.stdout:8/334: symlink df/d20/d25/l68 0
2026-03-10T06:22:53.056 INFO:tasks.workunit.client.0.vm04.stdout:3/364: truncate d4/da/df/d11/d50/d6c/f4b 121096 0
2026-03-10T06:22:53.057 INFO:tasks.workunit.client.0.vm04.stdout:5/282: mknod d4/d11/d2a/d52/c64 0
2026-03-10T06:22:53.060 INFO:tasks.workunit.client.0.vm04.stdout:3/365: dwrite d4/da/df/d11/d50/d6c/d21/f6d [0,4194304] 0
2026-03-10T06:22:53.074 INFO:tasks.workunit.client.0.vm04.stdout:8/335: creat df/d15/f69 x:0 0 0
2026-03-10T06:22:53.075 INFO:tasks.workunit.client.0.vm04.stdout:5/283: mkdir d4/d11/d2a/d65 0
2026-03-10T06:22:53.078 INFO:tasks.workunit.client.0.vm04.stdout:5/284: dwrite d4/d11/d2a/f36 [0,4194304] 0
2026-03-10T06:22:53.082 INFO:tasks.workunit.client.0.vm04.stdout:5/285: dwrite d4/d6/d37/f62 [0,4194304] 0
2026-03-10T06:22:53.092 INFO:tasks.workunit.client.0.vm04.stdout:3/366: creat d4/d6/d38/f78 x:0 0 0
2026-03-10T06:22:53.093 INFO:tasks.workunit.client.0.vm04.stdout:9/327: getdents d2/d8/d14/d1d/d64 0
2026-03-10T06:22:53.094 INFO:tasks.workunit.client.0.vm04.stdout:2/334: link d1/f57 d1/df/d11/d18/d35/d54/d5d/f65 0
2026-03-10T06:22:53.096 INFO:tasks.workunit.client.0.vm04.stdout:6/322: mknod d2/d43/d2d/c6c 0
2026-03-10T06:22:53.096 INFO:tasks.workunit.client.0.vm04.stdout:6/323: chown d2/l1b 38910 1
2026-03-10T06:22:53.097 INFO:tasks.workunit.client.0.vm04.stdout:1/322: write d0/d3/d41/d4b/d5b/f5c [772208,123014] 0
2026-03-10T06:22:53.100 INFO:tasks.workunit.client.0.vm04.stdout:0/309: dwrite d0/d5/d25/dd/d1d/d59/f48 [0,4194304] 0
2026-03-10T06:22:53.105 INFO:tasks.workunit.client.0.vm04.stdout:5/286: creat d4/d6/f66 x:0 0 0
2026-03-10T06:22:53.107 INFO:tasks.workunit.client.0.vm04.stdout:5/287: readlink d4/d6/l4a 0
2026-03-10T06:22:53.107 INFO:tasks.workunit.client.0.vm04.stdout:3/367: mkdir d4/da/df/d11/d50/d6c/d21/d2c/d79 0
2026-03-10T06:22:53.109 INFO:tasks.workunit.client.0.vm04.stdout:7/292: dwrite d4/df/d12/d13/d25/f66 [0,4194304] 0
2026-03-10T06:22:53.116 INFO:tasks.workunit.client.0.vm04.stdout:6/324: rename d2/d43/fa to d2/d43/d2d/d30/d34/f6d 0
2026-03-10T06:22:53.118 INFO:tasks.workunit.client.0.vm04.stdout:1/323: symlink d0/d8/d46/l77 0
2026-03-10T06:22:53.121 INFO:tasks.workunit.client.0.vm04.stdout:4/271: sync
2026-03-10T06:22:53.124 INFO:tasks.workunit.client.0.vm04.stdout:4/272: dwrite d2/d8/d40/f41 [0,4194304] 0
2026-03-10T06:22:53.128 INFO:tasks.workunit.client.0.vm04.stdout:8/336: symlink df/d20/l6a 0
2026-03-10T06:22:53.129 INFO:tasks.workunit.client.0.vm04.stdout:5/288: rename d4/d6/f66 to d4/d6/f67 0
2026-03-10T06:22:53.133 INFO:tasks.workunit.client.0.vm04.stdout:7/293: chown d4/df/d12/c2d 139857 1
2026-03-10T06:22:53.136 INFO:tasks.workunit.client.0.vm04.stdout:7/294: dwrite d4/fb [0,4194304] 0
2026-03-10T06:22:53.145 INFO:tasks.workunit.client.0.vm04.stdout:7/295: readlink d4/df/d12/d13/d25/l2b 0
2026-03-10T06:22:53.145 INFO:tasks.workunit.client.0.vm04.stdout:7/296: dwrite d4/df/d12/d13/d25/f2f [0,4194304] 0
2026-03-10T06:22:53.145 INFO:tasks.workunit.client.0.vm04.stdout:6/325: mkdir d2/d37/d6e 0
2026-03-10T06:22:53.145 INFO:tasks.workunit.client.0.vm04.stdout:7/297: write d4/df/d12/d34/f57 [679733,108994] 0
2026-03-10T06:22:53.148 INFO:tasks.workunit.client.0.vm04.stdout:1/324: mknod d0/d8/d46/c78 0
2026-03-10T06:22:53.160 INFO:tasks.workunit.client.0.vm04.stdout:4/273: unlink d2/f30 0
2026-03-10T06:22:53.162 INFO:tasks.workunit.client.0.vm04.stdout:8/337: chown l0 1985 1
2026-03-10T06:22:53.167 INFO:tasks.workunit.client.0.vm04.stdout:8/338: dwrite df/d15/f43 [0,4194304] 0
2026-03-10T06:22:53.174 INFO:tasks.workunit.client.0.vm04.stdout:1/325: symlink d0/d3/d41/l79 0
2026-03-10T06:22:53.178 INFO:tasks.workunit.client.0.vm04.stdout:1/326: dwrite d0/d3/f19 [0,4194304] 0
2026-03-10T06:22:53.197 INFO:tasks.workunit.client.0.vm04.stdout:4/274: write d2/f4 [4385035,115672] 0
2026-03-10T06:22:53.199 INFO:tasks.workunit.client.0.vm04.stdout:8/339: creat df/d20/d25/d30/f6b x:0 0 0
2026-03-10T06:22:53.204 INFO:tasks.workunit.client.0.vm04.stdout:8/340: dwrite df/d15/f69 [0,4194304] 0
2026-03-10T06:22:53.214 INFO:tasks.workunit.client.0.vm04.stdout:7/298: symlink d4/df/d12/d13/d25/d28/d3a/d58/d68/l6d 0
2026-03-10T06:22:53.221 INFO:tasks.workunit.client.0.vm04.stdout:1/327: mkdir d0/d8/d46/d7a 0
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:4/275: rmdir d2/d8 39
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:3/368: getdents d4/da/df/d11/d50/d6c 0
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:8/341: creat df/d53/f6c x:0 0 0
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:8/342: readlink df/d20/d25/d30/l47 0
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:6/326: link d2/d43/d2d/d30/d1f/d3c/c1e d2/d3a/c6f 0
2026-03-10T06:22:53.234 INFO:tasks.workunit.client.0.vm04.stdout:6/327: read d2/d43/d2d/d51/f59 [205239,116771] 0
2026-03-10T06:22:53.243 INFO:tasks.workunit.client.0.vm04.stdout:1/328: symlink d0/d3/d41/d4b/l7b 0
2026-03-10T06:22:53.246 INFO:tasks.workunit.client.0.vm04.stdout:4/276: rmdir d2/d46 39
2026-03-10T06:22:53.246 INFO:tasks.workunit.client.0.vm04.stdout:6/328: rename d2/d8/f9 to d2/d37/d6e/f70 0
2026-03-10T06:22:53.246 INFO:tasks.workunit.client.0.vm04.stdout:6/329: stat d2/d43/d2d/d51/f59 0
2026-03-10T06:22:53.248 INFO:tasks.workunit.client.0.vm04.stdout:1/329: creat d0/f7c x:0 0 0
2026-03-10T06:22:53.249 INFO:tasks.workunit.client.0.vm04.stdout:8/343: fsync df/d15/d29/f3c 0
2026-03-10T06:22:53.251 INFO:tasks.workunit.client.0.vm04.stdout:4/277: mkdir d2/d16/d56 0
2026-03-10T06:22:53.254 INFO:tasks.workunit.client.0.vm04.stdout:6/330: mknod d2/d8/c71 0
2026-03-10T06:22:53.257 INFO:tasks.workunit.client.0.vm04.stdout:6/331: dread d2/d43/f3b [0,4194304] 0
2026-03-10T06:22:53.257 INFO:tasks.workunit.client.0.vm04.stdout:6/332: dread - d2/d43/f69 zero size
2026-03-10T06:22:53.258 INFO:tasks.workunit.client.0.vm04.stdout:6/333: stat d2/d3a 0
2026-03-10T06:22:53.260 INFO:tasks.workunit.client.0.vm04.stdout:4/278: mknod d2/d16/d31/c57 0
2026-03-10T06:22:53.261 INFO:tasks.workunit.client.0.vm04.stdout:6/334: fsync d2/d43/d2d/d30/f5a 0
2026-03-10T06:22:53.262 INFO:tasks.workunit.client.0.vm04.stdout:6/335: readlink d2/d43/d2d/d30/l33 0
2026-03-10T06:22:53.263 INFO:tasks.workunit.client.0.vm04.stdout:6/336: dread - d2/d3a/d5e/f64 zero size
2026-03-10T06:22:53.263 INFO:tasks.workunit.client.0.vm04.stdout:8/344: dread f9 [0,4194304] 0
2026-03-10T06:22:53.264 INFO:tasks.workunit.client.0.vm04.stdout:6/337: write d2/d43/f69 [258208,14283] 0
2026-03-10T06:22:53.269 INFO:tasks.workunit.client.0.vm04.stdout:6/338: dread d2/d43/d2d/d30/d34/f4d [0,4194304] 0
2026-03-10T06:22:53.270 INFO:tasks.workunit.client.0.vm04.stdout:6/339: write d2/d43/d2d/d30/d1f/f3f [702112,42198] 0
2026-03-10T06:22:53.271 INFO:tasks.workunit.client.0.vm04.stdout:8/345: symlink df/d53/l6d 0
2026-03-10T06:22:53.273 INFO:tasks.workunit.client.0.vm04.stdout:4/279: mknod d2/d16/d56/c58 0
2026-03-10T06:22:53.274 INFO:tasks.workunit.client.0.vm04.stdout:4/280: write d2/f12 [1117794,77712] 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:6/340: fsync d2/d43/d2d/d30/f32 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:8/346: truncate df/f12 914702 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:8/347: readlink df/d20/d25/d30/l38 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:4/281: fsync d2/d46/f3d 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:6/341: write d2/d43/d2d/d51/f59 [2728775,49315] 0
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:6/342: rmdir d2/d3a/d5e 39
2026-03-10T06:22:53.286 INFO:tasks.workunit.client.0.vm04.stdout:6/343: readlink d2/d43/l1c 0
2026-03-10T06:22:53.287 INFO:tasks.workunit.client.0.vm04.stdout:7/299: sync
2026-03-10T06:22:53.287 INFO:tasks.workunit.client.0.vm04.stdout:3/369: sync
2026-03-10T06:22:53.288 INFO:tasks.workunit.client.0.vm04.stdout:3/370: chown d4/f2d 3 1
2026-03-10T06:22:53.290 INFO:tasks.workunit.client.0.vm04.stdout:7/300: fsync d4/df/d12/d13/d25/d28/d36/f41 0
2026-03-10T06:22:53.292 INFO:tasks.workunit.client.0.vm04.stdout:4/282: symlink d2/d8/d40/d4f/d51/l59 0
2026-03-10T06:22:53.294 INFO:tasks.workunit.client.0.vm04.stdout:4/283: symlink d2/d8/l5a 0
2026-03-10T06:22:53.294 INFO:tasks.workunit.client.0.vm04.stdout:4/284: chown d2/d16/d2c 578 1
2026-03-10T06:22:53.297 INFO:tasks.workunit.client.0.vm04.stdout:4/285: mknod d2/d46/c5b 0
2026-03-10T06:22:53.303 INFO:tasks.workunit.client.0.vm04.stdout:9/328: write d2/f1c [186123,44597] 0
2026-03-10T06:22:53.308 INFO:tasks.workunit.client.0.vm04.stdout:9/329: mknod d2/d8/d14/d1d/d64/d73/c77 0
2026-03-10T06:22:53.308 INFO:tasks.workunit.client.0.vm04.stdout:2/335: truncate d1/db/fe 4010329 0
2026-03-10T06:22:53.309 INFO:tasks.workunit.client.0.vm04.stdout:9/330: creat d2/d8/d14/d1d/f78 x:0 0 0
2026-03-10T06:22:53.315 INFO:tasks.workunit.client.0.vm04.stdout:9/331: dwrite d2/d3/d18/d39/d11/f2d [0,4194304] 0
2026-03-10T06:22:53.318 INFO:tasks.workunit.client.0.vm04.stdout:3/371: sync
2026-03-10T06:22:53.318 INFO:tasks.workunit.client.0.vm04.stdout:7/301: sync
2026-03-10T06:22:53.319 INFO:tasks.workunit.client.0.vm04.stdout:4/286: dread d2/f14 [0,4194304] 0
2026-03-10T06:22:53.319 INFO:tasks.workunit.client.0.vm04.stdout:7/302: chown d4/df/d12/d21/l54 8 1
2026-03-10T06:22:53.323 INFO:tasks.workunit.client.0.vm04.stdout:2/336: symlink d1/df/d2c/d37/d40/l66 0
2026-03-10T06:22:53.323 INFO:tasks.workunit.client.0.vm04.stdout:2/337: stat d1/df 0
2026-03-10T06:22:53.330 INFO:tasks.workunit.client.0.vm04.stdout:9/332: readlink d2/d8/d14/l72 0
2026-03-10T06:22:53.332 INFO:tasks.workunit.client.0.vm04.stdout:3/372: symlink d4/da/df/d11/d50/d6c/l7a 0
2026-03-10T06:22:53.333 INFO:tasks.workunit.client.0.vm04.stdout:3/373: truncate f1 9096139 0
2026-03-10T06:22:53.337 INFO:tasks.workunit.client.0.vm04.stdout:5/289: dwrite d4/d11/d2a/f31 [0,4194304] 0
2026-03-10T06:22:53.339 INFO:tasks.workunit.client.0.vm04.stdout:2/338: mkdir d1/df/d11/d18/d48/d67 0
2026-03-10T06:22:53.344 INFO:tasks.workunit.client.0.vm04.stdout:9/333: chown d2/l38 3 1
2026-03-10T06:22:53.348 INFO:tasks.workunit.client.0.vm04.stdout:5/290: fsync d4/d11/f1f 0
2026-03-10T06:22:53.353 INFO:tasks.workunit.client.0.vm04.stdout:9/334: symlink d2/d8/d3a/d60/l79 0
2026-03-10T06:22:53.365 INFO:tasks.workunit.client.0.vm04.stdout:2/339: dwrite d1/df/f63 [0,4194304] 0
2026-03-10T06:22:53.365 INFO:tasks.workunit.client.0.vm04.stdout:7/303: link d4/c1d d4/df/d12/d21/c6e 0
2026-03-10T06:22:53.365 INFO:tasks.workunit.client.0.vm04.stdout:5/291: creat d4/d6/d48/d55/f68 x:0 0 0
2026-03-10T06:22:53.372 INFO:tasks.workunit.client.0.vm04.stdout:2/340: dread d1/df/d2c/f44 [0,4194304] 0
2026-03-10T06:22:53.373 INFO:tasks.workunit.client.0.vm04.stdout:7/304: mknod d4/df/d12/d13/d25/d28/d3a/d58/d68/c6f 0
2026-03-10T06:22:53.375 INFO:tasks.workunit.client.0.vm04.stdout:0/310: truncate d0/d1a/f27 2108235 0
2026-03-10T06:22:53.376 INFO:tasks.workunit.client.0.vm04.stdout:9/335: symlink d2/d8/d14/d6c/l7a 0
2026-03-10T06:22:53.384 INFO:tasks.workunit.client.0.vm04.stdout:9/336: mknod d2/d8/d14/d1d/d64/c7b 0
2026-03-10T06:22:53.385 INFO:tasks.workunit.client.0.vm04.stdout:7/305: mknod d4/df/d12/c70 0
2026-03-10T06:22:53.386 INFO:tasks.workunit.client.0.vm04.stdout:7/306: fsync d4/df/d12/d13/d25/d28/d36/f41 0
2026-03-10T06:22:53.387 INFO:tasks.workunit.client.0.vm04.stdout:5/292: creat d4/f69 x:0 0 0
2026-03-10T06:22:53.388 INFO:tasks.workunit.client.0.vm04.stdout:7/307: mknod d4/df/d12/d13/d25/d28/d3a/d58/c71 0
2026-03-10T06:22:53.390 INFO:tasks.workunit.client.0.vm04.stdout:7/308: mkdir d4/df/d12/d13/d25/d72 0
2026-03-10T06:22:53.393 INFO:tasks.workunit.client.0.vm04.stdout:7/309: dwrite d4/df/d12/d13/d25/d30/d40/f52 [0,4194304] 0
2026-03-10T06:22:53.396 INFO:tasks.workunit.client.0.vm04.stdout:5/293: unlink d4/c2e 0
2026-03-10T06:22:53.428 INFO:tasks.workunit.client.0.vm04.stdout:5/294: dread - d4/d6/d48/f5e zero size
2026-03-10T06:22:53.429 INFO:tasks.workunit.client.0.vm04.stdout:5/295: dwrite d4/ff [0,4194304] 0
2026-03-10T06:22:53.429 INFO:tasks.workunit.client.0.vm04.stdout:5/296: symlink d4/d6/l6a 0
2026-03-10T06:22:53.429 INFO:tasks.workunit.client.0.vm04.stdout:5/297: dwrite d4/d11/f2f [0,4194304] 0
2026-03-10T06:22:53.429 INFO:tasks.workunit.client.0.vm04.stdout:1/330: write d0/d8/f11 [1014012,4202] 0
2026-03-10T06:22:53.431 INFO:tasks.workunit.client.0.vm04.stdout:7/310: dread d4/df/d12/d21/f2a [0,4194304] 0
2026-03-10T06:22:53.431 INFO:tasks.workunit.client.0.vm04.stdout:7/311: readlink d4/df/d12/l3e 0
2026-03-10T06:22:53.432 INFO:tasks.workunit.client.0.vm04.stdout:7/312: write d4/df/d12/d13/f27 [4685170,87380] 0
2026-03-10T06:22:53.434 INFO:tasks.workunit.client.0.vm04.stdout:5/298: write d4/f21 [4428451,41593] 0
2026-03-10T06:22:53.439 INFO:tasks.workunit.client.0.vm04.stdout:6/344: truncate d2/d43/d2d/d30/d34/f4d 1967508 0
2026-03-10T06:22:53.447 INFO:tasks.workunit.client.0.vm04.stdout:7/313: creat d4/df/d12/d13/d25/d28/d3a/f73 x:0 0 0
2026-03-10T06:22:53.447 INFO:tasks.workunit.client.0.vm04.stdout:7/314: chown d4/df/d12 20 1
2026-03-10T06:22:53.447 INFO:tasks.workunit.client.0.vm04.stdout:7/315: chown d4/df/d12/d13/d25/d28/d3a/d58/d68/l6d 49093273 1
2026-03-10T06:22:53.447 INFO:tasks.workunit.client.0.vm04.stdout:8/348: truncate df/d15/d2b/f33 3798993 0
2026-03-10T06:22:53.447 INFO:tasks.workunit.client.0.vm04.stdout:8/349: chown df/d20/d25/d30/d55/l5a 184 1
2026-03-10T06:22:53.450 INFO:tasks.workunit.client.0.vm04.stdout:8/350: dwrite df/f17 [0,4194304] 0
2026-03-10T06:22:53.452 INFO:tasks.workunit.client.0.vm04.stdout:6/345: symlink d2/d3a/l72 0
2026-03-10T06:22:53.453 INFO:tasks.workunit.client.0.vm04.stdout:7/316: link d4/df/d12/d21/c44 d4/df/d12/d21/c74 0
2026-03-10T06:22:53.456 INFO:tasks.workunit.client.0.vm04.stdout:7/317: creat d4/df/d12/d13/d25/d30/d40/f75 x:0 0 0
2026-03-10T06:22:53.475 INFO:tasks.workunit.client.0.vm04.stdout:8/351: creat df/f6e x:0 0 0
2026-03-10T06:22:53.475 INFO:tasks.workunit.client.0.vm04.stdout:4/287: rename d2/d8/d40 to d2/d32/d5c 0
2026-03-10T06:22:53.475 INFO:tasks.workunit.client.0.vm04.stdout:4/288: write d2/d46/f3d [733469,26568] 0
2026-03-10T06:22:53.480 INFO:tasks.workunit.client.0.vm04.stdout:9/337: rename d2/d3/d18/d39/d46/f4b to d2/d8/d5d/f7c 0
2026-03-10T06:22:53.485 INFO:tasks.workunit.client.0.vm04.stdout:9/338: dwrite d2/d23/d24/f29 [4194304,4194304] 0
2026-03-10T06:22:53.487 INFO:tasks.workunit.client.0.vm04.stdout:9/339: write d2/d3/d18/d39/fd [3812967,95015] 0
2026-03-10T06:22:53.499 INFO:tasks.workunit.client.0.vm04.stdout:3/374: dwrite d4/da/df/d11/d50/d6c/d21/d32/d39/d64/f75 [0,4194304] 0
2026-03-10T06:22:53.518 INFO:tasks.workunit.client.0.vm04.stdout:4/289: link d2/d8/f35 d2/d46/f5d 0
2026-03-10T06:22:53.520 INFO:tasks.workunit.client.0.vm04.stdout:8/352: sync
2026-03-10T06:22:53.520 INFO:tasks.workunit.client.0.vm04.stdout:8/353: write df/f1d [1339969,60] 0
2026-03-10T06:22:53.526 INFO:tasks.workunit.client.0.vm04.stdout:5/299: rename d4/d11/l42 to d4/d6/d37/l6b 0
2026-03-10T06:22:53.534 INFO:tasks.workunit.client.0.vm04.stdout:4/290: rmdir d2/d32/d5c/d4f/d51 39
2026-03-10T06:22:53.534 INFO:tasks.workunit.client.0.vm04.stdout:4/291: write d2/d16/d31/f4e [248987,68142] 0
2026-03-10T06:22:53.541 INFO:tasks.workunit.client.0.vm04.stdout:3/375: rename d4/da/df/d11/d50/d6c to d4/da/df/d11/d4a/d7b 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:4/292: write d2/f47 [2275528,101230] 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:0/311: write d0/f17 [2991543,10081] 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:3/376: creat d4/da/df/d11/d4a/d7b/d21/d2c/f7c x:0 0 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:4/293: rename d2 to d2/d32/d5c/d4f/d51/d5e 22
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:0/312: chown d0/d1a/d20/d38/f52 110977 1
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:2/341: dwrite d1/f10 [0,4194304] 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:0/313: truncate d0/d1a/f2f 4725949 0
2026-03-10T06:22:53.553 INFO:tasks.workunit.client.0.vm04.stdout:9/340: getdents d2/d8 0
2026-03-10T06:22:53.554 INFO:tasks.workunit.client.0.vm04.stdout:2/342: dread - d1/df/d2c/d37/f52 zero size
2026-03-10T06:22:53.554 INFO:tasks.workunit.client.0.vm04.stdout:9/341: write d2/d8/d3a/f6b [998932,41917] 0
2026-03-10T06:22:53.554 INFO:tasks.workunit.client.0.vm04.stdout:2/343: stat d1/l46 0
2026-03-10T06:22:53.554 INFO:tasks.workunit.client.0.vm04.stdout:8/354: creat df/d15/d29/f6f x:0 0 0
2026-03-10T06:22:53.555 INFO:tasks.workunit.client.0.vm04.stdout:9/342: readlink d2/d3/l45 0
2026-03-10T06:22:53.557 INFO:tasks.workunit.client.0.vm04.stdout:2/344: write d1/df/d11/d18/f53 [1230174,48577] 0
2026-03-10T06:22:53.557 INFO:tasks.workunit.client.0.vm04.stdout:2/345: chown d1/df/d11/d18/f25 437 1
2026-03-10T06:22:53.559 INFO:tasks.workunit.client.0.vm04.stdout:4/294: mknod d2/d32/d5c/c5f 0
2026-03-10T06:22:53.563 INFO:tasks.workunit.client.0.vm04.stdout:3/377: rename d4/f7 to d4/da/df/d11/d4a/d7b/d21/d32/d39/d64/f7d 0
2026-03-10T06:22:53.566 INFO:tasks.workunit.client.0.vm04.stdout:4/295: dread d2/d16/d31/d3f/f43 [0,4194304] 0
2026-03-10T06:22:53.566 INFO:tasks.workunit.client.0.vm04.stdout:0/314: symlink d0/d1a/d20/d38/d31/d47/l64 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:0/315: write d0/d5/d25/f5f [189000,73304] 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:9/343: rename d2/f1c to d2/d8/d53/d6e/f7d 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:9/344: dread d2/d8/d14/d1d/f6a [0,4194304] 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:2/346: rename d1/l3c to d1/df/d2c/l68 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:3/378: mknod d4/da/df/d11/d50/c7e 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:0/316: rename d0/l1 to d0/d1a/d4d/l65 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:0/317: stat d0/d5/d25/dd/d1d/d59/c49 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:3/379: mkdir d4/da/df/d11/d4a/d7b/d21/d32/d4e/d7f 0
2026-03-10T06:22:53.589 INFO:tasks.workunit.client.0.vm04.stdout:4/296: creat d2/d32/d5c/d4f/f60 x:0 0 0
2026-03-10T06:22:53.595 INFO:tasks.workunit.client.0.vm04.stdout:8/355: dread df/d15/d2b/f4a [4194304,4194304] 0
2026-03-10T06:22:53.602 INFO:tasks.workunit.client.0.vm04.stdout:9/345: rename d2/c16 to d2/d8/d14/d1d/d64/d73/c7e 0
2026-03-10T06:22:53.602 INFO:tasks.workunit.client.0.vm04.stdout:4/297: creat d2/d46/f61 x:0 0 0
2026-03-10T06:22:53.603 INFO:tasks.workunit.client.0.vm04.stdout:3/380: rename d4/da/df/d11/d50/f68 to d4/da/df/d11/d4a/f80 0
2026-03-10T06:22:53.605 INFO:tasks.workunit.client.0.vm04.stdout:9/346: symlink d2/d3/d18/d39/l7f 0
2026-03-10T06:22:53.606 INFO:tasks.workunit.client.0.vm04.stdout:4/298: mkdir d2/d16/d31/d3f/d62 0
2026-03-10T06:22:53.610 INFO:tasks.workunit.client.0.vm04.stdout:4/299: chown d2/d16/d2c/l3e 0 1
2026-03-10T06:22:53.611 INFO:tasks.workunit.client.0.vm04.stdout:4/300: chown d2/d32/d5c/d4f/d51/l59 12071116 1
2026-03-10T06:22:53.619 INFO:tasks.workunit.client.0.vm04.stdout:1/331: dwrite d0/d3/f44 [0,4194304] 0
2026-03-10T06:22:53.626 INFO:tasks.workunit.client.0.vm04.stdout:0/318: dread d0/d5/fb [0,4194304] 0
2026-03-10T06:22:53.633 INFO:tasks.workunit.client.0.vm04.stdout:6/346: dwrite d2/d43/f24 [0,4194304] 0
2026-03-10T06:22:53.638 INFO:tasks.workunit.client.0.vm04.stdout:4/301: rename d2/d16/d31/d42/l44 to d2/d8/l63 0
2026-03-10T06:22:53.639 INFO:tasks.workunit.client.0.vm04.stdout:9/347: dread f0 [0,4194304] 0
2026-03-10T06:22:53.644 INFO:tasks.workunit.client.0.vm04.stdout:1/332: mknod d0/c7d 0
2026-03-10T06:22:53.644 INFO:tasks.workunit.client.0.vm04.stdout:0/319: creat d0/d1a/f66 x:0 0 0
2026-03-10T06:22:53.648 INFO:tasks.workunit.client.0.vm04.stdout:2/347: chown d1/db/fe 17 1
2026-03-10T06:22:53.652 INFO:tasks.workunit.client.0.vm04.stdout:2/348: write d1/df/d2c/d37/d40/f64 [773854,41977] 0
2026-03-10T06:22:53.652 INFO:tasks.workunit.client.0.vm04.stdout:6/347: rmdir d2/d43/d2d/d30/d1f 39
2026-03-10T06:22:53.653 INFO:tasks.workunit.client.0.vm04.stdout:7/318: rmdir d4/df 39
2026-03-10T06:22:53.656 INFO:tasks.workunit.client.0.vm04.stdout:2/349: dread d1/db/f1e [4194304,4194304] 0
2026-03-10T06:22:53.676 INFO:tasks.workunit.client.0.vm04.stdout:4/302: dread d2/d16/d31/f4e [0,4194304] 0
2026-03-10T06:22:53.677 INFO:tasks.workunit.client.0.vm04.stdout:4/303: chown d2/d16/d31/c57 3170088 1
2026-03-10T06:22:53.711 INFO:tasks.workunit.client.0.vm04.stdout:9/348: mknod d2/d3/d18/d39/c80 0
2026-03-10T06:22:53.713 INFO:tasks.workunit.client.0.vm04.stdout:9/349: write d2/d8/d14/d1d/f78 [839446,128804] 0
2026-03-10T06:22:53.719 INFO:tasks.workunit.client.0.vm04.stdout:1/333: symlink d0/d3/d41/d4b/d5b/l7e 0
2026-03-10T06:22:53.744 INFO:tasks.workunit.client.0.vm04.stdout:4/304: creat d2/d16/d31/d3f/f64 x:0 0 0
2026-03-10T06:22:53.751 INFO:tasks.workunit.client.0.vm04.stdout:1/334: truncate d0/d3/f50 780724 0
2026-03-10T06:22:53.754
INFO:tasks.workunit.client.0.vm04.stdout:5/300: truncate d4/d11/d2a/f36 4706724 0 2026-03-10T06:22:53.754 INFO:tasks.workunit.client.0.vm04.stdout:0/320: symlink d0/d5/d25/dd/d1d/d59/d63/l67 0 2026-03-10T06:22:53.759 INFO:tasks.workunit.client.0.vm04.stdout:6/348: symlink d2/d43/d2d/d30/d1f/d3c/l73 0 2026-03-10T06:22:53.766 INFO:tasks.workunit.client.0.vm04.stdout:7/319: mknod d4/df/d12/d34/d63/c76 0 2026-03-10T06:22:53.782 INFO:tasks.workunit.client.0.vm04.stdout:2/350: mkdir d1/db/d69 0 2026-03-10T06:22:53.786 INFO:tasks.workunit.client.0.vm04.stdout:9/350: mknod d2/d3/d18/d39/d46/d55/c81 0 2026-03-10T06:22:53.788 INFO:tasks.workunit.client.0.vm04.stdout:6/349: sync 2026-03-10T06:22:53.800 INFO:tasks.workunit.client.0.vm04.stdout:1/335: mknod d0/d3/d41/d4b/c7f 0 2026-03-10T06:22:53.805 INFO:tasks.workunit.client.0.vm04.stdout:8/356: truncate df/f17 3383549 0 2026-03-10T06:22:53.820 INFO:tasks.workunit.client.0.vm04.stdout:0/321: symlink d0/d1a/d3f/l68 0 2026-03-10T06:22:53.822 INFO:tasks.workunit.client.0.vm04.stdout:2/351: mkdir d1/df/d11/d14/d6a 0 2026-03-10T06:22:53.824 INFO:tasks.workunit.client.0.vm04.stdout:0/322: truncate d0/d1a/d20/d38/d31/d47/f54 529174 0 2026-03-10T06:22:53.826 INFO:tasks.workunit.client.0.vm04.stdout:3/381: truncate d4/d6/f30 7897506 0 2026-03-10T06:22:53.826 INFO:tasks.workunit.client.0.vm04.stdout:5/301: dread d4/d11/f1f [0,4194304] 0 2026-03-10T06:22:53.827 INFO:tasks.workunit.client.0.vm04.stdout:9/351: symlink d2/d8/d22/d4f/l82 0 2026-03-10T06:22:53.827 INFO:tasks.workunit.client.0.vm04.stdout:6/350: symlink d2/d43/d2d/d51/l74 0 2026-03-10T06:22:53.834 INFO:tasks.workunit.client.0.vm04.stdout:8/357: mkdir df/d20/d25/d30/d70 0 2026-03-10T06:22:53.834 INFO:tasks.workunit.client.0.vm04.stdout:1/336: fdatasync d0/d3/f33 0 2026-03-10T06:22:53.836 INFO:tasks.workunit.client.0.vm04.stdout:1/337: write d0/f64 [99831,111056] 0 2026-03-10T06:22:53.837 INFO:tasks.workunit.client.0.vm04.stdout:2/352: readlink d1/df/d2c/l68 0 
2026-03-10T06:22:53.843 INFO:tasks.workunit.client.0.vm04.stdout:4/305: rmdir d2/d16/d31/d3f/d62 0 2026-03-10T06:22:53.843 INFO:tasks.workunit.client.0.vm04.stdout:4/306: readlink d2/d16/d31/d3f/l48 0 2026-03-10T06:22:53.849 INFO:tasks.workunit.client.0.vm04.stdout:9/352: creat d2/d23/d24/f83 x:0 0 0 2026-03-10T06:22:53.850 INFO:tasks.workunit.client.0.vm04.stdout:9/353: chown d2/d3/l45 2050609 1 2026-03-10T06:22:53.850 INFO:tasks.workunit.client.0.vm04.stdout:0/323: dread d0/d5/d25/dd/d1d/f26 [0,4194304] 0 2026-03-10T06:22:53.866 INFO:tasks.workunit.client.0.vm04.stdout:8/358: unlink df/c1a 0 2026-03-10T06:22:53.866 INFO:tasks.workunit.client.0.vm04.stdout:5/302: truncate d4/d3b/f41 1163418 0 2026-03-10T06:22:53.867 INFO:tasks.workunit.client.0.vm04.stdout:8/359: dread - df/d15/d2b/f4c zero size 2026-03-10T06:22:53.874 INFO:tasks.workunit.client.0.vm04.stdout:7/320: dwrite d4/df/d12/f20 [0,4194304] 0 2026-03-10T06:22:53.874 INFO:tasks.workunit.client.0.vm04.stdout:0/324: dwrite d0/f44 [0,4194304] 0 2026-03-10T06:22:53.876 INFO:tasks.workunit.client.0.vm04.stdout:6/351: rename d2/d43/d2d/d51 to d2/d43/d2d/d30/d1f/d3c/d75 0 2026-03-10T06:22:53.876 INFO:tasks.workunit.client.0.vm04.stdout:1/338: rmdir d0/d8/d46 39 2026-03-10T06:22:53.884 INFO:tasks.workunit.client.0.vm04.stdout:0/325: chown d0/d1a/f2f 11 1 2026-03-10T06:22:53.891 INFO:tasks.workunit.client.0.vm04.stdout:0/326: dwrite d0/d5/d25/dd/d3a/f60 [0,4194304] 0 2026-03-10T06:22:53.900 INFO:tasks.workunit.client.0.vm04.stdout:3/382: getdents d4/da/df/d11/d4a/d7b/d21/d2c/d79 0 2026-03-10T06:22:53.900 INFO:tasks.workunit.client.0.vm04.stdout:4/307: fdatasync d2/d46/f5d 0 2026-03-10T06:22:53.908 INFO:tasks.workunit.client.0.vm04.stdout:5/303: symlink d4/d11/d2a/d52/l6c 0 2026-03-10T06:22:53.909 INFO:tasks.workunit.client.0.vm04.stdout:5/304: chown d4/d6/d48 514000 1 2026-03-10T06:22:53.914 INFO:tasks.workunit.client.0.vm04.stdout:6/352: dwrite d2/d43/d2d/d30/f4a [0,4194304] 0 2026-03-10T06:22:53.914 
INFO:tasks.workunit.client.0.vm04.stdout:6/353: write d2/d43/d2d/d30/f60 [863008,7921] 0 2026-03-10T06:22:53.916 INFO:tasks.workunit.client.0.vm04.stdout:7/321: creat d4/df/d12/d13/d25/d28/d3a/d58/f77 x:0 0 0 2026-03-10T06:22:53.917 INFO:tasks.workunit.client.0.vm04.stdout:2/353: mkdir d1/db/d6b 0 2026-03-10T06:22:53.917 INFO:tasks.workunit.client.0.vm04.stdout:0/327: unlink d0/d1a/f2f 0 2026-03-10T06:22:53.917 INFO:tasks.workunit.client.0.vm04.stdout:0/328: chown d0/d5/d25 1 1 2026-03-10T06:22:53.918 INFO:tasks.workunit.client.0.vm04.stdout:0/329: write d0/d5/d25/f5f [71743,123586] 0 2026-03-10T06:22:53.922 INFO:tasks.workunit.client.0.vm04.stdout:4/308: creat d2/d32/d5c/d4f/d51/f65 x:0 0 0 2026-03-10T06:22:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:53 vm04.local ceph-mon[51058]: Deploying daemon prometheus.vm04 on vm04 2026-03-10T06:22:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:53 vm04.local ceph-mon[51058]: pgmap v25: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 11 MiB/s rd, 77 MiB/s wr, 191 op/s 2026-03-10T06:22:53.941 INFO:tasks.workunit.client.0.vm04.stdout:3/383: dread d4/da/df/d11/f57 [0,4194304] 0 2026-03-10T06:22:53.943 INFO:tasks.workunit.client.0.vm04.stdout:6/354: mkdir d2/d43/d2d/d30/d34/d76 0 2026-03-10T06:22:53.943 INFO:tasks.workunit.client.0.vm04.stdout:5/305: creat d4/d3b/f6d x:0 0 0 2026-03-10T06:22:53.944 INFO:tasks.workunit.client.0.vm04.stdout:1/339: mkdir d0/d3/d80 0 2026-03-10T06:22:53.953 INFO:tasks.workunit.client.0.vm04.stdout:7/322: creat d4/df/d12/d34/d63/f78 x:0 0 0 2026-03-10T06:22:53.957 INFO:tasks.workunit.client.0.vm04.stdout:0/330: rename d0/d5/d25/dd/d1d/l32 to d0/d5/d25/dd/d1d/d59/l69 0 2026-03-10T06:22:53.961 INFO:tasks.workunit.client.0.vm04.stdout:3/384: dwrite f1 [8388608,4194304] 0 2026-03-10T06:22:53.962 INFO:tasks.workunit.client.0.vm04.stdout:5/306: creat d4/d6/d48/d55/f6e x:0 0 0 2026-03-10T06:22:53.967 
INFO:tasks.workunit.client.0.vm04.stdout:7/323: fdatasync d4/df/d12/d13/d25/d28/d3a/d58/f77 0 2026-03-10T06:22:53.970 INFO:tasks.workunit.client.0.vm04.stdout:6/355: creat d2/d37/d6e/f77 x:0 0 0 2026-03-10T06:22:53.971 INFO:tasks.workunit.client.0.vm04.stdout:2/354: dwrite d1/df/f5a [0,4194304] 0 2026-03-10T06:22:53.983 INFO:tasks.workunit.client.0.vm04.stdout:7/324: dwrite d4/df/d12/d13/d25/d28/d36/f64 [0,4194304] 0 2026-03-10T06:22:53.989 INFO:tasks.workunit.client.0.vm04.stdout:7/325: write d4/df/d12/d13/d25/d30/d40/d50/f62 [167860,91193] 0 2026-03-10T06:22:53.996 INFO:tasks.workunit.client.0.vm04.stdout:7/326: write d4/f51 [2133819,80994] 0 2026-03-10T06:22:53.996 INFO:tasks.workunit.client.0.vm04.stdout:9/354: dwrite d2/f17 [0,4194304] 0 2026-03-10T06:22:53.996 INFO:tasks.workunit.client.0.vm04.stdout:9/355: read d2/f17 [2677846,32536] 0 2026-03-10T06:22:53.996 INFO:tasks.workunit.client.0.vm04.stdout:9/356: fdatasync d2/d8/d14/d1d/f78 0 2026-03-10T06:22:54.004 INFO:tasks.workunit.client.0.vm04.stdout:3/385: fsync d4/d6/f12 0 2026-03-10T06:22:54.005 INFO:tasks.workunit.client.0.vm04.stdout:0/331: creat d0/d5/d25/dd/d3a/f6a x:0 0 0 2026-03-10T06:22:54.021 INFO:tasks.workunit.client.0.vm04.stdout:8/360: write df/d15/f43 [5131161,55289] 0 2026-03-10T06:22:54.021 INFO:tasks.workunit.client.0.vm04.stdout:6/356: dread d2/d43/d2d/d30/f2b [0,4194304] 0 2026-03-10T06:22:54.031 INFO:tasks.workunit.client.0.vm04.stdout:2/355: rename d1/df/d2c/l43 to d1/df/d2c/d37/d40/l6c 0 2026-03-10T06:22:54.046 INFO:tasks.workunit.client.0.vm04.stdout:7/327: mkdir d4/df/d12/d13/d25/d30/d40/d79 0 2026-03-10T06:22:54.062 INFO:tasks.workunit.client.0.vm04.stdout:4/309: write d2/d46/f18 [4708728,40583] 0 2026-03-10T06:22:54.064 INFO:tasks.workunit.client.0.vm04.stdout:9/357: mkdir d2/d3/d18/d39/d46/d84 0 2026-03-10T06:22:54.070 INFO:tasks.workunit.client.0.vm04.stdout:3/386: creat d4/d6/d54/f81 x:0 0 0 2026-03-10T06:22:54.070 INFO:tasks.workunit.client.0.vm04.stdout:5/307: link d4/d11/f1f 
d4/d6/f6f 0 2026-03-10T06:22:54.071 INFO:tasks.workunit.client.0.vm04.stdout:1/340: write d0/d3/d41/d4b/f6b [1697385,5726] 0 2026-03-10T06:22:54.079 INFO:tasks.workunit.client.0.vm04.stdout:5/308: dread - d4/f35 zero size 2026-03-10T06:22:54.079 INFO:tasks.workunit.client.0.vm04.stdout:8/361: dread - df/d15/d29/f3e zero size 2026-03-10T06:22:54.081 INFO:tasks.workunit.client.0.vm04.stdout:0/332: dread d0/f17 [0,4194304] 0 2026-03-10T06:22:54.083 INFO:tasks.workunit.client.0.vm04.stdout:2/356: mkdir d1/df/d2c/d6d 0 2026-03-10T06:22:54.084 INFO:tasks.workunit.client.0.vm04.stdout:2/357: read - d1/df/d11/d14/d4e/f5c zero size 2026-03-10T06:22:54.085 INFO:tasks.workunit.client.0.vm04.stdout:0/333: chown d0/d1a/d20/d38/l42 1856259 1 2026-03-10T06:22:54.086 INFO:tasks.workunit.client.0.vm04.stdout:4/310: creat d2/d16/d31/f66 x:0 0 0 2026-03-10T06:22:54.087 INFO:tasks.workunit.client.0.vm04.stdout:1/341: dwrite d0/d3/f61 [4194304,4194304] 0 2026-03-10T06:22:54.093 INFO:tasks.workunit.client.0.vm04.stdout:9/358: rename d2/d8/d3a/d60/c6d to d2/d3/d18/d39/d11/d42/c85 0 2026-03-10T06:22:54.106 INFO:tasks.workunit.client.0.vm04.stdout:3/387: mknod d4/d6/dc/c82 0 2026-03-10T06:22:54.110 INFO:tasks.workunit.client.0.vm04.stdout:3/388: write d4/da/df/d11/d4a/f76 [76073,92243] 0 2026-03-10T06:22:54.112 INFO:tasks.workunit.client.0.vm04.stdout:3/389: chown d4/da/df/d11/d4a/l52 0 1 2026-03-10T06:22:54.113 INFO:tasks.workunit.client.0.vm04.stdout:4/311: dread d2/d16/d31/f4e [0,4194304] 0 2026-03-10T06:22:54.115 INFO:tasks.workunit.client.0.vm04.stdout:8/362: mknod df/d20/d25/d30/d70/c71 0 2026-03-10T06:22:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:53 vm06.local ceph-mon[58974]: Deploying daemon prometheus.vm04 on vm04 2026-03-10T06:22:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:53 vm06.local ceph-mon[58974]: pgmap v25: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 11 MiB/s rd, 77 MiB/s wr, 191 op/s 
2026-03-10T06:22:54.122 INFO:tasks.workunit.client.0.vm04.stdout:0/334: symlink d0/d1a/d20/l6b 0 2026-03-10T06:22:54.124 INFO:tasks.workunit.client.0.vm04.stdout:7/328: creat d4/f7a x:0 0 0 2026-03-10T06:22:54.125 INFO:tasks.workunit.client.0.vm04.stdout:1/342: creat d0/d3/d80/f81 x:0 0 0 2026-03-10T06:22:54.127 INFO:tasks.workunit.client.0.vm04.stdout:1/343: read d0/d8/f11 [52656,120208] 0 2026-03-10T06:22:54.130 INFO:tasks.workunit.client.0.vm04.stdout:5/309: rename d4/d11/f43 to d4/d11/d2a/d38/d51/f70 0 2026-03-10T06:22:54.132 INFO:tasks.workunit.client.0.vm04.stdout:6/357: write d2/d43/d2d/d30/d1f/d3c/f65 [588293,127963] 0 2026-03-10T06:22:54.133 INFO:tasks.workunit.client.0.vm04.stdout:4/312: rmdir d2/d16 39 2026-03-10T06:22:54.142 INFO:tasks.workunit.client.0.vm04.stdout:9/359: write f0 [1325661,57428] 0 2026-03-10T06:22:54.142 INFO:tasks.workunit.client.0.vm04.stdout:7/329: dwrite d4/df/d12/d13/d25/d28/f39 [4194304,4194304] 0 2026-03-10T06:22:54.151 INFO:tasks.workunit.client.0.vm04.stdout:7/330: write d4/df/d12/f4c [962589,109223] 0 2026-03-10T06:22:54.151 INFO:tasks.workunit.client.0.vm04.stdout:7/331: truncate d4/df/f56 139198 0 2026-03-10T06:22:54.151 INFO:tasks.workunit.client.0.vm04.stdout:7/332: chown d4/df/d12/f18 936 1 2026-03-10T06:22:54.157 INFO:tasks.workunit.client.0.vm04.stdout:6/358: mkdir d2/d8/d78 0 2026-03-10T06:22:54.157 INFO:tasks.workunit.client.0.vm04.stdout:2/358: link d1/df/d11/d18/f25 d1/df/f6e 0 2026-03-10T06:22:54.157 INFO:tasks.workunit.client.0.vm04.stdout:4/313: symlink d2/d32/d5c/l67 0 2026-03-10T06:22:54.169 INFO:tasks.workunit.client.0.vm04.stdout:0/335: mknod d0/d5/d25/dd/d5c/c6c 0 2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:3/390: creat d4/da/df/d11/d4a/d7b/d21/f83 x:0 0 0 2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:7/333: unlink d4/df/d12/d13/d25/d30/l45 0 2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:9/360: mknod d2/d3/d18/d39/d46/d55/c86 0 
2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:9/361: chown d2/d3/d18/d34 443 1 2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:1/344: creat d0/d8/d46/f82 x:0 0 0 2026-03-10T06:22:54.172 INFO:tasks.workunit.client.0.vm04.stdout:1/345: chown d0/d3/f61 1192 1 2026-03-10T06:22:54.174 INFO:tasks.workunit.client.0.vm04.stdout:0/336: truncate d0/d5/d25/f23 472423 0 2026-03-10T06:22:54.178 INFO:tasks.workunit.client.0.vm04.stdout:6/359: read d2/d43/f31 [477695,68163] 0 2026-03-10T06:22:54.179 INFO:tasks.workunit.client.0.vm04.stdout:6/360: chown d2/d3a/f57 22 1 2026-03-10T06:22:54.181 INFO:tasks.workunit.client.0.vm04.stdout:2/359: fsync d1/f2b 0 2026-03-10T06:22:54.181 INFO:tasks.workunit.client.0.vm04.stdout:3/391: symlink d4/da/df/d11/d4a/d7b/d21/d32/d4e/l84 0 2026-03-10T06:22:54.185 INFO:tasks.workunit.client.0.vm04.stdout:5/310: rmdir d4 39 2026-03-10T06:22:54.190 INFO:tasks.workunit.client.0.vm04.stdout:8/363: sync 2026-03-10T06:22:54.190 INFO:tasks.workunit.client.0.vm04.stdout:9/362: sync 2026-03-10T06:22:54.190 INFO:tasks.workunit.client.0.vm04.stdout:4/314: mknod d2/d16/d56/c68 0 2026-03-10T06:22:54.196 INFO:tasks.workunit.client.0.vm04.stdout:8/364: dread df/d15/d29/f3c [0,4194304] 0 2026-03-10T06:22:54.200 INFO:tasks.workunit.client.0.vm04.stdout:2/360: rmdir d1/df/d11/d18/d48 39 2026-03-10T06:22:54.204 INFO:tasks.workunit.client.0.vm04.stdout:1/346: truncate d0/f29 4085586 0 2026-03-10T06:22:54.204 INFO:tasks.workunit.client.0.vm04.stdout:2/361: read - d1/db/d20/f49 zero size 2026-03-10T06:22:54.206 INFO:tasks.workunit.client.0.vm04.stdout:1/347: write d0/d3/f34 [1329633,116991] 0 2026-03-10T06:22:54.211 INFO:tasks.workunit.client.0.vm04.stdout:8/365: dwrite df/d15/f1b [0,4194304] 0 2026-03-10T06:22:54.219 INFO:tasks.workunit.client.0.vm04.stdout:9/363: mkdir d2/d8/d22/d87 0 2026-03-10T06:22:54.224 INFO:tasks.workunit.client.0.vm04.stdout:4/315: rename d2/d16/d31/d3f/l53 to d2/d8/l69 0 2026-03-10T06:22:54.229 
INFO:tasks.workunit.client.0.vm04.stdout:6/361: unlink d2/d3a/c6f 0 2026-03-10T06:22:54.235 INFO:tasks.workunit.client.0.vm04.stdout:2/362: mknod d1/df/d11/d18/d35/c6f 0 2026-03-10T06:22:54.236 INFO:tasks.workunit.client.0.vm04.stdout:2/363: write d1/df/d2c/d37/f52 [633939,105917] 0 2026-03-10T06:22:54.247 INFO:tasks.workunit.client.0.vm04.stdout:3/392: fsync d4/da/df/d11/d4a/d7b/f1d 0 2026-03-10T06:22:54.247 INFO:tasks.workunit.client.0.vm04.stdout:1/348: creat d0/f83 x:0 0 0 2026-03-10T06:22:54.247 INFO:tasks.workunit.client.0.vm04.stdout:8/366: mknod df/d20/d25/d30/d55/c72 0 2026-03-10T06:22:54.250 INFO:tasks.workunit.client.0.vm04.stdout:9/364: creat d2/d3/d18/d39/d46/d55/f88 x:0 0 0 2026-03-10T06:22:54.252 INFO:tasks.workunit.client.0.vm04.stdout:4/316: truncate d2/f14 4443375 0 2026-03-10T06:22:54.253 INFO:tasks.workunit.client.0.vm04.stdout:2/364: symlink d1/db/d20/l70 0 2026-03-10T06:22:54.257 INFO:tasks.workunit.client.0.vm04.stdout:7/334: link d4/df/d12/d13/l5e d4/df/d12/d13/d25/d28/l7b 0 2026-03-10T06:22:54.265 INFO:tasks.workunit.client.0.vm04.stdout:5/311: dread - d4/d11/f32 zero size 2026-03-10T06:22:54.267 INFO:tasks.workunit.client.0.vm04.stdout:9/365: dread d2/d3/d18/d39/d11/f2d [0,4194304] 0 2026-03-10T06:22:54.267 INFO:tasks.workunit.client.0.vm04.stdout:8/367: mkdir df/d20/d25/d73 0 2026-03-10T06:22:54.267 INFO:tasks.workunit.client.0.vm04.stdout:1/349: truncate d0/f1a 440318 0 2026-03-10T06:22:54.267 INFO:tasks.workunit.client.0.vm04.stdout:5/312: dread - d4/d6/d48/d55/f6e zero size 2026-03-10T06:22:54.271 INFO:tasks.workunit.client.0.vm04.stdout:0/337: getdents d0/d5/d25/dd/d3a 0 2026-03-10T06:22:54.271 INFO:tasks.workunit.client.0.vm04.stdout:2/365: symlink d1/df/d2c/d37/l71 0 2026-03-10T06:22:54.272 INFO:tasks.workunit.client.0.vm04.stdout:6/362: creat d2/d8/d78/f79 x:0 0 0 2026-03-10T06:22:54.273 INFO:tasks.workunit.client.0.vm04.stdout:6/363: stat d2/d43/d2d/d30/d1f/d3c 0 2026-03-10T06:22:54.275 
INFO:tasks.workunit.client.0.vm04.stdout:7/335: fsync d4/df/d12/d13/f4a 0 2026-03-10T06:22:54.276 INFO:tasks.workunit.client.0.vm04.stdout:3/393: fsync d4/da/df/d11/d4a/d7b/f27 0 2026-03-10T06:22:54.276 INFO:tasks.workunit.client.0.vm04.stdout:0/338: symlink d0/d1a/d3f/l6d 0 2026-03-10T06:22:54.282 INFO:tasks.workunit.client.0.vm04.stdout:9/366: mkdir d2/d8/d53/d6e/d89 0 2026-03-10T06:22:54.283 INFO:tasks.workunit.client.0.vm04.stdout:4/317: dread d2/d16/d2c/f55 [0,4194304] 0 2026-03-10T06:22:54.284 INFO:tasks.workunit.client.0.vm04.stdout:8/368: mknod df/d20/c74 0 2026-03-10T06:22:54.287 INFO:tasks.workunit.client.0.vm04.stdout:5/313: sync 2026-03-10T06:22:54.288 INFO:tasks.workunit.client.0.vm04.stdout:6/364: sync 2026-03-10T06:22:54.289 INFO:tasks.workunit.client.0.vm04.stdout:6/365: write d2/d43/d2d/d30/f63 [481948,25893] 0 2026-03-10T06:22:54.299 INFO:tasks.workunit.client.0.vm04.stdout:7/336: dwrite d4/df/d12/d13/f1e [4194304,4194304] 0 2026-03-10T06:22:54.300 INFO:tasks.workunit.client.0.vm04.stdout:8/369: dwrite df/d15/d2b/f4d [0,4194304] 0 2026-03-10T06:22:54.304 INFO:tasks.workunit.client.0.vm04.stdout:8/370: readlink df/d15/l3d 0 2026-03-10T06:22:54.307 INFO:tasks.workunit.client.0.vm04.stdout:7/337: fsync d4/df/d12/d13/d25/d30/d40/f52 0 2026-03-10T06:22:54.307 INFO:tasks.workunit.client.0.vm04.stdout:7/338: read d4/df/d12/f20 [4128317,121297] 0 2026-03-10T06:22:54.315 INFO:tasks.workunit.client.0.vm04.stdout:0/339: truncate d0/d5/f3e 2432254 0 2026-03-10T06:22:54.315 INFO:tasks.workunit.client.0.vm04.stdout:0/340: write d0/d1a/f66 [340776,79797] 0 2026-03-10T06:22:54.318 INFO:tasks.workunit.client.0.vm04.stdout:4/318: readlink d2/l9 0 2026-03-10T06:22:54.319 INFO:tasks.workunit.client.0.vm04.stdout:1/350: read d0/d3/f62 [237637,26279] 0 2026-03-10T06:22:54.331 INFO:tasks.workunit.client.0.vm04.stdout:3/394: dwrite d4/f42 [0,4194304] 0 2026-03-10T06:22:54.335 INFO:tasks.workunit.client.0.vm04.stdout:5/314: fsync d4/f19 0 2026-03-10T06:22:54.335 
INFO:tasks.workunit.client.0.vm04.stdout:6/366: dread - d2/d37/f38 zero size 2026-03-10T06:22:54.341 INFO:tasks.workunit.client.0.vm04.stdout:0/341: sync 2026-03-10T06:22:54.345 INFO:tasks.workunit.client.0.vm04.stdout:0/342: write d0/f44 [1021453,94263] 0 2026-03-10T06:22:54.348 INFO:tasks.workunit.client.0.vm04.stdout:0/343: stat d0 0 2026-03-10T06:22:54.348 INFO:tasks.workunit.client.0.vm04.stdout:8/371: write df/d20/f28 [1555787,45371] 0 2026-03-10T06:22:54.348 INFO:tasks.workunit.client.0.vm04.stdout:9/367: dwrite d2/d23/d24/f2b [0,4194304] 0 2026-03-10T06:22:54.348 INFO:tasks.workunit.client.0.vm04.stdout:8/372: chown df/d20/d25/f35 9828540 1 2026-03-10T06:22:54.360 INFO:tasks.workunit.client.0.vm04.stdout:2/366: rename d1/df/d2c/d6d to d1/db/d72 0 2026-03-10T06:22:54.362 INFO:tasks.workunit.client.0.vm04.stdout:3/395: dread d4/f42 [0,4194304] 0 2026-03-10T06:22:54.369 INFO:tasks.workunit.client.0.vm04.stdout:5/315: rename d4/d3b/f54 to d4/d3b/f71 0 2026-03-10T06:22:54.376 INFO:tasks.workunit.client.0.vm04.stdout:3/396: dread d4/da/df/d11/d4a/d7b/d21/d32/d39/d64/f75 [0,4194304] 0 2026-03-10T06:22:54.378 INFO:tasks.workunit.client.0.vm04.stdout:8/373: fdatasync df/d15/d2b/f60 0 2026-03-10T06:22:54.382 INFO:tasks.workunit.client.0.vm04.stdout:7/339: rename d4/df/d12/f20 to d4/df/d12/d13/d25/d28/d3a/d58/d68/f7c 0 2026-03-10T06:22:54.383 INFO:tasks.workunit.client.0.vm04.stdout:5/316: mkdir d4/d6/d48/d55/d72 0 2026-03-10T06:22:54.386 INFO:tasks.workunit.client.0.vm04.stdout:2/367: mknod d1/df/c73 0 2026-03-10T06:22:54.386 INFO:tasks.workunit.client.0.vm04.stdout:3/397: dread d4/da/df/d11/d4a/d7b/d21/f6d [0,4194304] 0 2026-03-10T06:22:54.386 INFO:tasks.workunit.client.0.vm04.stdout:9/368: rename d2/f48 to d2/d8/d14/d6c/f8a 0 2026-03-10T06:22:54.387 INFO:tasks.workunit.client.0.vm04.stdout:8/374: write df/d20/d25/d30/d55/f5b [4142958,85243] 0 2026-03-10T06:22:54.389 INFO:tasks.workunit.client.0.vm04.stdout:0/344: rename d0/d5/d25 to d0/d5/d25/dd/d3a/d56/d6e 22 
2026-03-10T06:22:54.389 INFO:tasks.workunit.client.0.vm04.stdout:0/345: stat d0/d1a/d20/l6b 0 2026-03-10T06:22:54.398 INFO:tasks.workunit.client.0.vm04.stdout:6/367: creat d2/f7a x:0 0 0 2026-03-10T06:22:54.401 INFO:tasks.workunit.client.0.vm04.stdout:3/398: dwrite d4/da/df/d11/d4a/f76 [0,4194304] 0 2026-03-10T06:22:54.403 INFO:tasks.workunit.client.0.vm04.stdout:2/368: rmdir d1/df/d2c/d37 39 2026-03-10T06:22:54.419 INFO:tasks.workunit.client.0.vm04.stdout:9/369: mknod d2/d3/c8b 0 2026-03-10T06:22:54.424 INFO:tasks.workunit.client.0.vm04.stdout:4/319: write d2/d8/f35 [832734,100961] 0 2026-03-10T06:22:54.429 INFO:tasks.workunit.client.0.vm04.stdout:1/351: dwrite d0/d8/f21 [0,4194304] 0 2026-03-10T06:22:54.430 INFO:tasks.workunit.client.0.vm04.stdout:0/346: creat d0/d5/d25/f6f x:0 0 0 2026-03-10T06:22:54.430 INFO:tasks.workunit.client.0.vm04.stdout:1/352: write d0/d8/f21 [2048787,10526] 0 2026-03-10T06:22:54.431 INFO:tasks.workunit.client.0.vm04.stdout:5/317: dread d4/d3b/f41 [0,4194304] 0 2026-03-10T06:22:54.431 INFO:tasks.workunit.client.0.vm04.stdout:0/347: write d0/f44 [4803201,9083] 0 2026-03-10T06:22:54.455 INFO:tasks.workunit.client.0.vm04.stdout:1/353: dread d0/f2e [0,4194304] 0 2026-03-10T06:22:54.455 INFO:tasks.workunit.client.0.vm04.stdout:1/354: write d0/d3/f19 [2569507,5894] 0 2026-03-10T06:22:54.462 INFO:tasks.workunit.client.0.vm04.stdout:8/375: mknod df/d20/c75 0 2026-03-10T06:22:54.473 INFO:tasks.workunit.client.0.vm04.stdout:4/320: unlink d2/d8/c1e 0 2026-03-10T06:22:54.484 INFO:tasks.workunit.client.0.vm04.stdout:6/368: write d2/f10 [1751847,3697] 0 2026-03-10T06:22:54.485 INFO:tasks.workunit.client.0.vm04.stdout:6/369: chown d2/d43/d2d/d30/d34/f6d 329765339 1 2026-03-10T06:22:54.494 INFO:tasks.workunit.client.0.vm04.stdout:5/318: chown d4/d11/d2a/f5b 20765 1 2026-03-10T06:22:54.496 INFO:tasks.workunit.client.0.vm04.stdout:5/319: read d4/f19 [243081,17006] 0 2026-03-10T06:22:54.499 INFO:tasks.workunit.client.0.vm04.stdout:0/348: creat d0/d5/f70 
x:0 0 0 2026-03-10T06:22:54.503 INFO:tasks.workunit.client.0.vm04.stdout:7/340: creat d4/df/d12/d13/d25/d28/f7d x:0 0 0 2026-03-10T06:22:54.504 INFO:tasks.workunit.client.0.vm04.stdout:7/341: dread - d4/df/d12/d13/d25/d28/d3a/f73 zero size 2026-03-10T06:22:54.506 INFO:tasks.workunit.client.0.vm04.stdout:2/369: mkdir d1/db/d69/d74 0 2026-03-10T06:22:54.509 INFO:tasks.workunit.client.0.vm04.stdout:0/349: dread d0/d1a/f3b [0,4194304] 0 2026-03-10T06:22:54.514 INFO:tasks.workunit.client.0.vm04.stdout:8/376: mknod df/d20/d25/d30/d70/c76 0 2026-03-10T06:22:54.515 INFO:tasks.workunit.client.0.vm04.stdout:8/377: dread - df/d20/d25/d30/f6b zero size 2026-03-10T06:22:54.515 INFO:tasks.workunit.client.0.vm04.stdout:8/378: write df/d15/f69 [82816,95531] 0 2026-03-10T06:22:54.516 INFO:tasks.workunit.client.0.vm04.stdout:8/379: write df/d15/f69 [1177067,32015] 0 2026-03-10T06:22:54.517 INFO:tasks.workunit.client.0.vm04.stdout:8/380: chown df/d15/d2b/f2f 9817 1 2026-03-10T06:22:54.523 INFO:tasks.workunit.client.0.vm04.stdout:5/320: sync 2026-03-10T06:22:54.527 INFO:tasks.workunit.client.0.vm04.stdout:6/370: unlink d2/f7a 0 2026-03-10T06:22:54.530 INFO:tasks.workunit.client.0.vm04.stdout:6/371: write d2/d3a/f56 [935859,102013] 0 2026-03-10T06:22:54.534 INFO:tasks.workunit.client.0.vm04.stdout:2/370: rename d1/db/d6b to d1/db/d20/d75 0 2026-03-10T06:22:54.534 INFO:tasks.workunit.client.0.vm04.stdout:2/371: chown d1/df/d11/d14/c1a 718 1 2026-03-10T06:22:54.540 INFO:tasks.workunit.client.0.vm04.stdout:9/370: link d2/d23/c59 d2/d8/d14/d6c/c8c 0 2026-03-10T06:22:54.540 INFO:tasks.workunit.client.0.vm04.stdout:1/355: creat d0/d8/d46/d7a/f84 x:0 0 0 2026-03-10T06:22:54.541 INFO:tasks.workunit.client.0.vm04.stdout:9/371: readlink d2/d8/d14/l67 0 2026-03-10T06:22:54.551 INFO:tasks.workunit.client.0.vm04.stdout:8/381: unlink f6 0 2026-03-10T06:22:54.565 INFO:tasks.workunit.client.0.vm04.stdout:0/350: truncate d0/d5/d25/dd/f13 5467423 0 2026-03-10T06:22:54.565 
INFO:tasks.workunit.client.0.vm04.stdout:3/399: getdents d4/da/df/d11/d4a/d7b/d21/d32/d4e 0 2026-03-10T06:22:54.577 INFO:tasks.workunit.client.0.vm04.stdout:9/372: rmdir d2/d3/d18 39 2026-03-10T06:22:54.578 INFO:tasks.workunit.client.0.vm04.stdout:1/356: symlink d0/d3/d80/l85 0 2026-03-10T06:22:54.581 INFO:tasks.workunit.client.0.vm04.stdout:7/342: rmdir d4/df/d12/d13/d25/d72 0 2026-03-10T06:22:54.582 INFO:tasks.workunit.client.0.vm04.stdout:7/343: read d4/df/f56 [47737,18872] 0 2026-03-10T06:22:54.583 INFO:tasks.workunit.client.0.vm04.stdout:7/344: fsync d4/df/d12/d13/d25/d28/d3a/d58/f77 0 2026-03-10T06:22:54.590 INFO:tasks.workunit.client.0.vm04.stdout:2/372: mkdir d1/d76 0 2026-03-10T06:22:54.592 INFO:tasks.workunit.client.0.vm04.stdout:0/351: mkdir d0/d5/d25/dd/d1d/d59/d63/d71 0 2026-03-10T06:22:54.592 INFO:tasks.workunit.client.0.vm04.stdout:3/400: creat d4/da/df/d11/d4a/d7b/d21/d2c/f85 x:0 0 0 2026-03-10T06:22:54.592 INFO:tasks.workunit.client.0.vm04.stdout:0/352: stat d0/f16 0 2026-03-10T06:22:54.592 INFO:tasks.workunit.client.0.vm04.stdout:2/373: write d1/f5 [318572,81709] 0 2026-03-10T06:22:54.593 INFO:tasks.workunit.client.0.vm04.stdout:4/321: getdents d2/d8 0 2026-03-10T06:22:54.598 INFO:tasks.workunit.client.0.vm04.stdout:5/321: link d4/d11/d2a/c53 d4/d11/d2a/c73 0 2026-03-10T06:22:54.599 INFO:tasks.workunit.client.0.vm04.stdout:5/322: fdatasync d4/d6/d48/d55/f6e 0 2026-03-10T06:22:54.602 INFO:tasks.workunit.client.0.vm04.stdout:1/357: dwrite d0/d8/f27 [0,4194304] 0 2026-03-10T06:22:54.604 INFO:tasks.workunit.client.0.vm04.stdout:9/373: dread d2/d23/d24/f29 [0,4194304] 0 2026-03-10T06:22:54.604 INFO:tasks.workunit.client.0.vm04.stdout:7/345: symlink d4/df/d12/d34/l7e 0 2026-03-10T06:22:54.633 INFO:tasks.workunit.client.0.vm04.stdout:3/401: fsync d4/da/df/d11/f57 0 2026-03-10T06:22:54.634 INFO:tasks.workunit.client.0.vm04.stdout:2/374: creat d1/db/d69/f77 x:0 0 0 2026-03-10T06:22:54.635 INFO:tasks.workunit.client.0.vm04.stdout:2/375: write 
d1/df/d11/d14/d4e/f5c [670912,103131] 0 2026-03-10T06:22:54.644 INFO:tasks.workunit.client.0.vm04.stdout:1/358: creat d0/d3/d80/f86 x:0 0 0 2026-03-10T06:22:54.644 INFO:tasks.workunit.client.0.vm04.stdout:6/372: rename d2/d43/d2d/c54 to d2/d43/d2d/c7b 0 2026-03-10T06:22:54.654 INFO:tasks.workunit.client.0.vm04.stdout:0/353: symlink d0/d1a/d20/l72 0 2026-03-10T06:22:54.657 INFO:tasks.workunit.client.0.vm04.stdout:2/376: truncate d1/df/d2c/d37/d40/f64 1662236 0 2026-03-10T06:22:54.664 INFO:tasks.workunit.client.0.vm04.stdout:8/382: truncate df/d15/d2b/f33 4803597 0 2026-03-10T06:22:54.667 INFO:tasks.workunit.client.0.vm04.stdout:5/323: rename d4/d11/f2f to d4/d6/d48/f74 0 2026-03-10T06:22:54.672 INFO:tasks.workunit.client.0.vm04.stdout:8/383: dread - df/f4f zero size 2026-03-10T06:22:54.673 INFO:tasks.workunit.client.0.vm04.stdout:1/359: mknod d0/d3/d80/c87 0 2026-03-10T06:22:54.673 INFO:tasks.workunit.client.0.vm04.stdout:1/360: fdatasync d0/d3/d80/f81 0 2026-03-10T06:22:54.674 INFO:tasks.workunit.client.0.vm04.stdout:4/322: chown d2/d46/f26 19 1 2026-03-10T06:22:54.674 INFO:tasks.workunit.client.0.vm04.stdout:1/361: write d0/d8/d46/f57 [724810,1057] 0 2026-03-10T06:22:54.675 INFO:tasks.workunit.client.0.vm04.stdout:1/362: readlink d0/d3/d41/l51 0 2026-03-10T06:22:54.676 INFO:tasks.workunit.client.0.vm04.stdout:1/363: fsync d0/f6a 0 2026-03-10T06:22:54.676 INFO:tasks.workunit.client.0.vm04.stdout:1/364: chown d0/d3/l1e 1584837 1 2026-03-10T06:22:54.677 INFO:tasks.workunit.client.0.vm04.stdout:1/365: chown d0/d3/c15 66653288 1 2026-03-10T06:22:54.679 INFO:tasks.workunit.client.0.vm04.stdout:1/366: chown d0/d3/f3b 0 1 2026-03-10T06:22:54.689 INFO:tasks.workunit.client.0.vm04.stdout:3/402: mknod d4/da/df/d11/d5a/d5b/c86 0 2026-03-10T06:22:54.690 INFO:tasks.workunit.client.0.vm04.stdout:2/377: readlink d1/df/d2c/d37/d40/l6c 0 2026-03-10T06:22:54.697 INFO:tasks.workunit.client.0.vm04.stdout:5/324: sync 2026-03-10T06:22:54.699 
INFO:tasks.workunit.client.0.vm04.stdout:7/346: creat d4/df/d12/f7f x:0 0 0 2026-03-10T06:22:54.700 INFO:tasks.workunit.client.0.vm04.stdout:9/374: rename d2/d8/d3a/d60 to d2/d8/d53/d6e/d8d 0 2026-03-10T06:22:54.706 INFO:tasks.workunit.client.0.vm04.stdout:6/373: mkdir d2/d43/d2d/d7c 0 2026-03-10T06:22:54.707 INFO:tasks.workunit.client.0.vm04.stdout:4/323: dread - d2/d32/d5c/f4b zero size 2026-03-10T06:22:54.708 INFO:tasks.workunit.client.0.vm04.stdout:1/367: symlink d0/d3/d41/d4b/d5b/l88 0 2026-03-10T06:22:54.711 INFO:tasks.workunit.client.0.vm04.stdout:3/403: symlink d4/da/df/d11/d5a/d5b/l87 0 2026-03-10T06:22:54.712 INFO:tasks.workunit.client.0.vm04.stdout:3/404: chown d4/d6/dc/l66 744340 1 2026-03-10T06:22:54.712 INFO:tasks.workunit.client.0.vm04.stdout:7/347: creat d4/df/d12/d34/f80 x:0 0 0 2026-03-10T06:22:54.715 INFO:tasks.workunit.client.0.vm04.stdout:9/375: fdatasync d2/d8/d14/f28 0 2026-03-10T06:22:54.716 INFO:tasks.workunit.client.0.vm04.stdout:7/348: read d4/fa [3203204,107082] 0 2026-03-10T06:22:54.723 INFO:tasks.workunit.client.0.vm04.stdout:0/354: dread d0/d5/f1f [0,4194304] 0 2026-03-10T06:22:54.723 INFO:tasks.workunit.client.0.vm04.stdout:6/374: rename d2/d43/d2d/d30/d1f/d3c/l73 to d2/d43/d2d/d7c/l7d 0 2026-03-10T06:22:54.724 INFO:tasks.workunit.client.0.vm04.stdout:4/324: dwrite d2/f4c [0,4194304] 0 2026-03-10T06:22:54.725 INFO:tasks.workunit.client.0.vm04.stdout:1/368: mknod d0/d3/d41/d4b/d5b/c89 0 2026-03-10T06:22:54.726 INFO:tasks.workunit.client.0.vm04.stdout:5/325: mknod d4/d11/c75 0 2026-03-10T06:22:54.727 INFO:tasks.workunit.client.0.vm04.stdout:8/384: creat df/f77 x:0 0 0 2026-03-10T06:22:54.731 INFO:tasks.workunit.client.0.vm04.stdout:7/349: symlink d4/df/d12/d21/l81 0 2026-03-10T06:22:54.738 INFO:tasks.workunit.client.0.vm04.stdout:1/369: creat d0/d3/d41/f8a x:0 0 0 2026-03-10T06:22:54.739 INFO:tasks.workunit.client.0.vm04.stdout:1/370: write d0/f83 [518595,79300] 0 2026-03-10T06:22:54.739 INFO:tasks.workunit.client.0.vm04.stdout:1/371: 
chown d0/d3/f37 14819 1 2026-03-10T06:22:54.745 INFO:tasks.workunit.client.0.vm04.stdout:8/385: mknod df/d15/c78 0 2026-03-10T06:22:54.747 INFO:tasks.workunit.client.0.vm04.stdout:5/326: mknod d4/d11/d2a/c76 0 2026-03-10T06:22:54.749 INFO:tasks.workunit.client.0.vm04.stdout:3/405: mknod d4/c88 0 2026-03-10T06:22:54.749 INFO:tasks.workunit.client.0.vm04.stdout:3/406: chown d4/f49 0 1 2026-03-10T06:22:54.752 INFO:tasks.workunit.client.0.vm04.stdout:7/350: dread d4/df/f56 [0,4194304] 0 2026-03-10T06:22:54.759 INFO:tasks.workunit.client.0.vm04.stdout:7/351: dread - d4/df/d12/d13/d25/d28/d3a/f73 zero size 2026-03-10T06:22:54.762 INFO:tasks.workunit.client.0.vm04.stdout:6/375: mkdir d2/d43/d2d/d30/d34/d76/d7e 0 2026-03-10T06:22:54.766 INFO:tasks.workunit.client.0.vm04.stdout:9/376: mknod d2/d3/d18/d39/c8e 0 2026-03-10T06:22:54.766 INFO:tasks.workunit.client.0.vm04.stdout:0/355: truncate d0/d1a/f27 2508951 0 2026-03-10T06:22:54.767 INFO:tasks.workunit.client.0.vm04.stdout:1/372: symlink d0/d3/d41/d4b/d5b/l8b 0 2026-03-10T06:22:54.768 INFO:tasks.workunit.client.0.vm04.stdout:2/378: link d1/l47 d1/df/d2c/l78 0 2026-03-10T06:22:54.774 INFO:tasks.workunit.client.0.vm04.stdout:2/379: truncate d1/df/d11/d14/d4e/f60 677478 0 2026-03-10T06:22:54.779 INFO:tasks.workunit.client.0.vm04.stdout:0/356: sync 2026-03-10T06:22:54.789 INFO:tasks.workunit.client.0.vm04.stdout:8/386: creat df/d20/d25/d30/f79 x:0 0 0 2026-03-10T06:22:54.790 INFO:tasks.workunit.client.0.vm04.stdout:8/387: chown df/d15/f5d 26 1 2026-03-10T06:22:54.794 INFO:tasks.workunit.client.0.vm04.stdout:5/327: rmdir d4/d11/d2a/d52 39 2026-03-10T06:22:54.799 INFO:tasks.workunit.client.0.vm04.stdout:7/352: mknod d4/df/d12/d13/c82 0 2026-03-10T06:22:54.811 INFO:tasks.workunit.client.0.vm04.stdout:0/357: dread d0/d1a/d20/d38/f52 [0,4194304] 0 2026-03-10T06:22:54.813 INFO:tasks.workunit.client.0.vm04.stdout:5/328: unlink d4/d3b/f4d 0 2026-03-10T06:22:54.817 INFO:tasks.workunit.client.0.vm04.stdout:4/325: getdents d2/d32 0 
2026-03-10T06:22:54.823 INFO:tasks.workunit.client.0.vm04.stdout:1/373: symlink d0/d3/l8c 0 2026-03-10T06:22:54.824 INFO:tasks.workunit.client.0.vm04.stdout:1/374: truncate d0/d8/f21 4929628 0 2026-03-10T06:22:54.824 INFO:tasks.workunit.client.0.vm04.stdout:1/375: fdatasync d0/f83 0 2026-03-10T06:22:54.837 INFO:tasks.workunit.client.0.vm04.stdout:3/407: rename d4/da/df/d11/d4a/d7b/d21/d2c/d79 to d4/da/df/d11/d4a/d7b/d89 0 2026-03-10T06:22:54.839 INFO:tasks.workunit.client.0.vm04.stdout:6/376: creat d2/d43/d2d/d30/f7f x:0 0 0 2026-03-10T06:22:54.840 INFO:tasks.workunit.client.0.vm04.stdout:6/377: fdatasync d2/f5f 0 2026-03-10T06:22:54.842 INFO:tasks.workunit.client.0.vm04.stdout:9/377: link d2/f17 d2/d3/d18/f8f 0 2026-03-10T06:22:54.844 INFO:tasks.workunit.client.0.vm04.stdout:9/378: fdatasync d2/d8/f4a 0 2026-03-10T06:22:54.844 INFO:tasks.workunit.client.0.vm04.stdout:1/376: mknod d0/d8/d46/c8d 0 2026-03-10T06:22:54.848 INFO:tasks.workunit.client.0.vm04.stdout:8/388: creat df/d15/d29/f7a x:0 0 0 2026-03-10T06:22:54.848 INFO:tasks.workunit.client.0.vm04.stdout:5/329: mknod d4/d11/c77 0 2026-03-10T06:22:54.851 INFO:tasks.workunit.client.0.vm04.stdout:2/380: rename d1/db/c1c to d1/db/d69/c79 0 2026-03-10T06:22:54.852 INFO:tasks.workunit.client.0.vm04.stdout:9/379: dread d2/d8/d14/d6c/f8a [0,4194304] 0 2026-03-10T06:22:54.853 INFO:tasks.workunit.client.0.vm04.stdout:2/381: chown d1/df/d11/f16 1 1 2026-03-10T06:22:54.853 INFO:tasks.workunit.client.0.vm04.stdout:2/382: dread - d1/df/d2c/f4a zero size 2026-03-10T06:22:54.859 INFO:tasks.workunit.client.0.vm04.stdout:4/326: fdatasync d2/f14 0 2026-03-10T06:22:54.859 INFO:tasks.workunit.client.0.vm04.stdout:6/378: symlink d2/d43/d2d/d30/d1f/l80 0 2026-03-10T06:22:54.869 INFO:tasks.workunit.client.0.vm04.stdout:3/408: dwrite d4/da/df/d11/d4a/d7b/f47 [0,4194304] 0 2026-03-10T06:22:54.869 INFO:tasks.workunit.client.0.vm04.stdout:4/327: sync 2026-03-10T06:22:54.869 INFO:tasks.workunit.client.0.vm04.stdout:9/380: sync 
2026-03-10T06:22:54.870 INFO:tasks.workunit.client.0.vm04.stdout:1/377: unlink d0/d3/d41/d4b/l7b 0 2026-03-10T06:22:54.876 INFO:tasks.workunit.client.0.vm04.stdout:6/379: dread d2/d43/d2d/d30/d1f/d3c/f65 [0,4194304] 0 2026-03-10T06:22:54.880 INFO:tasks.workunit.client.0.vm04.stdout:2/383: creat d1/db/d72/f7a x:0 0 0 2026-03-10T06:22:54.882 INFO:tasks.workunit.client.0.vm04.stdout:2/384: read d1/df/d2c/f4c [3215215,20258] 0 2026-03-10T06:22:54.886 INFO:tasks.workunit.client.0.vm04.stdout:9/381: symlink d2/d3/d18/d39/d11/d42/l90 0 2026-03-10T06:22:54.891 INFO:tasks.workunit.client.0.vm04.stdout:9/382: fdatasync d2/d3/f4 0 2026-03-10T06:22:54.893 INFO:tasks.workunit.client.0.vm04.stdout:4/328: chown d2/c19 4 1 2026-03-10T06:22:54.893 INFO:tasks.workunit.client.0.vm04.stdout:9/383: write d2/d8/f66 [3262504,96031] 0 2026-03-10T06:22:54.893 INFO:tasks.workunit.client.0.vm04.stdout:9/384: fdatasync d2/d8/f4a 0 2026-03-10T06:22:54.902 INFO:tasks.workunit.client.0.vm04.stdout:4/329: dread - d2/d32/d5c/d4f/f60 zero size 2026-03-10T06:22:54.904 INFO:tasks.workunit.client.0.vm04.stdout:5/330: getdents d4/d6/d48/d4c 0 2026-03-10T06:22:54.904 INFO:tasks.workunit.client.0.vm04.stdout:8/389: symlink df/d15/d29/l7b 0 2026-03-10T06:22:54.905 INFO:tasks.workunit.client.0.vm04.stdout:8/390: fsync df/f46 0 2026-03-10T06:22:54.906 INFO:tasks.workunit.client.0.vm04.stdout:4/330: fdatasync d2/d46/f3d 0 2026-03-10T06:22:54.909 INFO:tasks.workunit.client.0.vm04.stdout:9/385: creat d2/d8/d14/d1d/d64/f91 x:0 0 0 2026-03-10T06:22:54.910 INFO:tasks.workunit.client.0.vm04.stdout:2/385: write d1/df/d11/d18/d48/f62 [332172,104021] 0 2026-03-10T06:22:54.913 INFO:tasks.workunit.client.0.vm04.stdout:9/386: readlink d2/d3/d18/l2f 0 2026-03-10T06:22:54.913 INFO:tasks.workunit.client.0.vm04.stdout:9/387: write d2/d8/f66 [564764,90262] 0 2026-03-10T06:22:54.920 INFO:tasks.workunit.client.0.vm04.stdout:2/386: dwrite d1/df/f22 [0,4194304] 0 2026-03-10T06:22:54.924 
INFO:tasks.workunit.client.0.vm04.stdout:9/388: sync 2026-03-10T06:22:54.932 INFO:tasks.workunit.client.0.vm04.stdout:9/389: dwrite d2/d8/d14/f28 [0,4194304] 0 2026-03-10T06:22:54.937 INFO:tasks.workunit.client.0.vm04.stdout:7/353: rename d4/df/d12/d21/c38 to d4/df/d12/c83 0 2026-03-10T06:22:54.943 INFO:tasks.workunit.client.0.vm04.stdout:9/390: dread d2/d3/d18/d34/f5f [0,4194304] 0 2026-03-10T06:22:54.944 INFO:tasks.workunit.client.0.vm04.stdout:6/380: write d2/d43/d2d/d30/f2b [3191923,84649] 0 2026-03-10T06:22:54.945 INFO:tasks.workunit.client.0.vm04.stdout:6/381: fsync d2/d43/f35 0 2026-03-10T06:22:54.946 INFO:tasks.workunit.client.0.vm04.stdout:6/382: read - d2/d37/f38 zero size 2026-03-10T06:22:54.952 INFO:tasks.workunit.client.0.vm04.stdout:1/378: dwrite d0/d3/f37 [0,4194304] 0 2026-03-10T06:22:54.989 INFO:tasks.workunit.client.0.vm04.stdout:4/331: write d2/d46/f15 [583047,96414] 0 2026-03-10T06:22:54.994 INFO:tasks.workunit.client.0.vm04.stdout:9/391: fdatasync d2/d23/f31 0 2026-03-10T06:22:54.996 INFO:tasks.workunit.client.0.vm04.stdout:9/392: readlink d2/d3/d18/l74 0 2026-03-10T06:22:55.001 INFO:tasks.workunit.client.0.vm04.stdout:0/358: rename d0/d1a/d3f to d0/d5/d25/dd/d5c/d73 0 2026-03-10T06:22:55.003 INFO:tasks.workunit.client.0.vm04.stdout:0/359: stat d0/d5/f1f 0 2026-03-10T06:22:55.011 INFO:tasks.workunit.client.0.vm04.stdout:8/391: truncate df/d20/d25/d30/d55/f5b 1853122 0 2026-03-10T06:22:55.011 INFO:tasks.workunit.client.0.vm04.stdout:6/383: write d2/d8/f11 [1345273,18153] 0 2026-03-10T06:22:55.012 INFO:tasks.workunit.client.0.vm04.stdout:8/392: chown df/d15/f69 0 1 2026-03-10T06:22:55.012 INFO:tasks.workunit.client.0.vm04.stdout:8/393: dread - df/d20/d25/f54 zero size 2026-03-10T06:22:55.013 INFO:tasks.workunit.client.0.vm04.stdout:8/394: chown df/c14 74048543 1 2026-03-10T06:22:55.017 INFO:tasks.workunit.client.0.vm04.stdout:2/387: symlink d1/df/d11/l7b 0 2026-03-10T06:22:55.018 INFO:tasks.workunit.client.0.vm04.stdout:8/395: dread df/d15/f1e 
[0,4194304] 0 2026-03-10T06:22:55.019 INFO:tasks.workunit.client.0.vm04.stdout:8/396: chown df/d20/d25/f44 3068 1 2026-03-10T06:22:55.023 INFO:tasks.workunit.client.0.vm04.stdout:8/397: dwrite df/f11 [0,4194304] 0 2026-03-10T06:22:55.037 INFO:tasks.workunit.client.0.vm04.stdout:3/409: rename d4/da/df/d11/d4a/d7b/l7a to d4/d6/d38/l8a 0 2026-03-10T06:22:55.038 INFO:tasks.workunit.client.0.vm04.stdout:0/360: mknod d0/d5/d25/dd/d5c/d73/c74 0 2026-03-10T06:22:55.041 INFO:tasks.workunit.client.0.vm04.stdout:5/331: getdents d4/d11/d2a/d38/d51 0 2026-03-10T06:22:55.043 INFO:tasks.workunit.client.0.vm04.stdout:0/361: truncate d0/d5/f4e 1001400 0 2026-03-10T06:22:55.046 INFO:tasks.workunit.client.0.vm04.stdout:9/393: dwrite d2/d23/f31 [4194304,4194304] 0 2026-03-10T06:22:55.061 INFO:tasks.workunit.client.0.vm04.stdout:6/384: stat d2/d43/d2d/d30/c5d 0 2026-03-10T06:22:55.061 INFO:tasks.workunit.client.0.vm04.stdout:4/332: dread d2/d16/d2c/f4d [0,4194304] 0 2026-03-10T06:22:55.067 INFO:tasks.workunit.client.0.vm04.stdout:1/379: link d0/f4 d0/d3/d41/d4b/d5b/f8e 0 2026-03-10T06:22:55.069 INFO:tasks.workunit.client.0.vm04.stdout:2/388: truncate d1/df/d11/d18/f19 2534894 0 2026-03-10T06:22:55.073 INFO:tasks.workunit.client.0.vm04.stdout:7/354: creat d4/df/f84 x:0 0 0 2026-03-10T06:22:55.085 INFO:tasks.workunit.client.0.vm04.stdout:5/332: sync 2026-03-10T06:22:55.086 INFO:tasks.workunit.client.0.vm04.stdout:9/394: sync 2026-03-10T06:22:55.092 INFO:tasks.workunit.client.0.vm04.stdout:5/333: sync 2026-03-10T06:22:55.093 INFO:tasks.workunit.client.0.vm04.stdout:7/355: dwrite d4/df/d12/d13/d25/d28/d36/f64 [0,4194304] 0 2026-03-10T06:22:55.095 INFO:tasks.workunit.client.0.vm04.stdout:3/410: rename d4/da/df/d11/d4a/d7b/d21/d32/d39/f43 to d4/da/df/d11/d5a/f8b 0 2026-03-10T06:22:55.098 INFO:tasks.workunit.client.0.vm04.stdout:7/356: chown d4/df/f60 39177137 1 2026-03-10T06:22:55.108 INFO:tasks.workunit.client.0.vm04.stdout:9/395: dwrite d2/d8/d3a/f6b [0,4194304] 0 2026-03-10T06:22:55.114 
INFO:tasks.workunit.client.0.vm04.stdout:1/380: creat d0/f8f x:0 0 0 2026-03-10T06:22:55.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:54 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:55.122 INFO:tasks.workunit.client.0.vm04.stdout:5/334: dwrite d4/d11/d2a/f31 [0,4194304] 0 2026-03-10T06:22:55.136 INFO:tasks.workunit.client.0.vm04.stdout:8/398: symlink df/d53/d67/l7c 0 2026-03-10T06:22:55.142 INFO:tasks.workunit.client.0.vm04.stdout:8/399: dwrite df/d20/f28 [0,4194304] 0 2026-03-10T06:22:55.158 INFO:tasks.workunit.client.0.vm04.stdout:6/385: rename d2/f14 to d2/d43/d2d/d30/d34/d76/d7e/f81 0 2026-03-10T06:22:55.162 INFO:tasks.workunit.client.0.vm04.stdout:3/411: symlink d4/da/df/d11/l8c 0 2026-03-10T06:22:55.171 INFO:tasks.workunit.client.0.vm04.stdout:5/335: symlink d4/d6/d48/l78 0 2026-03-10T06:22:55.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:54 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:22:55.212 INFO:tasks.workunit.client.0.vm04.stdout:8/400: dread - df/d20/f5e zero size 2026-03-10T06:22:55.219 INFO:tasks.workunit.client.0.vm04.stdout:7/357: rename d4/df/d12/d34/d63/l65 to d4/df/d12/d13/d25/d30/d40/d50/l85 0 2026-03-10T06:22:55.219 INFO:tasks.workunit.client.0.vm04.stdout:4/333: write d2/d16/d2c/f2e [1226784,26562] 0 2026-03-10T06:22:55.240 INFO:tasks.workunit.client.0.vm04.stdout:1/381: truncate d0/d3/d41/d4b/d5b/f8e 57608 0 2026-03-10T06:22:55.240 INFO:tasks.workunit.client.0.vm04.stdout:1/382: write d0/d8/f38 [377406,27043] 0 2026-03-10T06:22:55.247 INFO:tasks.workunit.client.0.vm04.stdout:0/362: getdents d0/d5/d25/dd/d3a/d56 0 2026-03-10T06:22:55.266 INFO:tasks.workunit.client.0.vm04.stdout:8/401: dwrite df/d15/d2b/f4c [0,4194304] 0 2026-03-10T06:22:55.280 
INFO:tasks.workunit.client.0.vm04.stdout:4/334: rename d2/d16/d31/f4e to d2/d32/d5c/f6a 0 2026-03-10T06:22:55.337 INFO:tasks.workunit.client.0.vm04.stdout:1/383: symlink d0/d8/d46/d7a/l90 0 2026-03-10T06:22:55.337 INFO:tasks.workunit.client.0.vm04.stdout:7/358: mknod d4/df/c86 0 2026-03-10T06:22:55.342 INFO:tasks.workunit.client.0.vm04.stdout:4/335: mkdir d2/d16/d2c/d6b 0 2026-03-10T06:22:55.342 INFO:tasks.workunit.client.0.vm04.stdout:8/402: chown df/d15/d2b/c52 1710 1 2026-03-10T06:22:55.347 INFO:tasks.workunit.client.0.vm04.stdout:1/384: write d0/d3/d41/d4b/d5b/f66 [564776,71274] 0 2026-03-10T06:22:55.349 INFO:tasks.workunit.client.0.vm04.stdout:9/396: getdents d2/d8/d14/d1d/d64 0 2026-03-10T06:22:55.350 INFO:tasks.workunit.client.0.vm04.stdout:5/336: creat d4/f79 x:0 0 0 2026-03-10T06:22:55.351 INFO:tasks.workunit.client.0.vm04.stdout:9/397: chown d2/d8/d22/d87 64762924 1 2026-03-10T06:22:55.352 INFO:tasks.workunit.client.0.vm04.stdout:9/398: write d2/d3/d18/d34/f5f [5038976,42837] 0 2026-03-10T06:22:55.358 INFO:tasks.workunit.client.0.vm04.stdout:2/389: dwrite d1/df/d11/f16 [0,4194304] 0 2026-03-10T06:22:55.365 INFO:tasks.workunit.client.0.vm04.stdout:1/385: creat d0/d3/d80/f91 x:0 0 0 2026-03-10T06:22:55.365 INFO:tasks.workunit.client.0.vm04.stdout:8/403: dwrite df/f77 [0,4194304] 0 2026-03-10T06:22:55.374 INFO:tasks.workunit.client.0.vm04.stdout:9/399: read - d2/d3/d18/d39/d11/d42/f5b zero size 2026-03-10T06:22:55.377 INFO:tasks.workunit.client.0.vm04.stdout:9/400: dread d2/d3/d18/d39/d11/f2d [0,4194304] 0 2026-03-10T06:22:55.391 INFO:tasks.workunit.client.0.vm04.stdout:0/363: rmdir d0/d5/d25/dd/d1d/d59/d63/d71 0 2026-03-10T06:22:55.404 INFO:tasks.workunit.client.0.vm04.stdout:4/336: rename d2/d46/c37 to d2/d32/d5c/c6c 0 2026-03-10T06:22:55.408 INFO:tasks.workunit.client.0.vm04.stdout:4/337: fdatasync d2/d46/f61 0 2026-03-10T06:22:55.408 INFO:tasks.workunit.client.0.vm04.stdout:8/404: symlink df/d53/d67/l7d 0 2026-03-10T06:22:55.409 
INFO:tasks.workunit.client.0.vm04.stdout:1/386: dread - d0/f49 zero size 2026-03-10T06:22:55.409 INFO:tasks.workunit.client.0.vm04.stdout:4/338: chown d2/d16/d31 1 1 2026-03-10T06:22:55.410 INFO:tasks.workunit.client.0.vm04.stdout:8/405: chown df/d20/d25/f2a 279674 1 2026-03-10T06:22:55.412 INFO:tasks.workunit.client.0.vm04.stdout:9/401: mkdir d2/d23/d24/d5a/d92 0 2026-03-10T06:22:55.417 INFO:tasks.workunit.client.0.vm04.stdout:0/364: creat d0/f75 x:0 0 0 2026-03-10T06:22:55.417 INFO:tasks.workunit.client.0.vm04.stdout:4/339: creat d2/d32/d5c/f6d x:0 0 0 2026-03-10T06:22:55.419 INFO:tasks.workunit.client.0.vm04.stdout:7/359: sync 2026-03-10T06:22:55.420 INFO:tasks.workunit.client.0.vm04.stdout:0/365: write d0/d5/d25/f23 [55580,36850] 0 2026-03-10T06:22:55.420 INFO:tasks.workunit.client.0.vm04.stdout:7/360: truncate d4/df/f84 172347 0 2026-03-10T06:22:55.429 INFO:tasks.workunit.client.0.vm04.stdout:7/361: dread d4/df/d12/d13/d25/d28/d36/f41 [0,4194304] 0 2026-03-10T06:22:55.445 INFO:tasks.workunit.client.0.vm04.stdout:9/402: link d2/d8/d22/f75 d2/d23/f93 0 2026-03-10T06:22:55.445 INFO:tasks.workunit.client.0.vm04.stdout:0/366: unlink d0/d5/d25/dd/d5c/d73/c74 0 2026-03-10T06:22:55.445 INFO:tasks.workunit.client.0.vm04.stdout:7/362: symlink d4/df/d12/d13/d25/d28/d3a/l87 0 2026-03-10T06:22:55.445 INFO:tasks.workunit.client.0.vm04.stdout:8/406: creat df/d15/d2b/f7e x:0 0 0 2026-03-10T06:22:55.452 INFO:tasks.workunit.client.0.vm04.stdout:0/367: symlink d0/d5/d25/dd/d1d/l76 0 2026-03-10T06:22:55.452 INFO:tasks.workunit.client.0.vm04.stdout:0/368: dread - d0/d5/d25/dd/d3a/f57 zero size 2026-03-10T06:22:55.453 INFO:tasks.workunit.client.0.vm04.stdout:0/369: stat d0/d1a/d20/l6b 0 2026-03-10T06:22:55.459 INFO:tasks.workunit.client.0.vm04.stdout:0/370: creat d0/d5/d25/dd/d3a/f77 x:0 0 0 2026-03-10T06:22:55.460 INFO:tasks.workunit.client.0.vm04.stdout:9/403: sync 2026-03-10T06:22:55.461 INFO:tasks.workunit.client.0.vm04.stdout:9/404: read f0 [3324229,93650] 0 
2026-03-10T06:22:55.480 INFO:tasks.workunit.client.0.vm04.stdout:9/405: mkdir d2/d23/d94 0 2026-03-10T06:22:55.481 INFO:tasks.workunit.client.0.vm04.stdout:9/406: write d2/d8/d14/d1d/f78 [1853810,76987] 0 2026-03-10T06:22:55.498 INFO:tasks.workunit.client.0.vm04.stdout:8/407: link df/d15/d2b/c36 df/d15/d29/c7f 0 2026-03-10T06:22:55.499 INFO:tasks.workunit.client.0.vm04.stdout:8/408: chown df/d20/d25/d30/d55/l5a 2 1 2026-03-10T06:22:55.500 INFO:tasks.workunit.client.0.vm04.stdout:8/409: stat df/d15/c78 0 2026-03-10T06:22:55.506 INFO:tasks.workunit.client.0.vm04.stdout:9/407: dwrite d2/d3/d18/d39/d11/f56 [0,4194304] 0 2026-03-10T06:22:55.514 INFO:tasks.workunit.client.0.vm04.stdout:9/408: creat d2/d8/d53/d6e/d89/f95 x:0 0 0 2026-03-10T06:22:55.514 INFO:tasks.workunit.client.0.vm04.stdout:9/409: chown d2/d8/c5c 16 1 2026-03-10T06:22:55.518 INFO:tasks.workunit.client.0.vm04.stdout:9/410: getdents d2/d8/d53/d6e/d8d 0 2026-03-10T06:22:55.529 INFO:tasks.workunit.client.0.vm04.stdout:9/411: getdents d2/d8/d53/d6e/d8d 0 2026-03-10T06:22:55.529 INFO:tasks.workunit.client.0.vm04.stdout:9/412: write d2/d8/d53/d6e/f7d [997297,74251] 0 2026-03-10T06:22:55.529 INFO:tasks.workunit.client.0.vm04.stdout:9/413: dread - d2/d8/d3a/f51 zero size 2026-03-10T06:22:55.529 INFO:tasks.workunit.client.0.vm04.stdout:9/414: link d2/d8/d14/d1d/d64/c7b d2/d23/d24/c96 0 2026-03-10T06:22:55.532 INFO:tasks.workunit.client.0.vm04.stdout:9/415: creat d2/d3/d18/d34/f97 x:0 0 0 2026-03-10T06:22:55.532 INFO:tasks.workunit.client.0.vm04.stdout:8/410: dread df/d15/d2b/f60 [0,4194304] 0 2026-03-10T06:22:55.538 INFO:tasks.workunit.client.0.vm04.stdout:9/416: write d2/d3/d18/d39/d11/d42/f5e [332462,126851] 0 2026-03-10T06:22:55.552 INFO:tasks.workunit.client.0.vm04.stdout:9/417: mknod d2/d23/d94/c98 0 2026-03-10T06:22:55.558 INFO:tasks.workunit.client.0.vm04.stdout:9/418: creat d2/d8/f99 x:0 0 0 2026-03-10T06:22:55.576 INFO:tasks.workunit.client.0.vm04.stdout:9/419: sync 2026-03-10T06:22:55.578 
INFO:tasks.workunit.client.0.vm04.stdout:9/420: fdatasync d2/d23/d24/f29 0 2026-03-10T06:22:55.608 INFO:tasks.workunit.client.0.vm04.stdout:6/386: truncate d2/d8/f11 3017023 0 2026-03-10T06:22:55.610 INFO:tasks.workunit.client.0.vm04.stdout:6/387: creat d2/d37/d6e/f82 x:0 0 0 2026-03-10T06:22:55.611 INFO:tasks.workunit.client.0.vm04.stdout:6/388: mkdir d2/d37/d83 0 2026-03-10T06:22:55.618 INFO:tasks.workunit.client.0.vm04.stdout:1/387: mknod d0/d3/c92 0 2026-03-10T06:22:55.619 INFO:tasks.workunit.client.0.vm04.stdout:1/388: chown d0/d8/c3d 0 1 2026-03-10T06:22:55.638 INFO:tasks.workunit.client.0.vm04.stdout:2/390: rename d1/df/d2c/f4c to d1/df/f7c 0 2026-03-10T06:22:55.639 INFO:tasks.workunit.client.0.vm04.stdout:2/391: chown d1/df/f24 14205 1 2026-03-10T06:22:55.641 INFO:tasks.workunit.client.0.vm04.stdout:4/340: rename d2/d16/d2c/f4d to d2/d32/d5c/d4f/d51/f6e 0 2026-03-10T06:22:55.642 INFO:tasks.workunit.client.0.vm04.stdout:2/392: chown d1/df/d2c/f4a 25 1 2026-03-10T06:22:55.645 INFO:tasks.workunit.client.0.vm04.stdout:7/363: rename d4/df/d12/d13/d25/d30/f37 to d4/df/d12/d13/d25/d28/d36/f88 0 2026-03-10T06:22:55.650 INFO:tasks.workunit.client.0.vm04.stdout:0/371: dwrite d0/d5/d25/dd/d3a/f50 [0,4194304] 0 2026-03-10T06:22:55.654 INFO:tasks.workunit.client.0.vm04.stdout:2/393: truncate d1/db/fe 2807882 0 2026-03-10T06:22:55.655 INFO:tasks.workunit.client.0.vm04.stdout:7/364: rename d4/f51 to d4/df/d12/d13/d25/d30/d40/d79/f89 0 2026-03-10T06:22:55.655 INFO:tasks.workunit.client.0.vm04.stdout:2/394: truncate d1/df/d2c/f58 245354 0 2026-03-10T06:22:55.657 INFO:tasks.workunit.client.0.vm04.stdout:4/341: dread d2/d46/f26 [0,4194304] 0 2026-03-10T06:22:55.658 INFO:tasks.workunit.client.0.vm04.stdout:4/342: stat d2/d16/f20 0 2026-03-10T06:22:55.663 INFO:tasks.workunit.client.0.vm04.stdout:0/372: rmdir d0/d1a/d20/d38 39 2026-03-10T06:22:55.664 INFO:tasks.workunit.client.0.vm04.stdout:0/373: readlink d0/d5/d25/dd/d5c/d73/l4c 0 2026-03-10T06:22:55.665 
INFO:tasks.workunit.client.0.vm04.stdout:2/395: dread d1/db/f1e [0,4194304] 0 2026-03-10T06:22:55.666 INFO:tasks.workunit.client.0.vm04.stdout:5/337: rmdir d4/d11/d2a 39 2026-03-10T06:22:55.666 INFO:tasks.workunit.client.0.vm04.stdout:0/374: write d0/d5/d25/dd/d3a/f50 [3600446,48791] 0 2026-03-10T06:22:55.670 INFO:tasks.workunit.client.0.vm04.stdout:5/338: creat d4/d6/d48/d55/f7a x:0 0 0 2026-03-10T06:22:55.673 INFO:tasks.workunit.client.0.vm04.stdout:4/343: read d2/f4 [3448767,108654] 0 2026-03-10T06:22:55.678 INFO:tasks.workunit.client.0.vm04.stdout:7/365: dread d4/df/d12/d13/d25/f2e [4194304,4194304] 0 2026-03-10T06:22:55.678 INFO:tasks.workunit.client.0.vm04.stdout:4/344: creat d2/d16/d2c/f6f x:0 0 0 2026-03-10T06:22:55.681 INFO:tasks.workunit.client.0.vm04.stdout:0/375: rename d0/d1a/d20/d38/f52 to d0/d1a/d20/d38/f78 0 2026-03-10T06:22:55.681 INFO:tasks.workunit.client.0.vm04.stdout:2/396: dwrite d1/df/f22 [0,4194304] 0 2026-03-10T06:22:55.699 INFO:tasks.workunit.client.0.vm04.stdout:3/412: symlink d4/da/df/d11/l8d 0 2026-03-10T06:22:55.701 INFO:tasks.workunit.client.0.vm04.stdout:8/411: dwrite fe [0,4194304] 0 2026-03-10T06:22:55.714 INFO:tasks.workunit.client.0.vm04.stdout:7/366: dread d4/df/d12/d13/d25/d30/d40/d79/f89 [0,4194304] 0 2026-03-10T06:22:55.718 INFO:tasks.workunit.client.0.vm04.stdout:4/345: dread d2/d46/f3c [0,4194304] 0 2026-03-10T06:22:55.723 INFO:tasks.workunit.client.0.vm04.stdout:2/397: mknod d1/df/d11/d18/d35/d54/d5d/c7d 0 2026-03-10T06:22:55.732 INFO:tasks.workunit.client.0.vm04.stdout:7/367: dread d4/df/d12/d13/d25/d28/d36/f88 [0,4194304] 0 2026-03-10T06:22:55.732 INFO:tasks.workunit.client.0.vm04.stdout:7/368: dread - d4/df/d12/d34/f80 zero size 2026-03-10T06:22:55.739 INFO:tasks.workunit.client.0.vm04.stdout:6/389: dwrite d2/d43/d2d/d30/f39 [0,4194304] 0 2026-03-10T06:22:55.740 INFO:tasks.workunit.client.0.vm04.stdout:1/389: write d0/f5 [3271913,106011] 0 2026-03-10T06:22:55.740 INFO:tasks.workunit.client.0.vm04.stdout:1/390: dread - 
d0/d3/d41/f8a zero size 2026-03-10T06:22:55.741 INFO:tasks.workunit.client.0.vm04.stdout:5/339: creat d4/d11/f7b x:0 0 0 2026-03-10T06:22:55.746 INFO:tasks.workunit.client.0.vm04.stdout:5/340: stat d4/d6/l4a 0 2026-03-10T06:22:55.746 INFO:tasks.workunit.client.0.vm04.stdout:5/341: dread - d4/d6/d50/f63 zero size 2026-03-10T06:22:55.756 INFO:tasks.workunit.client.0.vm04.stdout:7/369: dwrite d4/df/d12/d13/d25/d28/d3a/d58/f77 [0,4194304] 0 2026-03-10T06:22:55.775 INFO:tasks.workunit.client.0.vm04.stdout:8/412: unlink df/d15/d29/l50 0 2026-03-10T06:22:55.781 INFO:tasks.workunit.client.0.vm04.stdout:8/413: dwrite df/d15/f43 [0,4194304] 0 2026-03-10T06:22:55.782 INFO:tasks.workunit.client.0.vm04.stdout:2/398: dwrite d1/df/d11/d14/f1d [4194304,4194304] 0 2026-03-10T06:22:55.816 INFO:tasks.workunit.client.0.vm04.stdout:9/421: creat d2/d8/d14/d1d/d64/d73/f9a x:0 0 0 2026-03-10T06:22:55.817 INFO:tasks.workunit.client.0.vm04.stdout:9/422: stat d2/d3/d18/d39/d11/f2d 0 2026-03-10T06:22:55.828 INFO:tasks.workunit.client.0.vm04.stdout:8/414: unlink df/f11 0 2026-03-10T06:22:55.855 INFO:tasks.workunit.client.0.vm04.stdout:0/376: write d0/f14 [5324392,115738] 0 2026-03-10T06:22:55.872 INFO:tasks.workunit.client.0.vm04.stdout:3/413: dwrite d4/f10 [0,4194304] 0 2026-03-10T06:22:55.873 INFO:tasks.workunit.client.0.vm04.stdout:4/346: dwrite d2/d16/d31/d3f/f43 [0,4194304] 0 2026-03-10T06:22:55.889 INFO:tasks.workunit.client.0.vm04.stdout:8/415: stat df/d15/d2b/f33 0 2026-03-10T06:22:55.898 INFO:tasks.workunit.client.0.vm04.stdout:7/370: creat d4/df/f8a x:0 0 0 2026-03-10T06:22:55.898 INFO:tasks.workunit.client.0.vm04.stdout:5/342: dread d4/d11/d2a/f3d [0,4194304] 0 2026-03-10T06:22:55.905 INFO:tasks.workunit.client.0.vm04.stdout:4/347: symlink d2/d16/l70 0 2026-03-10T06:22:55.905 INFO:tasks.workunit.client.0.vm04.stdout:6/390: getdents d2/d43 0 2026-03-10T06:22:55.908 INFO:tasks.workunit.client.0.vm04.stdout:3/414: fdatasync d4/da/df/d11/d4a/d7b/d21/d32/d39/d64/f75 0 
2026-03-10T06:22:55.908 INFO:tasks.workunit.client.0.vm04.stdout:3/415: chown d4/da/df 5858 1 2026-03-10T06:22:55.910 INFO:tasks.workunit.client.0.vm04.stdout:7/371: mkdir d4/df/d12/d13/d8b 0 2026-03-10T06:22:55.917 INFO:tasks.workunit.client.0.vm04.stdout:4/348: dread d2/d16/d2c/f2e [0,4194304] 0 2026-03-10T06:22:55.918 INFO:tasks.workunit.client.0.vm04.stdout:3/416: dread d4/da/df/d11/d4a/d7b/f47 [0,4194304] 0 2026-03-10T06:22:55.924 INFO:tasks.workunit.client.0.vm04.stdout:5/343: dwrite d4/d6/d48/d55/f5d [0,4194304] 0 2026-03-10T06:22:55.947 INFO:tasks.workunit.client.0.vm04.stdout:8/416: creat df/d20/d25/d30/d65/f80 x:0 0 0 2026-03-10T06:22:55.947 INFO:tasks.workunit.client.0.vm04.stdout:2/399: rename d1/df/f7c to d1/df/d11/f7e 0 2026-03-10T06:22:55.947 INFO:tasks.workunit.client.0.vm04.stdout:2/400: write d1/db/d20/f49 [994786,116629] 0 2026-03-10T06:22:55.947 INFO:tasks.workunit.client.0.vm04.stdout:2/401: dread d1/df/d11/f16 [4194304,4194304] 0 2026-03-10T06:22:55.947 INFO:tasks.workunit.client.0.vm04.stdout:3/417: rmdir d4/da/df/d11/d62 39 2026-03-10T06:22:55.948 INFO:tasks.workunit.client.0.vm04.stdout:4/349: dread d2/d32/d5c/f6a [0,4194304] 0 2026-03-10T06:22:55.949 INFO:tasks.workunit.client.0.vm04.stdout:6/391: fsync d2/d43/d2d/d30/d34/f4d 0 2026-03-10T06:22:55.950 INFO:tasks.workunit.client.0.vm04.stdout:6/392: write d2/d8/d78/f79 [763020,103801] 0 2026-03-10T06:22:55.955 INFO:tasks.workunit.client.0.vm04.stdout:9/423: rename d2/d8/d53/d6e/d8d/l79 to d2/d3/d18/d39/l9b 0 2026-03-10T06:22:55.956 INFO:tasks.workunit.client.0.vm04.stdout:9/424: chown d2/d3/d18/d39/d46/d55/c81 32 1 2026-03-10T06:22:55.957 INFO:tasks.workunit.client.0.vm04.stdout:0/377: sync 2026-03-10T06:22:55.969 INFO:tasks.workunit.client.0.vm04.stdout:3/418: mkdir d4/da/df/d11/d4a/d7b/d21/d32/d8e 0 2026-03-10T06:22:55.973 INFO:tasks.workunit.client.0.vm04.stdout:3/419: stat d4/da/df/d11/d5a/f8b 0 2026-03-10T06:22:55.973 INFO:tasks.workunit.client.0.vm04.stdout:3/420: stat 
d4/da/df/d11/d4a/d7b/f27 0 2026-03-10T06:22:55.973 INFO:tasks.workunit.client.0.vm04.stdout:6/393: mknod d2/d43/c84 0 2026-03-10T06:22:55.975 INFO:tasks.workunit.client.0.vm04.stdout:9/425: symlink d2/d3/d18/d39/d46/d55/l9c 0 2026-03-10T06:22:55.980 INFO:tasks.workunit.client.0.vm04.stdout:0/378: mkdir d0/d1a/d20/d38/d31/d79 0 2026-03-10T06:22:55.986 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:55 vm04.local ceph-mon[51058]: pgmap v26: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 11 MiB/s rd, 77 MiB/s wr, 191 op/s 2026-03-10T06:22:55.988 INFO:tasks.workunit.client.0.vm04.stdout:5/344: rmdir d4/d11/d2a/d65 0 2026-03-10T06:22:55.990 INFO:tasks.workunit.client.0.vm04.stdout:5/345: write d4/f79 [681322,76985] 0 2026-03-10T06:22:55.994 INFO:tasks.workunit.client.0.vm04.stdout:1/391: dwrite d0/d3/f62 [0,4194304] 0 2026-03-10T06:22:56.001 INFO:tasks.workunit.client.0.vm04.stdout:9/426: sync 2026-03-10T06:22:56.040 INFO:tasks.workunit.client.0.vm04.stdout:7/372: write d4/df/d12/d13/d25/d28/d36/f4d [4108398,8698] 0 2026-03-10T06:22:56.042 INFO:tasks.workunit.client.0.vm04.stdout:0/379: creat d0/d5/d25/dd/d5c/f7a x:0 0 0 2026-03-10T06:22:56.050 INFO:tasks.workunit.client.0.vm04.stdout:2/402: link d1/df/c38 d1/db/d20/c7f 0 2026-03-10T06:22:56.065 INFO:tasks.workunit.client.0.vm04.stdout:3/421: fsync d4/d6/f30 0 2026-03-10T06:22:56.071 INFO:tasks.workunit.client.0.vm04.stdout:8/417: truncate df/d15/f1b 3911392 0 2026-03-10T06:22:56.074 INFO:tasks.workunit.client.0.vm04.stdout:8/418: chown df/d20/d25/d30/d55/l5a 44 1 2026-03-10T06:22:56.075 INFO:tasks.workunit.client.0.vm04.stdout:9/427: write d2/d8/f4a [3574147,45023] 0 2026-03-10T06:22:56.079 INFO:tasks.workunit.client.0.vm04.stdout:4/350: write d2/d8/f35 [854826,78076] 0 2026-03-10T06:22:56.081 INFO:tasks.workunit.client.0.vm04.stdout:9/428: dread d2/d8/d14/f40 [0,4194304] 0 2026-03-10T06:22:56.083 INFO:tasks.workunit.client.0.vm04.stdout:7/373: symlink d4/df/d12/d21/l8c 0 
2026-03-10T06:22:56.085 INFO:tasks.workunit.client.0.vm04.stdout:0/380: fsync d0/d5/f41 0 2026-03-10T06:22:56.087 INFO:tasks.workunit.client.0.vm04.stdout:2/403: unlink d1/df/d2c/f44 0 2026-03-10T06:22:56.089 INFO:tasks.workunit.client.0.vm04.stdout:6/394: truncate d2/d43/d2d/d30/f32 279116 0 2026-03-10T06:22:56.089 INFO:tasks.workunit.client.0.vm04.stdout:2/404: read d1/df/d11/d18/f53 [694508,120739] 0 2026-03-10T06:22:56.091 INFO:tasks.workunit.client.0.vm04.stdout:5/346: dwrite d4/d6/f6f [0,4194304] 0 2026-03-10T06:22:56.091 INFO:tasks.workunit.client.0.vm04.stdout:2/405: write d1/df/f24 [3210949,75394] 0 2026-03-10T06:22:56.095 INFO:tasks.workunit.client.0.vm04.stdout:0/381: dwrite d0/d5/d25/dd/d3a/f60 [0,4194304] 0 2026-03-10T06:22:56.099 INFO:tasks.workunit.client.0.vm04.stdout:3/422: chown d4/da/df/d11/d4a/d7b/f77 196470729 1 2026-03-10T06:22:56.099 INFO:tasks.workunit.client.0.vm04.stdout:8/419: sync 2026-03-10T06:22:56.101 INFO:tasks.workunit.client.0.vm04.stdout:1/392: rename d0/f1a to d0/d8/d46/f93 0 2026-03-10T06:22:56.103 INFO:tasks.workunit.client.0.vm04.stdout:8/420: fdatasync df/d15/f69 0 2026-03-10T06:22:56.103 INFO:tasks.workunit.client.0.vm04.stdout:6/395: dwrite d2/d43/d2d/d30/f39 [0,4194304] 0 2026-03-10T06:22:56.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:55 vm06.local ceph-mon[58974]: pgmap v26: 65 pgs: 65 active+clean; 2.4 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail; 11 MiB/s rd, 77 MiB/s wr, 191 op/s 2026-03-10T06:22:56.124 INFO:tasks.workunit.client.0.vm04.stdout:4/351: chown d2/l4a 67165271 1 2026-03-10T06:22:56.130 INFO:tasks.workunit.client.0.vm04.stdout:7/374: symlink d4/df/d12/d13/d25/d28/d36/l8d 0 2026-03-10T06:22:56.157 INFO:tasks.workunit.client.0.vm04.stdout:0/382: write d0/f16 [2423220,25458] 0 2026-03-10T06:22:56.160 INFO:tasks.workunit.client.0.vm04.stdout:0/383: dread - d0/d5/d25/dd/d3a/f57 zero size 2026-03-10T06:22:56.162 INFO:tasks.workunit.client.0.vm04.stdout:0/384: fsync d0/d5/d25/dd/d3a/f57 0 
2026-03-10T06:22:56.162 INFO:tasks.workunit.client.0.vm04.stdout:5/347: write d4/d6/f8 [1465256,33494] 0 2026-03-10T06:22:56.162 INFO:tasks.workunit.client.0.vm04.stdout:9/429: dwrite d2/d8/d14/f27 [0,4194304] 0 2026-03-10T06:22:56.167 INFO:tasks.workunit.client.0.vm04.stdout:6/396: mkdir d2/d43/d2d/d30/d1f/d3c/d85 0 2026-03-10T06:22:56.174 INFO:tasks.workunit.client.0.vm04.stdout:3/423: dwrite d4/da/df/d11/d4a/d7b/f1d [0,4194304] 0 2026-03-10T06:22:56.179 INFO:tasks.workunit.client.0.vm04.stdout:4/352: creat d2/d32/d5c/d4f/d51/f71 x:0 0 0 2026-03-10T06:22:56.191 INFO:tasks.workunit.client.0.vm04.stdout:2/406: unlink d1/df/f6e 0 2026-03-10T06:22:56.195 INFO:tasks.workunit.client.0.vm04.stdout:2/407: dwrite d1/db/d69/f77 [0,4194304] 0 2026-03-10T06:22:56.229 INFO:tasks.workunit.client.0.vm04.stdout:6/397: mkdir d2/d43/d86 0 2026-03-10T06:22:56.232 INFO:tasks.workunit.client.0.vm04.stdout:5/348: write d4/d11/d2a/f3d [2819724,128027] 0 2026-03-10T06:22:56.239 INFO:tasks.workunit.client.0.vm04.stdout:4/353: mknod d2/d16/d31/d42/c72 0 2026-03-10T06:22:56.239 INFO:tasks.workunit.client.0.vm04.stdout:8/421: dwrite df/d15/d29/f3a [0,4194304] 0 2026-03-10T06:22:56.240 INFO:tasks.workunit.client.0.vm04.stdout:6/398: dread d2/d43/d2d/f42 [0,4194304] 0 2026-03-10T06:22:56.242 INFO:tasks.workunit.client.0.vm04.stdout:4/354: fdatasync d2/d8/f35 0 2026-03-10T06:22:56.242 INFO:tasks.workunit.client.0.vm04.stdout:9/430: dwrite d2/d3/d18/d39/d11/f2d [0,4194304] 0 2026-03-10T06:22:56.247 INFO:tasks.workunit.client.0.vm04.stdout:9/431: chown d2/d3/d18/d34/f5f 11426093 1 2026-03-10T06:22:56.248 INFO:tasks.workunit.client.0.vm04.stdout:9/432: readlink d2/d3/d18/l2f 0 2026-03-10T06:22:56.271 INFO:tasks.workunit.client.0.vm04.stdout:9/433: dwrite d2/d23/d24/f29 [0,4194304] 0 2026-03-10T06:22:56.280 INFO:tasks.workunit.client.0.vm04.stdout:5/349: rmdir d4/d11/d2a/d38 39 2026-03-10T06:22:56.293 INFO:tasks.workunit.client.0.vm04.stdout:3/424: unlink d4/da/df/d11/d4a/d7b/f16 0 
2026-03-10T06:22:56.299 INFO:tasks.workunit.client.0.vm04.stdout:4/355: creat d2/d16/f73 x:0 0 0 2026-03-10T06:22:56.299 INFO:tasks.workunit.client.0.vm04.stdout:4/356: write d2/d16/d31/d3f/f43 [1396060,49816] 0 2026-03-10T06:22:56.299 INFO:tasks.workunit.client.0.vm04.stdout:3/425: stat d4/da/df/d11/d4a/d7b/d21/d32/d39/d64/f7d 0 2026-03-10T06:22:56.304 INFO:tasks.workunit.client.0.vm04.stdout:1/393: link d0/d3/d41/d4b/d5b/c70 d0/d8/d46/c94 0 2026-03-10T06:22:56.305 INFO:tasks.workunit.client.0.vm04.stdout:1/394: write d0/f7c [1027070,74005] 0 2026-03-10T06:22:56.312 INFO:tasks.workunit.client.0.vm04.stdout:7/375: getdents d4/df/d12/d13/d25/d30 0 2026-03-10T06:22:56.314 INFO:tasks.workunit.client.0.vm04.stdout:8/422: dread df/d15/d2b/f4a [4194304,4194304] 0 2026-03-10T06:22:56.315 INFO:tasks.workunit.client.0.vm04.stdout:7/376: readlink d4/df/d12/d13/d25/d28/d3a/d58/d68/l6d 0 2026-03-10T06:22:56.321 INFO:tasks.workunit.client.0.vm04.stdout:2/408: link d1/df/d11/d18/d35/d54/d5d/c7d d1/df/d11/d18/d35/d54/d5d/c80 0 2026-03-10T06:22:56.322 INFO:tasks.workunit.client.0.vm04.stdout:2/409: fsync d1/df/f22 0 2026-03-10T06:22:56.324 INFO:tasks.workunit.client.0.vm04.stdout:7/377: dwrite d4/fa [0,4194304] 0 2026-03-10T06:22:56.325 INFO:tasks.workunit.client.0.vm04.stdout:0/385: getdents d0/d5/d25 0 2026-03-10T06:22:56.327 INFO:tasks.workunit.client.0.vm04.stdout:9/434: symlink d2/d8/l9d 0 2026-03-10T06:22:56.335 INFO:tasks.workunit.client.0.vm04.stdout:1/395: mkdir d0/d8/d46/d7a/d95 0 2026-03-10T06:22:56.336 INFO:tasks.workunit.client.0.vm04.stdout:0/386: creat d0/d1a/d20/d38/d31/d47/f7b x:0 0 0 2026-03-10T06:22:56.340 INFO:tasks.workunit.client.0.vm04.stdout:8/423: mkdir df/d15/d2b/d81 0 2026-03-10T06:22:56.347 INFO:tasks.workunit.client.0.vm04.stdout:4/357: link d2/d32/l36 d2/d16/d2c/l74 0 2026-03-10T06:22:56.347 INFO:tasks.workunit.client.0.vm04.stdout:8/424: write df/d15/d2b/f60 [3746939,72400] 0 2026-03-10T06:22:56.347 INFO:tasks.workunit.client.0.vm04.stdout:1/396: 
truncate d0/f49 312501 0
2026-03-10T06:22:56.347 INFO:tasks.workunit.client.0.vm04.stdout:4/358: fdatasync d2/d32/d5c/f6d 0
2026-03-10T06:22:56.350 INFO:tasks.workunit.client.0.vm04.stdout:7/378: symlink d4/df/d12/d13/d25/d28/l8e 0
2026-03-10T06:22:56.351 INFO:tasks.workunit.client.0.vm04.stdout:8/425: dwrite df/d15/d29/f6f [0,4194304] 0
2026-03-10T06:22:56.353 INFO:tasks.workunit.client.0.vm04.stdout:0/387: rmdir d0/d5/d25/dd/d1d 39
2026-03-10T06:22:56.355 INFO:tasks.workunit.client.0.vm04.stdout:1/397: dwrite d0/f6a [0,4194304] 0
2026-03-10T06:22:56.361 INFO:tasks.workunit.client.0.vm04.stdout:1/398: chown d0/f8f 158078 1
2026-03-10T06:22:56.365 INFO:tasks.workunit.client.0.vm04.stdout:1/399: chown d0/d8/f76 200640185 1
2026-03-10T06:22:56.372 INFO:tasks.workunit.client.0.vm04.stdout:0/388: dwrite d0/d1a/d20/d38/d31/d47/f54 [0,4194304] 0
2026-03-10T06:22:56.374 INFO:tasks.workunit.client.0.vm04.stdout:0/389: chown d0/d5/f1f 54 1
2026-03-10T06:22:56.375 INFO:tasks.workunit.client.0.vm04.stdout:0/390: chown d0/f1b 4515 1
2026-03-10T06:22:56.375 INFO:tasks.workunit.client.0.vm04.stdout:7/379: mkdir d4/df/d12/d13/d25/d8f 0
2026-03-10T06:22:56.377 INFO:tasks.workunit.client.0.vm04.stdout:0/391: write d0/d5/d25/f23 [216739,102386] 0
2026-03-10T06:22:56.391 INFO:tasks.workunit.client.0.vm04.stdout:2/410: sync
2026-03-10T06:22:56.399 INFO:tasks.workunit.client.0.vm04.stdout:5/350: sync
2026-03-10T06:22:56.401 INFO:tasks.workunit.client.0.vm04.stdout:8/426: creat df/d20/d25/d30/d65/f82 x:0 0 0
2026-03-10T06:22:56.402 INFO:tasks.workunit.client.0.vm04.stdout:9/435: dread d2/f1e [0,4194304] 0
2026-03-10T06:22:56.403 INFO:tasks.workunit.client.0.vm04.stdout:8/427: dread - df/d20/d25/d30/d65/f80 zero size
2026-03-10T06:22:56.404 INFO:tasks.workunit.client.0.vm04.stdout:1/400: rename d0/d3/d41/d4b/l71 to d0/d3/d41/d4b/d5b/l96 0
2026-03-10T06:22:56.408 INFO:tasks.workunit.client.0.vm04.stdout:7/380: write d4/df/d12/d13/f27 [3268993,48691] 0
2026-03-10T06:22:56.414
INFO:tasks.workunit.client.0.vm04.stdout:6/399: write d2/d43/d2d/d30/d1f/d3c/f27 [1412186,27206] 0
2026-03-10T06:22:56.418 INFO:tasks.workunit.client.0.vm04.stdout:3/426: write d4/d6/dc/f41 [3600491,5319] 0
2026-03-10T06:22:56.418 INFO:tasks.workunit.client.0.vm04.stdout:9/436: mknod d2/d3/d18/d39/d11/c9e 0
2026-03-10T06:22:56.423 INFO:tasks.workunit.client.0.vm04.stdout:2/411: sync
2026-03-10T06:22:56.425 INFO:tasks.workunit.client.0.vm04.stdout:6/400: dread d2/d43/f69 [0,4194304] 0
2026-03-10T06:22:56.426 INFO:tasks.workunit.client.0.vm04.stdout:1/401: dread d0/d3/f34 [0,4194304] 0
2026-03-10T06:22:56.443 INFO:tasks.workunit.client.0.vm04.stdout:5/351: dread d4/d11/d2a/d38/d51/f70 [0,4194304] 0
2026-03-10T06:22:56.444 INFO:tasks.workunit.client.0.vm04.stdout:4/359: truncate d2/f47 4930653 0
2026-03-10T06:22:56.448 INFO:tasks.workunit.client.0.vm04.stdout:4/360: read - d2/d16/d31/d3f/f64 zero size
2026-03-10T06:22:56.448 INFO:tasks.workunit.client.0.vm04.stdout:0/392: symlink d0/d1a/d20/d38/d31/d79/l7c 0
2026-03-10T06:22:56.448 INFO:tasks.workunit.client.0.vm04.stdout:0/393: chown d0/d1a/c33 0 1
2026-03-10T06:22:56.448 INFO:tasks.workunit.client.0.vm04.stdout:4/361: dread - d2/d16/d31/d3f/f52 zero size
2026-03-10T06:22:56.451 INFO:tasks.workunit.client.0.vm04.stdout:8/428: symlink df/d15/d2b/l83 0
2026-03-10T06:22:56.452 INFO:tasks.workunit.client.0.vm04.stdout:0/394: rename d0/d5/d25/dd to d0/d5/d25/dd/d5c/d73/d7d 22
2026-03-10T06:22:56.453 INFO:tasks.workunit.client.0.vm04.stdout:0/395: write d0/d5/d25/dd/d3a/f60 [4023323,9277] 0
2026-03-10T06:22:56.455 INFO:tasks.workunit.client.0.vm04.stdout:0/396: write d0/f17 [1974982,95478] 0
2026-03-10T06:22:56.456 INFO:tasks.workunit.client.0.vm04.stdout:7/381: dwrite d4/df/d12/d13/d25/d28/d36/f88 [4194304,4194304] 0
2026-03-10T06:22:56.457 INFO:tasks.workunit.client.0.vm04.stdout:6/401: read d2/d37/d6e/f70 [2187865,31244] 0
2026-03-10T06:22:56.471 INFO:tasks.workunit.client.0.vm04.stdout:9/437: readlink d2/d3/d18/d39/l9b
0
2026-03-10T06:22:56.471 INFO:tasks.workunit.client.0.vm04.stdout:2/412: fdatasync d1/df/d11/d18/f53 0
2026-03-10T06:22:56.475 INFO:tasks.workunit.client.0.vm04.stdout:3/427: mkdir d4/da/df/d11/d4a/d7b/d21/d32/d4e/d8f 0
2026-03-10T06:22:56.479 INFO:tasks.workunit.client.0.vm04.stdout:5/352: symlink d4/d6/d50/l7c 0
2026-03-10T06:22:56.480 INFO:tasks.workunit.client.0.vm04.stdout:4/362: truncate d2/d46/f26 869018 0
2026-03-10T06:22:56.480 INFO:tasks.workunit.client.0.vm04.stdout:8/429: dread - df/d20/d25/d30/f51 zero size
2026-03-10T06:22:56.483 INFO:tasks.workunit.client.0.vm04.stdout:7/382: stat d4/df/d12/d21/c44 0
2026-03-10T06:22:56.487 INFO:tasks.workunit.client.0.vm04.stdout:4/363: dread - d2/d32/d5c/d4f/d51/f71 zero size
2026-03-10T06:22:56.487 INFO:tasks.workunit.client.0.vm04.stdout:1/402: mknod d0/d3/c97 0
2026-03-10T06:22:56.489 INFO:tasks.workunit.client.0.vm04.stdout:3/428: dread - d4/da/df/d11/d4a/d7b/f45 zero size
2026-03-10T06:22:56.490 INFO:tasks.workunit.client.0.vm04.stdout:7/383: dread d4/df/d12/d13/d25/d28/d36/f41 [0,4194304] 0
2026-03-10T06:22:56.492 INFO:tasks.workunit.client.0.vm04.stdout:3/429: chown d4/c88 15748 1
2026-03-10T06:22:56.496 INFO:tasks.workunit.client.0.vm04.stdout:2/413: dwrite d1/f57 [0,4194304] 0
2026-03-10T06:22:56.501 INFO:tasks.workunit.client.0.vm04.stdout:7/384: read d4/df/d12/d13/d25/d30/d40/d50/f5b [2361346,35890] 0
2026-03-10T06:22:56.502 INFO:tasks.workunit.client.0.vm04.stdout:5/353: dread d4/d11/d2a/f44 [0,4194304] 0
2026-03-10T06:22:56.503 INFO:tasks.workunit.client.0.vm04.stdout:2/414: chown d1/df/d11/d18/d48 36317470 1
2026-03-10T06:22:56.504 INFO:tasks.workunit.client.0.vm04.stdout:7/385: read d4/df/d12/d13/d25/d28/d3a/d58/d68/f7c [3746896,34123] 0
2026-03-10T06:22:56.513 INFO:tasks.workunit.client.0.vm04.stdout:8/430: creat df/d20/f84 x:0 0 0
2026-03-10T06:22:56.513 INFO:tasks.workunit.client.0.vm04.stdout:7/386: fsync d4/df/d12/d13/d25/d30/d40/f75 0
2026-03-10T06:22:56.516
INFO:tasks.workunit.client.0.vm04.stdout:4/364: read d2/d16/f3a [309392,41362] 0
2026-03-10T06:22:56.519 INFO:tasks.workunit.client.0.vm04.stdout:0/397: getdents d0/d1a/d20/d38/d31 0
2026-03-10T06:22:56.519 INFO:tasks.workunit.client.0.vm04.stdout:1/403: dwrite d0/f7c [0,4194304] 0
2026-03-10T06:22:56.527 INFO:tasks.workunit.client.0.vm04.stdout:3/430: symlink d4/da/df/l90 0
2026-03-10T06:22:56.528 INFO:tasks.workunit.client.0.vm04.stdout:3/431: dread - d4/d6/d38/f78 zero size
2026-03-10T06:22:56.537 INFO:tasks.workunit.client.0.vm04.stdout:1/404: dwrite d0/f23 [0,4194304] 0
2026-03-10T06:22:56.540 INFO:tasks.workunit.client.0.vm04.stdout:8/431: symlink df/d20/d25/l85 0
2026-03-10T06:22:56.540 INFO:tasks.workunit.client.0.vm04.stdout:4/365: unlink d2/d32/d5c/d4f/d51/f71 0
2026-03-10T06:22:56.540 INFO:tasks.workunit.client.0.vm04.stdout:0/398: symlink d0/d5/d25/dd/d5c/l7e 0
2026-03-10T06:22:56.542 INFO:tasks.workunit.client.0.vm04.stdout:8/432: chown df/d15/f43 67775728 1
2026-03-10T06:22:56.546 INFO:tasks.workunit.client.0.vm04.stdout:7/387: mkdir d4/df/d90 0
2026-03-10T06:22:56.549 INFO:tasks.workunit.client.0.vm04.stdout:7/388: dread - d4/df/f60 zero size
2026-03-10T06:22:56.549 INFO:tasks.workunit.client.0.vm04.stdout:7/389: write d4/fa [8923699,77083] 0
2026-03-10T06:22:56.549 INFO:tasks.workunit.client.0.vm04.stdout:3/432: mkdir d4/d6/d91 0
2026-03-10T06:22:56.549 INFO:tasks.workunit.client.0.vm04.stdout:7/390: fdatasync d4/df/d12/d13/f27 0
2026-03-10T06:22:56.553 INFO:tasks.workunit.client.0.vm04.stdout:5/354: rename d4/d11/d2a to d4/d11/d7d 0
2026-03-10T06:22:56.558 INFO:tasks.workunit.client.0.vm04.stdout:7/391: dwrite d4/df/d12/f4c [0,4194304] 0
2026-03-10T06:22:56.558 INFO:tasks.workunit.client.0.vm04.stdout:4/366: rmdir d2/d16/d56 39
2026-03-10T06:22:56.562 INFO:tasks.workunit.client.0.vm04.stdout:4/367: dread - d2/d16/f73 zero size
2026-03-10T06:22:56.563 INFO:tasks.workunit.client.0.vm04.stdout:0/399: chown d0/d5/l5e 1 1
2026-03-10T06:22:56.572
INFO:tasks.workunit.client.0.vm04.stdout:5/355: unlink d4/d6/f6f 0
2026-03-10T06:22:56.572 INFO:tasks.workunit.client.0.vm04.stdout:7/392: read d4/df/d12/d34/f57 [686835,100791] 0
2026-03-10T06:22:56.573 INFO:tasks.workunit.client.0.vm04.stdout:7/393: read d4/df/d12/d13/d25/d28/d36/f64 [103363,30088] 0
2026-03-10T06:22:56.576 INFO:tasks.workunit.client.0.vm04.stdout:4/368: creat d2/d16/d2c/f75 x:0 0 0
2026-03-10T06:22:56.577 INFO:tasks.workunit.client.0.vm04.stdout:4/369: fdatasync d2/d16/d2c/f6f 0
2026-03-10T06:22:56.579 INFO:tasks.workunit.client.0.vm04.stdout:0/400: symlink d0/d5/d25/dd/d3a/d56/l7f 0
2026-03-10T06:22:56.579 INFO:tasks.workunit.client.0.vm04.stdout:9/438: write f0 [1466726,49617] 0
2026-03-10T06:22:56.584 INFO:tasks.workunit.client.0.vm04.stdout:5/356: fsync d4/d6/f20 0
2026-03-10T06:22:56.585 INFO:tasks.workunit.client.0.vm04.stdout:5/357: dread d4/d11/d7d/f44 [0,4194304] 0
2026-03-10T06:22:56.593 INFO:tasks.workunit.client.0.vm04.stdout:6/402: truncate d2/d3a/f56 203292 0
2026-03-10T06:22:56.599 INFO:tasks.workunit.client.0.vm04.stdout:4/370: mkdir d2/d32/d5c/d76 0
2026-03-10T06:22:56.601 INFO:tasks.workunit.client.0.vm04.stdout:7/394: dread d4/df/f29 [0,4194304] 0
2026-03-10T06:22:56.606 INFO:tasks.workunit.client.0.vm04.stdout:9/439: unlink d2/d23/d24/f2b 0
2026-03-10T06:22:56.611 INFO:tasks.workunit.client.0.vm04.stdout:5/358: creat d4/d6/d37/f7e x:0 0 0
2026-03-10T06:22:56.613 INFO:tasks.workunit.client.0.vm04.stdout:6/403: creat d2/d43/d2d/d30/d1f/f87 x:0 0 0
2026-03-10T06:22:56.615 INFO:tasks.workunit.client.0.vm04.stdout:4/371: symlink d2/d46/l77 0
2026-03-10T06:22:56.620 INFO:tasks.workunit.client.0.vm04.stdout:0/401: rename d0/d1a/d20/l6b to d0/d5/d25/dd/d1d/l80 0
2026-03-10T06:22:56.622 INFO:tasks.workunit.client.0.vm04.stdout:7/395: symlink d4/df/d12/d13/d25/d30/d40/d50/l91 0
2026-03-10T06:22:56.624 INFO:tasks.workunit.client.0.vm04.stdout:7/396: chown d4/df/d12/d13/d25/d28/d3a/d58/d68/c6f 66304683 1
2026-03-10T06:22:56.627
INFO:tasks.workunit.client.0.vm04.stdout:2/415: write d1/db/fe [992663,40389] 0
2026-03-10T06:22:56.635 INFO:tasks.workunit.client.0.vm04.stdout:8/433: write df/d15/f24 [870076,91490] 0
2026-03-10T06:22:56.646 INFO:tasks.workunit.client.0.vm04.stdout:3/433: write d4/da/df/d11/f48 [837185,3633] 0
2026-03-10T06:22:56.646 INFO:tasks.workunit.client.0.vm04.stdout:0/402: write d0/d5/d25/f3c [3676056,75433] 0
2026-03-10T06:22:56.647 INFO:tasks.workunit.client.0.vm04.stdout:1/405: dwrite d0/d3/f20 [0,4194304] 0
2026-03-10T06:22:56.647 INFO:tasks.workunit.client.0.vm04.stdout:1/406: write d0/d3/f61 [2642579,59193] 0
2026-03-10T06:22:56.647 INFO:tasks.workunit.client.0.vm04.stdout:9/440: creat d2/d8/d53/d6e/d89/f9f x:0 0 0
2026-03-10T06:22:56.647 INFO:tasks.workunit.client.0.vm04.stdout:3/434: write d4/da/df/d11/d4a/d7b/d21/d2c/f7c [552018,107634] 0
2026-03-10T06:22:56.647 INFO:tasks.workunit.client.0.vm04.stdout:7/397: dwrite d4/fb [0,4194304] 0
2026-03-10T06:22:56.648 INFO:tasks.workunit.client.0.vm04.stdout:7/398: stat d4/df/d12/d13/d25/d28/d3a/d58/d68/c6f 0
2026-03-10T06:22:56.651 INFO:tasks.workunit.client.0.vm04.stdout:7/399: dread - d4/df/d12/d34/d63/f78 zero size
2026-03-10T06:22:56.657 INFO:tasks.workunit.client.0.vm04.stdout:4/372: sync
2026-03-10T06:22:56.660 INFO:tasks.workunit.client.0.vm04.stdout:2/416: truncate d1/df/d11/d18/f53 765855 0
2026-03-10T06:22:56.661 INFO:tasks.workunit.client.0.vm04.stdout:5/359: rename d4/d11/c1b to d4/d6/d48/d4c/c7f 0
2026-03-10T06:22:56.661 INFO:tasks.workunit.client.0.vm04.stdout:2/417: dread - d1/df/d2c/f5f zero size
2026-03-10T06:22:56.661 INFO:tasks.workunit.client.0.vm04.stdout:8/434: dwrite df/d15/d2b/f4c [0,4194304] 0
2026-03-10T06:22:56.661 INFO:tasks.workunit.client.0.vm04.stdout:4/373: read d2/d16/f3a [2645519,103644] 0
2026-03-10T06:22:56.663 INFO:tasks.workunit.client.0.vm04.stdout:2/418: dread d1/df/d11/f29 [0,4194304] 0
2026-03-10T06:22:56.667 INFO:tasks.workunit.client.0.vm04.stdout:4/374: readlink d2/d16/d2c/l3e
0
2026-03-10T06:22:56.667 INFO:tasks.workunit.client.0.vm04.stdout:2/419: write d1/db/d72/f7a [186825,43991] 0
2026-03-10T06:22:56.667 INFO:tasks.workunit.client.0.vm04.stdout:4/375: read - d2/d32/d5c/f6d zero size
2026-03-10T06:22:56.667 INFO:tasks.workunit.client.0.vm04.stdout:2/420: chown d1/df/d11/d14/d4e 8 1
2026-03-10T06:22:56.677 INFO:tasks.workunit.client.0.vm04.stdout:7/400: rename d4/df/l1a to d4/df/d12/d34/l92 0
2026-03-10T06:22:56.677 INFO:tasks.workunit.client.0.vm04.stdout:5/360: mkdir d4/d6/d80 0
2026-03-10T06:22:56.678 INFO:tasks.workunit.client.0.vm04.stdout:5/361: truncate d4/d6/d50/f63 915376 0
2026-03-10T06:22:56.686 INFO:tasks.workunit.client.0.vm04.stdout:5/362: dwrite d4/d6/d50/f63 [0,4194304] 0
2026-03-10T06:22:56.689 INFO:tasks.workunit.client.0.vm04.stdout:2/421: mknod d1/df/d11/d18/d35/c81 0
2026-03-10T06:22:56.692 INFO:tasks.workunit.client.0.vm04.stdout:5/363: readlink d4/d11/l14 0
2026-03-10T06:22:56.693 INFO:tasks.workunit.client.0.vm04.stdout:0/403: dread d0/f44 [0,4194304] 0
2026-03-10T06:22:56.694 INFO:tasks.workunit.client.0.vm04.stdout:5/364: mkdir d4/d6/d81 0
2026-03-10T06:22:56.695 INFO:tasks.workunit.client.0.vm04.stdout:5/365: stat d4/c46 0
2026-03-10T06:22:56.699 INFO:tasks.workunit.client.0.vm04.stdout:5/366: truncate d4/d6/d48/d55/f7a 933694 0
2026-03-10T06:22:56.707 INFO:tasks.workunit.client.0.vm04.stdout:8/435: dwrite df/d15/f5d [0,4194304] 0
2026-03-10T06:22:56.708 INFO:tasks.workunit.client.0.vm04.stdout:0/404: mkdir d0/d5/d25/dd/d3a/d81 0
2026-03-10T06:22:56.717 INFO:tasks.workunit.client.0.vm04.stdout:1/407: getdents d0/d8/d46 0
2026-03-10T06:22:56.718 INFO:tasks.workunit.client.0.vm04.stdout:1/408: fdatasync d0/d8/f21 0
2026-03-10T06:22:56.725 INFO:tasks.workunit.client.0.vm04.stdout:0/405: dwrite d0/f1b [0,4194304] 0
2026-03-10T06:22:56.734 INFO:tasks.workunit.client.0.vm04.stdout:8/436: creat df/d20/d25/d30/d65/f86 x:0 0 0
2026-03-10T06:22:56.740 INFO:tasks.workunit.client.0.vm04.stdout:3/435: getdents
d4/da/df/d11/d62 0
2026-03-10T06:22:56.745 INFO:tasks.workunit.client.0.vm04.stdout:5/367: symlink d4/d11/d7d/d52/l82 0
2026-03-10T06:22:56.749 INFO:tasks.workunit.client.0.vm04.stdout:4/376: dread d2/f4c [0,4194304] 0
2026-03-10T06:22:56.749 INFO:tasks.workunit.client.0.vm04.stdout:4/377: fdatasync d2/d46/f5d 0
2026-03-10T06:22:56.751 INFO:tasks.workunit.client.0.vm04.stdout:4/378: dread d2/d16/d31/d3f/f43 [0,4194304] 0
2026-03-10T06:22:56.754 INFO:tasks.workunit.client.0.vm04.stdout:0/406: mkdir d0/d5/d25/dd/d5c/d73/d82 0
2026-03-10T06:22:56.754 INFO:tasks.workunit.client.0.vm04.stdout:8/437: mkdir df/d20/d25/d87 0
2026-03-10T06:22:56.759 INFO:tasks.workunit.client.0.vm04.stdout:3/436: rmdir d4/da/df/d11/d4a/d7b/d21 39
2026-03-10T06:22:56.764 INFO:tasks.workunit.client.0.vm04.stdout:6/404: write d2/d8/f11 [1000412,47718] 0
2026-03-10T06:22:56.765 INFO:tasks.workunit.client.0.vm04.stdout:6/405: write d2/d37/d6e/f82 [586175,42927] 0
2026-03-10T06:22:56.768 INFO:tasks.workunit.client.0.vm04.stdout:9/441: write d2/d3/d18/d39/d11/d42/f5b [832061,61284] 0
2026-03-10T06:22:56.774 INFO:tasks.workunit.client.0.vm04.stdout:6/406: dwrite d2/d8/f11 [0,4194304] 0
2026-03-10T06:22:56.794 INFO:tasks.workunit.client.0.vm04.stdout:7/401: dwrite d4/df/d12/f1c [0,4194304] 0
2026-03-10T06:22:56.819 INFO:tasks.workunit.client.0.vm04.stdout:5/368: creat d4/d6/d48/d4c/f83 x:0 0 0
2026-03-10T06:22:56.820 INFO:tasks.workunit.client.0.vm04.stdout:5/369: stat d4/d6/d81 0
2026-03-10T06:22:56.826 INFO:tasks.workunit.client.0.vm04.stdout:2/422: truncate d1/df/d2c/f3d 3050769 0
2026-03-10T06:22:56.833 INFO:tasks.workunit.client.0.vm04.stdout:0/407: write d0/d5/d25/dd/d1d/f26 [3450855,79886] 0
2026-03-10T06:22:56.833 INFO:tasks.workunit.client.0.vm04.stdout:1/409: creat d0/d3/f98 x:0 0 0
2026-03-10T06:22:56.834 INFO:tasks.workunit.client.0.vm04.stdout:8/438: mknod df/d20/d25/c88 0
2026-03-10T06:22:56.836 INFO:tasks.workunit.client.0.vm04.stdout:3/437: truncate d4/da/df/d11/f57 729042 0
2026-03-10T06:22:56.851 INFO:tasks.workunit.client.0.vm04.stdout:6/407: rename d2/d43/d2d/l4f to d2/d37/d83/l88 0
2026-03-10T06:22:56.852 INFO:tasks.workunit.client.0.vm04.stdout:6/408: stat d2/d43/d2d/d30/d34/d76 0
2026-03-10T06:22:56.863 INFO:tasks.workunit.client.0.vm04.stdout:6/409: dread d2/f5f [0,4194304] 0
2026-03-10T06:22:56.864 INFO:tasks.workunit.client.0.vm04.stdout:6/410: fdatasync d2/d43/d2d/d30/d1f/f3f 0
2026-03-10T06:22:56.870 INFO:tasks.workunit.client.0.vm04.stdout:2/423: symlink d1/db/d72/l82 0
2026-03-10T06:22:56.873 INFO:tasks.workunit.client.0.vm04.stdout:0/408: creat d0/d1a/f83 x:0 0 0
2026-03-10T06:22:56.874 INFO:tasks.workunit.client.0.vm04.stdout:1/410: mkdir d0/d3/d41/d99 0
2026-03-10T06:22:56.875 INFO:tasks.workunit.client.0.vm04.stdout:3/438: rmdir d4/da/df 39
2026-03-10T06:22:56.879 INFO:tasks.workunit.client.0.vm04.stdout:4/379: dwrite d2/f14 [0,4194304] 0
2026-03-10T06:22:56.882 INFO:tasks.workunit.client.0.vm04.stdout:7/402: symlink d4/df/d12/d13/d25/d8f/l93 0
2026-03-10T06:22:56.882 INFO:tasks.workunit.client.0.vm04.stdout:5/370: mkdir d4/d6/d80/d84 0
2026-03-10T06:22:56.890 INFO:tasks.workunit.client.0.vm04.stdout:8/439: dread df/d15/d2b/f4d [0,4194304] 0
2026-03-10T06:22:56.899 INFO:tasks.workunit.client.0.vm04.stdout:2/424: rename d1/db/f1e to d1/db/d72/f83 0
2026-03-10T06:22:56.902 INFO:tasks.workunit.client.0.vm04.stdout:0/409: creat d0/d5/d25/dd/d3a/d56/f84 x:0 0 0
2026-03-10T06:22:56.903 INFO:tasks.workunit.client.0.vm04.stdout:4/380: fsync d2/d46/f26 0
2026-03-10T06:22:56.911 INFO:tasks.workunit.client.0.vm04.stdout:9/442: link d2/d3/d18/d39/d11/d42/f5e d2/d3/d18/fa0 0
2026-03-10T06:22:56.913 INFO:tasks.workunit.client.0.vm04.stdout:0/410: dwrite d0/d5/d25/dd/d3a/f60 [4194304,4194304] 0
2026-03-10T06:22:56.913 INFO:tasks.workunit.client.0.vm04.stdout:1/411: dwrite d0/d3/f34 [4194304,4194304] 0
2026-03-10T06:22:56.923 INFO:tasks.workunit.client.0.vm04.stdout:5/371: symlink d4/d6/d50/l85 0
2026-03-10T06:22:56.924
INFO:tasks.workunit.client.0.vm04.stdout:7/403: creat d4/df/d12/d21/f94 x:0 0 0
2026-03-10T06:22:56.926 INFO:tasks.workunit.client.0.vm04.stdout:5/372: dwrite d4/d6/d48/d55/f5d [0,4194304] 0
2026-03-10T06:22:56.927 INFO:tasks.workunit.client.0.vm04.stdout:5/373: write d4/d3b/f6d [488183,80780] 0
2026-03-10T06:22:56.932 INFO:tasks.workunit.client.0.vm04.stdout:2/425: rename d1/df/d2c/d37/d40/c5b to d1/db/d20/c84 0
2026-03-10T06:22:56.933 INFO:tasks.workunit.client.0.vm04.stdout:6/411: mknod d2/d3a/c89 0
2026-03-10T06:22:56.934 INFO:tasks.workunit.client.0.vm04.stdout:6/412: write d2/d43/f35 [701041,65260] 0
2026-03-10T06:22:56.934 INFO:tasks.workunit.client.0.vm04.stdout:2/426: readlink d1/df/d2c/d37/d40/l66 0
2026-03-10T06:22:56.938 INFO:tasks.workunit.client.0.vm04.stdout:3/439: dwrite d4/da/df/d11/d62/f69 [0,4194304] 0
2026-03-10T06:22:56.947 INFO:tasks.workunit.client.0.vm04.stdout:1/412: chown d0/d3/d41/d4b/d5b/c70 98208 1
2026-03-10T06:22:56.948 INFO:tasks.workunit.client.0.vm04.stdout:1/413: write d0/d8/d46/f57 [1494091,81690] 0
2026-03-10T06:22:56.966 INFO:tasks.workunit.client.0.vm04.stdout:8/440: mkdir df/d15/d29/d89 0
2026-03-10T06:22:56.973 INFO:tasks.workunit.client.0.vm04.stdout:8/441: dread df/d15/d2b/f4d [0,4194304] 0
2026-03-10T06:22:56.973 INFO:tasks.workunit.client.0.vm04.stdout:8/442: write df/f46 [5204171,13395] 0
2026-03-10T06:22:56.973 INFO:tasks.workunit.client.0.vm04.stdout:8/443: write df/d20/f5e [580401,53048] 0
2026-03-10T06:22:56.979 INFO:tasks.workunit.client.0.vm04.stdout:5/374: symlink d4/d11/d7d/d38/l86 0
2026-03-10T06:22:56.980 INFO:tasks.workunit.client.0.vm04.stdout:5/375: dread - d4/d6/d48/d4c/f83 zero size
2026-03-10T06:22:56.981 INFO:tasks.workunit.client.0.vm04.stdout:5/376: write d4/d6/fa [4668920,107886] 0
2026-03-10T06:22:56.984 INFO:tasks.workunit.client.0.vm04.stdout:5/377: dwrite d4/d6/d50/f63 [0,4194304] 0
2026-03-10T06:22:56.996 INFO:tasks.workunit.client.0.vm04.stdout:4/381: rename d2/d16/d2c/l3e to d2/d32/l78 0
2026-03-10T06:22:56.997 INFO:tasks.workunit.client.0.vm04.stdout:4/382: dread - d2/d32/d5c/d4f/f60 zero size
2026-03-10T06:22:56.998 INFO:tasks.workunit.client.0.vm04.stdout:4/383: read d2/d8/f35 [1685296,45830] 0
2026-03-10T06:22:57.001 INFO:tasks.workunit.client.0.vm04.stdout:4/384: read - d2/d16/d31/f66 zero size
2026-03-10T06:22:57.001 INFO:tasks.workunit.client.0.vm04.stdout:3/440: mkdir d4/d6/d92 0
2026-03-10T06:22:57.026 INFO:tasks.workunit.client.0.vm04.stdout:5/378: creat d4/d6/f87 x:0 0 0
2026-03-10T06:22:57.030 INFO:tasks.workunit.client.0.vm04.stdout:7/404: write d4/df/d12/d13/d25/d30/d40/d50/f62 [5173472,121197] 0
2026-03-10T06:22:57.032 INFO:tasks.workunit.client.0.vm04.stdout:7/405: write d4/df/d12/d13/d25/d28/d36/f4d [2210094,60890] 0
2026-03-10T06:22:57.033 INFO:tasks.workunit.client.0.vm04.stdout:7/406: stat d4/df/d12/d21/f2a 0
2026-03-10T06:22:57.043 INFO:tasks.workunit.client.0.vm04.stdout:6/413: truncate d2/d43/d2d/d30/f32 984318 0
2026-03-10T06:22:57.044 INFO:tasks.workunit.client.0.vm04.stdout:4/385: symlink d2/d46/l79 0
2026-03-10T06:22:57.046 INFO:tasks.workunit.client.0.vm04.stdout:8/444: mkdir df/d15/d2b/d8a 0
2026-03-10T06:22:57.047 INFO:tasks.workunit.client.0.vm04.stdout:8/445: fsync df/d20/d25/d30/d65/f86 0
2026-03-10T06:22:57.048 INFO:tasks.workunit.client.0.vm04.stdout:8/446: dread - df/d15/d2b/f7e zero size
2026-03-10T06:22:57.051 INFO:tasks.workunit.client.0.vm04.stdout:8/447: chown df/d15/l3d 21671884 1
2026-03-10T06:22:57.051 INFO:tasks.workunit.client.0.vm04.stdout:9/443: chown d2/d8/d14/d6c/c8c 102 1
2026-03-10T06:22:57.054 INFO:tasks.workunit.client.0.vm04.stdout:8/448: write df/d20/d25/d30/f6b [337948,72371] 0
2026-03-10T06:22:57.055 INFO:tasks.workunit.client.0.vm04.stdout:8/449: truncate df/f17 4837074 0
2026-03-10T06:22:57.056 INFO:tasks.workunit.client.0.vm04.stdout:6/414: dread d2/d43/d2d/d30/d1f/d3c/d75/f59 [0,4194304] 0
2026-03-10T06:22:57.057 INFO:tasks.workunit.client.0.vm04.stdout:2/427: truncate d1/df/f22 614525 0
2026-03-10T06:22:57.057 INFO:tasks.workunit.client.0.vm04.stdout:2/428: readlink d1/db/d20/l39 0
2026-03-10T06:22:57.058 INFO:tasks.workunit.client.0.vm04.stdout:6/415: write d2/d3a/f57 [1483387,20063] 0
2026-03-10T06:22:57.063 INFO:tasks.workunit.client.0.vm04.stdout:6/416: dread d2/d43/d2d/d30/d1f/f3f [0,4194304] 0
2026-03-10T06:22:57.064 INFO:tasks.workunit.client.0.vm04.stdout:6/417: chown d2/d3a/l72 3665 1
2026-03-10T06:22:57.064 INFO:tasks.workunit.client.0.vm04.stdout:7/407: creat d4/df/d12/d13/d25/f95 x:0 0 0
2026-03-10T06:22:57.065 INFO:tasks.workunit.client.0.vm04.stdout:7/408: chown d4/df/d12/d34/f46 4358 1
2026-03-10T06:22:57.071 INFO:tasks.workunit.client.0.vm04.stdout:5/379: dwrite d4/d6/f7 [0,4194304] 0
2026-03-10T06:22:57.072 INFO:tasks.workunit.client.0.vm04.stdout:5/380: chown d4/d11/d7d/d38/d51/f70 31784 1
2026-03-10T06:22:57.076 INFO:tasks.workunit.client.0.vm04.stdout:5/381: chown d4/d6/d48/f5e 351 1
2026-03-10T06:22:57.082 INFO:tasks.workunit.client.0.vm04.stdout:4/386: rmdir d2/d16/d31/d3f 39
2026-03-10T06:22:57.092 INFO:tasks.workunit.client.0.vm04.stdout:3/441: mknod d4/da/df/d11/d4a/d7b/c93 0
2026-03-10T06:22:57.101 INFO:tasks.workunit.client.0.vm04.stdout:8/450: chown df/d15/d2b/f56 12080 1
2026-03-10T06:22:57.101 INFO:tasks.workunit.client.0.vm04.stdout:8/451: chown df/d15/d2b 36002 1
2026-03-10T06:22:57.108 INFO:tasks.workunit.client.0.vm04.stdout:0/411: rename d0/d5/d25/dd/d3a/f77 to d0/d1a/d20/f85 0
2026-03-10T06:22:57.114 INFO:tasks.workunit.client.0.vm04.stdout:2/429: symlink d1/df/d11/d14/d4e/l85 0
2026-03-10T06:22:57.116 INFO:tasks.workunit.client.0.vm04.stdout:7/409: mknod d4/df/d12/d13/d25/d30/c9d 0
2026-03-10T06:22:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:56 vm06.local ceph-mon[58974]: pgmap v27: 65 pgs: 65 active+clean; 2.6 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 27 MiB/s rd, 123 MiB/s wr, 286 op/s
2026-03-10T06:22:57.117 INFO:tasks.workunit.client.0.vm04.stdout:7/410: fdatasync d4/df/f84 0
2026-03-10T06:22:57.118 INFO:tasks.workunit.client.0.vm04.stdout:1/414: getdents d0/d8/d46/d7a 0
2026-03-10T06:22:57.119 INFO:tasks.workunit.client.0.vm04.stdout:7/411: write d4/df/d12/d13/d25/f2f [1164358,48969] 0
2026-03-10T06:22:57.119 INFO:tasks.workunit.client.0.vm04.stdout:4/387: truncate d2/d16/f3a 1684819 0
2026-03-10T06:22:57.120 INFO:tasks.workunit.client.0.vm04.stdout:4/388: chown d2/d46/f18 172643 1
2026-03-10T06:22:57.124 INFO:tasks.workunit.client.0.vm04.stdout:6/418: write d2/d37/d6e/f70 [2782946,118971] 0
2026-03-10T06:22:57.130 INFO:tasks.workunit.client.0.vm04.stdout:9/444: write d2/d3/d18/d39/d11/d42/f5e [1282756,7340] 0
2026-03-10T06:22:57.132 INFO:tasks.workunit.client.0.vm04.stdout:5/382: rename d4/d6/f7 to d4/d6/d48/d4c/f88 0
2026-03-10T06:22:57.134 INFO:tasks.workunit.client.0.vm04.stdout:2/430: creat d1/db/d20/f86 x:0 0 0
2026-03-10T06:22:57.136 INFO:tasks.workunit.client.0.vm04.stdout:2/431: truncate d1/df/d2c/f58 261381 0
2026-03-10T06:22:57.143 INFO:tasks.workunit.client.0.vm04.stdout:1/415: fdatasync d0/f2e 0
2026-03-10T06:22:57.149 INFO:tasks.workunit.client.0.vm04.stdout:4/389: mknod d2/d46/c7a 0
2026-03-10T06:22:57.149 INFO:tasks.workunit.client.0.vm04.stdout:4/390: write d2/d16/f73 [659130,88310] 0
2026-03-10T06:22:57.155 INFO:tasks.workunit.client.0.vm04.stdout:7/412: creat d4/df/d12/d13/d25/d28/d3a/d58/f97 x:0 0 0
2026-03-10T06:22:57.155 INFO:tasks.workunit.client.0.vm04.stdout:3/442: mknod d4/da/df/d11/d4a/d7b/d21/d2c/c94 0
2026-03-10T06:22:57.156 INFO:tasks.workunit.client.0.vm04.stdout:6/419: mkdir d2/d43/d2d/d30/d34/d76/d8a 0
2026-03-10T06:22:57.157 INFO:tasks.workunit.client.0.vm04.stdout:6/420: chown d2/d37/d6e/f82 145096 1
2026-03-10T06:22:57.157 INFO:tasks.workunit.client.0.vm04.stdout:6/421: fdatasync d2/d43/f4b 0
2026-03-10T06:22:57.165 INFO:tasks.workunit.client.0.vm04.stdout:0/412: rename d0/d5/l5e to d0/d5/d25/l86 0
2026-03-10T06:22:57.166 INFO:tasks.workunit.client.0.vm04.stdout:5/383: symlink d4/d11/d7d/d38/d51/l89
0
2026-03-10T06:22:57.167 INFO:tasks.workunit.client.0.vm04.stdout:8/452: dwrite df/d20/d25/d30/d55/f5b [0,4194304] 0
2026-03-10T06:22:57.170 INFO:tasks.workunit.client.0.vm04.stdout:1/416: creat d0/f9a x:0 0 0
2026-03-10T06:22:57.171 INFO:tasks.workunit.client.0.vm04.stdout:4/391: symlink d2/d32/l7b 0
2026-03-10T06:22:57.180 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:56 vm04.local ceph-mon[51058]: pgmap v27: 65 pgs: 65 active+clean; 2.6 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 27 MiB/s rd, 123 MiB/s wr, 286 op/s
2026-03-10T06:22:57.180 INFO:tasks.workunit.client.0.vm04.stdout:9/445: rmdir d2/d8/d22/d4f 39
2026-03-10T06:22:57.181 INFO:tasks.workunit.client.0.vm04.stdout:5/384: symlink d4/d11/d7d/d38/l8a 0
2026-03-10T06:22:57.181 INFO:tasks.workunit.client.0.vm04.stdout:0/413: chown d0/d1a/f3b 34941977 1
2026-03-10T06:22:57.182 INFO:tasks.workunit.client.0.vm04.stdout:2/432: mkdir d1/db/d69/d74/d87 0
2026-03-10T06:22:57.184 INFO:tasks.workunit.client.0.vm04.stdout:8/453: mknod df/d20/d25/d30/d65/c8b 0
2026-03-10T06:22:57.187 INFO:tasks.workunit.client.0.vm04.stdout:1/417: rename d0/d8/c2b to d0/d8/d46/d7a/c9b 0
2026-03-10T06:22:57.191 INFO:tasks.workunit.client.0.vm04.stdout:8/454: dwrite df/d15/d2b/f7e [0,4194304] 0
2026-03-10T06:22:57.191 INFO:tasks.workunit.client.0.vm04.stdout:3/443: creat d4/da/df/d11/d4a/d7b/d21/d32/d8e/f95 x:0 0 0
2026-03-10T06:22:57.198 INFO:tasks.workunit.client.0.vm04.stdout:8/455: dread df/d15/d29/f3c [0,4194304] 0
2026-03-10T06:22:57.201 INFO:tasks.workunit.client.0.vm04.stdout:4/392: dwrite d2/f12 [0,4194304] 0
2026-03-10T06:22:57.204 INFO:tasks.workunit.client.0.vm04.stdout:4/393: write d2/d16/d2c/f75 [808865,103987] 0
2026-03-10T06:22:57.207 INFO:tasks.workunit.client.0.vm04.stdout:4/394: dread - d2/d32/d5c/d4f/d51/f65 zero size
2026-03-10T06:22:57.208 INFO:tasks.workunit.client.0.vm04.stdout:4/395: dread - d2/d8/f54 zero size
2026-03-10T06:22:57.220 INFO:tasks.workunit.client.0.vm04.stdout:6/422: truncate
d2/d3a/f56 128092 0
2026-03-10T06:22:57.220 INFO:tasks.workunit.client.0.vm04.stdout:6/423: stat d2/d43/d2d 0
2026-03-10T06:22:57.224 INFO:tasks.workunit.client.0.vm04.stdout:9/446: creat d2/d8/d14/d1d/d64/fa1 x:0 0 0
2026-03-10T06:22:57.232 INFO:tasks.workunit.client.0.vm04.stdout:3/444: dread d4/d6/dc/f41 [0,4194304] 0
2026-03-10T06:22:57.233 INFO:tasks.workunit.client.0.vm04.stdout:0/414: mknod d0/d1a/d4d/c87 0
2026-03-10T06:22:57.235 INFO:tasks.workunit.client.0.vm04.stdout:5/385: dwrite d4/d11/f18 [0,4194304] 0
2026-03-10T06:22:57.241 INFO:tasks.workunit.client.0.vm04.stdout:1/418: truncate d0/d3/f24 4594444 0
2026-03-10T06:22:57.241 INFO:tasks.workunit.client.0.vm04.stdout:5/386: truncate d4/d11/d7d/d38/d51/f70 4702531 0
2026-03-10T06:22:57.242 INFO:tasks.workunit.client.0.vm04.stdout:1/419: fsync d0/d3/d41/d4b/d5b/f5c 0
2026-03-10T06:22:57.248 INFO:tasks.workunit.client.0.vm04.stdout:1/420: dread - d0/d3/f98 zero size
2026-03-10T06:22:57.248 INFO:tasks.workunit.client.0.vm04.stdout:2/433: dread d1/df/f22 [0,4194304] 0
2026-03-10T06:22:57.249 INFO:tasks.workunit.client.0.vm04.stdout:0/415: dwrite d0/d5/d25/dd/d5c/d73/f53 [0,4194304] 0
2026-03-10T06:22:57.261 INFO:tasks.workunit.client.0.vm04.stdout:4/396: creat d2/d32/f7c x:0 0 0
2026-03-10T06:22:57.262 INFO:tasks.workunit.client.0.vm04.stdout:4/397: dread - d2/d46/f61 zero size
2026-03-10T06:22:57.265 INFO:tasks.workunit.client.0.vm04.stdout:6/424: creat d2/d43/d2d/d7c/f8b x:0 0 0
2026-03-10T06:22:57.265 INFO:tasks.workunit.client.0.vm04.stdout:6/425: write d2/d43/f24 [3162126,89653] 0
2026-03-10T06:22:57.267 INFO:tasks.workunit.client.0.vm04.stdout:1/421: dread d0/f29 [0,4194304] 0
2026-03-10T06:22:57.271 INFO:tasks.workunit.client.0.vm04.stdout:7/413: truncate d4/df/d12/f4c 1787374 0
2026-03-10T06:22:57.279 INFO:tasks.workunit.client.0.vm04.stdout:3/445: rename d4/l40 to d4/da/df/d11/d4a/d7b/d21/d32/d39/l96 0
2026-03-10T06:22:57.288 INFO:tasks.workunit.client.0.vm04.stdout:6/426: dread
d2/d43/d2d/d30/d34/f4d [0,4194304] 0
2026-03-10T06:22:57.291 INFO:tasks.workunit.client.0.vm04.stdout:6/427: chown d2/d8/l48 7023 1
2026-03-10T06:22:57.296 INFO:tasks.workunit.client.0.vm04.stdout:7/414: dwrite d4/df/d12/d13/d25/d28/d36/f4d [0,4194304] 0
2026-03-10T06:22:57.324 INFO:tasks.workunit.client.0.vm04.stdout:8/456: mknod df/d15/d29/c8c 0
2026-03-10T06:22:57.324 INFO:tasks.workunit.client.0.vm04.stdout:4/398: fdatasync d2/d32/d5c/f6a 0
2026-03-10T06:22:57.324 INFO:tasks.workunit.client.0.vm04.stdout:8/457: chown df/d20/d25/d30 116475 1
2026-03-10T06:22:57.324 INFO:tasks.workunit.client.0.vm04.stdout:4/399: chown d2/d16/d2c/f75 24 1
2026-03-10T06:22:57.325 INFO:tasks.workunit.client.0.vm04.stdout:4/400: chown d2/d46/ce 224 1
2026-03-10T06:22:57.325 INFO:tasks.workunit.client.0.vm04.stdout:6/428: fdatasync d2/d43/d2d/d30/d1f/d3c/f65 0
2026-03-10T06:22:57.326 INFO:tasks.workunit.client.0.vm04.stdout:6/429: stat d2/d43/d2d/d30/f60 0
2026-03-10T06:22:57.326 INFO:tasks.workunit.client.0.vm04.stdout:9/447: unlink d2/d8/d3a/f6b 0
2026-03-10T06:22:57.326 INFO:tasks.workunit.client.0.vm04.stdout:6/430: dread - d2/d37/d6e/f77 zero size
2026-03-10T06:22:57.327 INFO:tasks.workunit.client.0.vm04.stdout:7/415: mknod d4/df/d12/d34/d63/c98 0
2026-03-10T06:22:57.339 INFO:tasks.workunit.client.0.vm04.stdout:0/416: dread d0/d1a/f27 [0,4194304] 0
2026-03-10T06:22:57.343 INFO:tasks.workunit.client.0.vm04.stdout:9/448: dwrite d2/d3/d18/d34/f97 [0,4194304] 0
2026-03-10T06:22:57.346 INFO:tasks.workunit.client.0.vm04.stdout:1/422: mknod d0/d3/d41/d99/c9c 0
2026-03-10T06:22:57.352 INFO:tasks.workunit.client.0.vm04.stdout:9/449: truncate d2/d8/d22/f75 1012447 0
2026-03-10T06:22:57.352 INFO:tasks.workunit.client.0.vm04.stdout:7/416: creat d4/df/d12/d13/d25/d28/d3a/d58/d68/f99 x:0 0 0
2026-03-10T06:22:57.354 INFO:tasks.workunit.client.0.vm04.stdout:2/434: creat d1/df/f88 x:0 0 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:8/458: getdents df/d15/d29/d89 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:5/387: getdents d4/d6 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:7/417: creat d4/df/d12/d34/d63/f9a x:0 0 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:6/431: link d2/d8/l5b d2/d43/d2d/d30/d1f/d3c/d75/l8c 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:6/432: chown d2/d43/f4b 2 1
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:5/388: rmdir d4/d6/d48 39
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:9/450: mkdir d2/d23/d24/da2 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:7/418: symlink d4/df/d12/d13/d25/d28/d3a/l9b 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:9/451: chown d2/d8/d14/d6c 84804060 1
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:9/452: chown d2/d3/c8b 12 1
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:9/453: fdatasync d2/d23/f31 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:8/459: getdents df/d15/d2b/d81 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:4/401: rename d2/d32/d5c/c6c to d2/d16/d31/c7d 0
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:9/454: dread - d2/d8/d14/d1d/d64/d73/f9a zero size
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:4/402: chown d2/d16/f20 9308 1
2026-03-10T06:22:57.376 INFO:tasks.workunit.client.0.vm04.stdout:7/419: mkdir d4/df/d12/d13/d25/d28/d36/d9c 0
2026-03-10T06:22:57.383 INFO:tasks.workunit.client.0.vm04.stdout:0/417: rename d0/f44 to d0/d5/d25/dd/d3a/d56/f88 0
2026-03-10T06:22:57.383 INFO:tasks.workunit.client.0.vm04.stdout:6/433: symlink d2/d43/d2d/d30/l8d 0
2026-03-10T06:22:57.383 INFO:tasks.workunit.client.0.vm04.stdout:0/418: stat d0/d5 0
2026-03-10T06:22:57.384 INFO:tasks.workunit.client.0.vm04.stdout:5/389: dread d4/d6/d48/d55/f5d [0,4194304] 0
2026-03-10T06:22:57.385
INFO:tasks.workunit.client.0.vm04.stdout:0/419: chown d0/d5/d25/dd/d5c/d73/f4f 5774 1 2026-03-10T06:22:57.388 INFO:tasks.workunit.client.0.vm04.stdout:1/423: dread d0/d3/f24 [0,4194304] 0 2026-03-10T06:22:57.389 INFO:tasks.workunit.client.0.vm04.stdout:6/434: dread d2/d43/d2d/d30/f60 [0,4194304] 0 2026-03-10T06:22:57.391 INFO:tasks.workunit.client.0.vm04.stdout:2/435: dread d1/df/d11/f16 [0,4194304] 0 2026-03-10T06:22:57.391 INFO:tasks.workunit.client.0.vm04.stdout:6/435: fsync d2/d43/d2d/d7c/f8b 0 2026-03-10T06:22:57.392 INFO:tasks.workunit.client.0.vm04.stdout:2/436: fdatasync d1/df/f24 0 2026-03-10T06:22:57.393 INFO:tasks.workunit.client.0.vm04.stdout:1/424: fsync d0/d3/d41/d4b/f6b 0 2026-03-10T06:22:57.393 INFO:tasks.workunit.client.0.vm04.stdout:9/455: mkdir d2/d8/d14/da3 0 2026-03-10T06:22:57.394 INFO:tasks.workunit.client.0.vm04.stdout:7/420: mknod d4/df/d12/d13/d25/d30/c9d 0 2026-03-10T06:22:57.396 INFO:tasks.workunit.client.0.vm04.stdout:5/390: fsync d4/d3b/f71 0 2026-03-10T06:22:57.396 INFO:tasks.workunit.client.0.vm04.stdout:0/420: creat d0/d1a/d20/d38/d31/d47/f89 x:0 0 0 2026-03-10T06:22:57.397 INFO:tasks.workunit.client.0.vm04.stdout:6/436: unlink d2/l68 0 2026-03-10T06:22:57.402 INFO:tasks.workunit.client.0.vm04.stdout:2/437: unlink d1/df/f22 0 2026-03-10T06:22:57.409 INFO:tasks.workunit.client.0.vm04.stdout:1/425: creat d0/d3/d41/f9d x:0 0 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:4/403: unlink d2/l38 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:9/456: unlink d2/d3/d18/d39/l9b 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:0/421: mkdir d0/d1a/d20/d38/d31/d47/d8a 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:6/437: creat d2/d37/d83/f8e x:0 0 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:0/422: stat d0/d5/d25/dd/d3a/f50 0 2026-03-10T06:22:57.410 INFO:tasks.workunit.client.0.vm04.stdout:9/457: mknod d2/d8/d53/d6e/d89/ca4 0 2026-03-10T06:22:57.412 
INFO:tasks.workunit.client.0.vm04.stdout:5/391: dwrite d4/d6/d50/f63 [0,4194304] 0 2026-03-10T06:22:57.414 INFO:tasks.workunit.client.0.vm04.stdout:6/438: mknod d2/d43/c8f 0 2026-03-10T06:22:57.414 INFO:tasks.workunit.client.0.vm04.stdout:7/421: creat d4/df/d12/d13/d25/d28/f9e x:0 0 0 2026-03-10T06:22:57.415 INFO:tasks.workunit.client.0.vm04.stdout:7/422: read d4/fb [873878,12028] 0 2026-03-10T06:22:57.425 INFO:tasks.workunit.client.0.vm04.stdout:0/423: truncate d0/d5/d25/dd/f13 39257 0 2026-03-10T06:22:57.449 INFO:tasks.workunit.client.0.vm04.stdout:2/438: symlink d1/db/d20/d75/l89 0 2026-03-10T06:22:57.449 INFO:tasks.workunit.client.0.vm04.stdout:2/439: fsync d1/db/d20/f49 0 2026-03-10T06:22:57.449 INFO:tasks.workunit.client.0.vm04.stdout:9/458: mkdir d2/d3/d18/d39/d11/da5 0 2026-03-10T06:22:57.449 INFO:tasks.workunit.client.0.vm04.stdout:4/404: creat d2/d16/d56/f7e x:0 0 0 2026-03-10T06:22:57.449 INFO:tasks.workunit.client.0.vm04.stdout:0/424: mknod d0/d1a/c8b 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:4/405: creat d2/d16/d56/f7f x:0 0 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:5/392: link d4/d6/d48/d55/f6e d4/d11/d7d/d38/f8b 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:5/393: dread - d4/d6/d48/d55/f68 zero size 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:4/406: write d2/d46/f15 [1883220,26370] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:6/439: creat d2/d3a/f90 x:0 0 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:0/425: write d0/d1a/d20/f85 [647415,95782] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:9/459: dread d2/d3/d18/d34/f97 [0,4194304] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:5/394: read d4/d11/f34 [2646785,19019] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:0/426: rmdir d0/d5/d25/dd/d5c/d73 39 2026-03-10T06:22:57.450 
INFO:tasks.workunit.client.0.vm04.stdout:5/395: chown d4/d11/d7d/d38/f3e 235683 1 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:6/440: write d2/d43/d2d/d30/f60 [1795426,68859] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:1/426: dwrite d0/d8/f76 [4194304,4194304] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:7/423: dread d4/f5 [0,4194304] 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:5/396: symlink d4/d6/d50/l8c 0 2026-03-10T06:22:57.450 INFO:tasks.workunit.client.0.vm04.stdout:6/441: stat d2/d43/d2d/d30/d1f/d3c/d75/l4e 0 2026-03-10T06:22:57.452 INFO:tasks.workunit.client.0.vm04.stdout:1/427: dwrite d0/f5 [0,4194304] 0 2026-03-10T06:22:57.458 INFO:tasks.workunit.client.0.vm04.stdout:1/428: dwrite d0/f5 [0,4194304] 0 2026-03-10T06:22:57.478 INFO:tasks.workunit.client.0.vm04.stdout:6/442: fsync d2/d43/d2d/d30/d1f/d3c/d75/f59 0 2026-03-10T06:22:57.479 INFO:tasks.workunit.client.0.vm04.stdout:6/443: read d2/d8/f11 [613350,57458] 0 2026-03-10T06:22:57.479 INFO:tasks.workunit.client.0.vm04.stdout:9/460: getdents d2/d23 0 2026-03-10T06:22:57.479 INFO:tasks.workunit.client.0.vm04.stdout:5/397: dread d4/d3b/f71 [0,4194304] 0 2026-03-10T06:22:57.479 INFO:tasks.workunit.client.0.vm04.stdout:1/429: mkdir d0/d3/d41/d4b/d9e 0 2026-03-10T06:22:57.482 INFO:tasks.workunit.client.0.vm04.stdout:0/427: rename d0/d5/d25/dd/d1d/f5a to d0/d1a/d20/f8c 0 2026-03-10T06:22:57.488 INFO:tasks.workunit.client.0.vm04.stdout:0/428: chown d0/d5/d25/dd/d3a/d56/f84 1 1 2026-03-10T06:22:57.489 INFO:tasks.workunit.client.0.vm04.stdout:7/424: creat d4/df/d12/d13/d25/d28/d36/d9c/f9f x:0 0 0 2026-03-10T06:22:57.494 INFO:tasks.workunit.client.0.vm04.stdout:1/430: dwrite d0/d3/d41/f47 [4194304,4194304] 0 2026-03-10T06:22:57.504 INFO:tasks.workunit.client.0.vm04.stdout:5/398: rename d4/c24 to d4/d6/d80/c8d 0 2026-03-10T06:22:57.504 INFO:tasks.workunit.client.0.vm04.stdout:5/399: stat d4/d6/d48/c5f 0 2026-03-10T06:22:57.505 
INFO:tasks.workunit.client.0.vm04.stdout:5/400: stat d4/d6/d37 0 2026-03-10T06:22:57.508 INFO:tasks.workunit.client.0.vm04.stdout:7/425: mknod d4/df/d12/d13/d25/ca0 0 2026-03-10T06:22:57.509 INFO:tasks.workunit.client.0.vm04.stdout:7/426: chown d4/df/d12/f7f 342662123 1 2026-03-10T06:22:57.512 INFO:tasks.workunit.client.0.vm04.stdout:9/461: getdents d2/d23/d24/da2 0 2026-03-10T06:22:57.519 INFO:tasks.workunit.client.0.vm04.stdout:0/429: mkdir d0/d1a/d20/d38/d31/d47/d8a/d8d 0 2026-03-10T06:22:57.519 INFO:tasks.workunit.client.0.vm04.stdout:9/462: truncate d2/d8/d53/d6e/f7d 2079304 0 2026-03-10T06:22:57.519 INFO:tasks.workunit.client.0.vm04.stdout:9/463: readlink d2/d8/d14/l68 0 2026-03-10T06:22:57.519 INFO:tasks.workunit.client.0.vm04.stdout:7/427: mknod d4/df/d12/d13/d25/d28/d3a/d58/d68/ca1 0 2026-03-10T06:22:57.519 INFO:tasks.workunit.client.0.vm04.stdout:1/431: rmdir d0/d3/d41/d4b/d9e 0 2026-03-10T06:22:57.520 INFO:tasks.workunit.client.0.vm04.stdout:9/464: creat d2/d23/d24/d5a/fa6 x:0 0 0 2026-03-10T06:22:57.520 INFO:tasks.workunit.client.0.vm04.stdout:1/432: stat d0/d8/c3d 0 2026-03-10T06:22:57.521 INFO:tasks.workunit.client.0.vm04.stdout:0/430: dread d0/d5/f4e [0,4194304] 0 2026-03-10T06:22:57.522 INFO:tasks.workunit.client.0.vm04.stdout:9/465: read d2/d3/d18/d39/d11/f56 [3102621,108654] 0 2026-03-10T06:22:57.527 INFO:tasks.workunit.client.0.vm04.stdout:6/444: stat d2/d3a/f56 0 2026-03-10T06:22:57.529 INFO:tasks.workunit.client.0.vm04.stdout:9/466: unlink d2/d3/d18/d39/c80 0 2026-03-10T06:22:57.530 INFO:tasks.workunit.client.0.vm04.stdout:7/428: rename d4/df/d12/d13/d25/d28/d3a/d58/d68/f99 to d4/fa2 0 2026-03-10T06:22:57.531 INFO:tasks.workunit.client.0.vm04.stdout:1/433: link d0/d8/d46/c8d d0/c9f 0 2026-03-10T06:22:57.536 INFO:tasks.workunit.client.0.vm04.stdout:9/467: creat d2/d23/d24/d5a/fa7 x:0 0 0 2026-03-10T06:22:57.538 INFO:tasks.workunit.client.0.vm04.stdout:0/431: link d0/d5/d25/l5b d0/d1a/d20/d38/d31/d47/d8a/l8e 0 2026-03-10T06:22:57.542 
INFO:tasks.workunit.client.0.vm04.stdout:6/445: creat d2/d43/d2d/d30/f91 x:0 0 0 2026-03-10T06:22:57.544 INFO:tasks.workunit.client.0.vm04.stdout:1/434: dwrite d0/d8/d46/f57 [0,4194304] 0 2026-03-10T06:22:57.545 INFO:tasks.workunit.client.0.vm04.stdout:6/446: fdatasync d2/d43/d2d/d30/f7f 0 2026-03-10T06:22:57.548 INFO:tasks.workunit.client.0.vm04.stdout:8/460: write df/d15/d2b/f4a [769569,56799] 0 2026-03-10T06:22:57.549 INFO:tasks.workunit.client.0.vm04.stdout:1/435: readlink d0/d8/l65 0 2026-03-10T06:22:57.550 INFO:tasks.workunit.client.0.vm04.stdout:7/429: truncate d4/df/d12/d13/d25/d30/d40/f52 963384 0 2026-03-10T06:22:57.554 INFO:tasks.workunit.client.0.vm04.stdout:8/461: creat df/d20/d25/d30/d55/f8d x:0 0 0 2026-03-10T06:22:57.554 INFO:tasks.workunit.client.0.vm04.stdout:1/436: mknod d0/d3/d41/ca0 0 2026-03-10T06:22:57.556 INFO:tasks.workunit.client.0.vm04.stdout:7/430: symlink d4/df/d12/d13/la3 0 2026-03-10T06:22:57.558 INFO:tasks.workunit.client.0.vm04.stdout:0/432: rename d0/f17 to d0/d5/d25/dd/d5c/f8f 0 2026-03-10T06:22:57.561 INFO:tasks.workunit.client.0.vm04.stdout:9/468: link d2/cc d2/d23/d94/ca8 0 2026-03-10T06:22:57.563 INFO:tasks.workunit.client.0.vm04.stdout:7/431: readlink d4/df/d12/d13/l5e 0 2026-03-10T06:22:57.563 INFO:tasks.workunit.client.0.vm04.stdout:1/437: dread d0/d3/d41/d4b/d5b/f66 [0,4194304] 0 2026-03-10T06:22:57.563 INFO:tasks.workunit.client.0.vm04.stdout:7/432: readlink d4/df/d12/d21/l81 0 2026-03-10T06:22:57.564 INFO:tasks.workunit.client.0.vm04.stdout:7/433: chown d4/df/d12/d13/d8b 0 1 2026-03-10T06:22:57.565 INFO:tasks.workunit.client.0.vm04.stdout:8/462: rename df/d20/d25/d30/d55/f5b to df/d15/d29/d89/f8e 0 2026-03-10T06:22:57.567 INFO:tasks.workunit.client.0.vm04.stdout:9/469: chown d2/c41 790 1 2026-03-10T06:22:57.570 INFO:tasks.workunit.client.0.vm04.stdout:0/433: creat d0/d5/d25/dd/d3a/d81/f90 x:0 0 0 2026-03-10T06:22:57.570 INFO:tasks.workunit.client.0.vm04.stdout:8/463: mkdir df/d20/d25/d30/d65/d8f 0 2026-03-10T06:22:57.570 
INFO:tasks.workunit.client.0.vm04.stdout:1/438: dread - d0/d8/f67 zero size 2026-03-10T06:22:57.572 INFO:tasks.workunit.client.0.vm04.stdout:3/446: sync 2026-03-10T06:22:57.572 INFO:tasks.workunit.client.0.vm04.stdout:1/439: chown d0/d3/c97 2373 1 2026-03-10T06:22:57.573 INFO:tasks.workunit.client.0.vm04.stdout:3/447: fdatasync d4/da/df/d11/d62/f69 0 2026-03-10T06:22:57.574 INFO:tasks.workunit.client.0.vm04.stdout:7/434: link d4/df/d12/d13/d25/d28/d36/f41 d4/df/d12/d21/fa4 0 2026-03-10T06:22:57.574 INFO:tasks.workunit.client.0.vm04.stdout:0/434: symlink d0/l91 0 2026-03-10T06:22:57.578 INFO:tasks.workunit.client.0.vm04.stdout:8/464: chown df/c58 2644796 1 2026-03-10T06:22:57.580 INFO:tasks.workunit.client.0.vm04.stdout:1/440: write d0/d3/d80/f86 [96745,96518] 0 2026-03-10T06:22:57.584 INFO:tasks.workunit.client.0.vm04.stdout:3/448: symlink d4/da/df/d11/d4a/d7b/d21/d2c/l97 0 2026-03-10T06:22:57.589 INFO:tasks.workunit.client.0.vm04.stdout:8/465: dread df/d20/d25/d30/f6b [0,4194304] 0 2026-03-10T06:22:57.596 INFO:tasks.workunit.client.0.vm04.stdout:1/441: fsync d0/d3/d80/f91 0 2026-03-10T06:22:57.596 INFO:tasks.workunit.client.0.vm04.stdout:7/435: dread d4/f5 [4194304,4194304] 0 2026-03-10T06:22:57.596 INFO:tasks.workunit.client.0.vm04.stdout:3/449: truncate d4/da/df/d11/d4a/f80 9151 0 2026-03-10T06:22:57.599 INFO:tasks.workunit.client.0.vm04.stdout:8/466: stat df/d15/d29/c32 0 2026-03-10T06:22:57.599 INFO:tasks.workunit.client.0.vm04.stdout:7/436: stat d4/df/d12/f4c 0 2026-03-10T06:22:57.600 INFO:tasks.workunit.client.0.vm04.stdout:3/450: creat d4/da/df/d11/d5a/d5b/f98 x:0 0 0 2026-03-10T06:22:57.601 INFO:tasks.workunit.client.0.vm04.stdout:0/435: dwrite d0/d1a/d20/f8c [0,4194304] 0 2026-03-10T06:22:57.604 INFO:tasks.workunit.client.0.vm04.stdout:8/467: truncate df/d53/f6c 269074 0 2026-03-10T06:22:57.609 INFO:tasks.workunit.client.0.vm04.stdout:8/468: chown df/d20/d25/d30/d55 0 1 2026-03-10T06:22:57.617 INFO:tasks.workunit.client.0.vm04.stdout:3/451: stat 
d4/da/df/d11/d4a/d7b/f1d 0 2026-03-10T06:22:57.617 INFO:tasks.workunit.client.0.vm04.stdout:5/401: rmdir d4 39 2026-03-10T06:22:57.617 INFO:tasks.workunit.client.0.vm04.stdout:4/407: dwrite d2/d16/f3a [0,4194304] 0 2026-03-10T06:22:57.618 INFO:tasks.workunit.client.0.vm04.stdout:6/447: write d2/d37/f38 [361221,123575] 0 2026-03-10T06:22:57.626 INFO:tasks.workunit.client.0.vm04.stdout:6/448: dread d2/d43/d2d/d30/d34/f4d [0,4194304] 0 2026-03-10T06:22:57.630 INFO:tasks.workunit.client.0.vm04.stdout:7/437: dwrite d4/df/f60 [0,4194304] 0 2026-03-10T06:22:57.631 INFO:tasks.workunit.client.0.vm04.stdout:4/408: symlink d2/d32/d5c/d4f/d51/l80 0 2026-03-10T06:22:57.634 INFO:tasks.workunit.client.0.vm04.stdout:6/449: chown d2/d37/d6e/f77 0 1 2026-03-10T06:22:57.637 INFO:tasks.workunit.client.0.vm04.stdout:5/402: unlink d4/d6/l4a 0 2026-03-10T06:22:57.637 INFO:tasks.workunit.client.0.vm04.stdout:0/436: mkdir d0/d5/d25/dd/d92 0 2026-03-10T06:22:57.638 INFO:tasks.workunit.client.0.vm04.stdout:4/409: symlink d2/d8/l81 0 2026-03-10T06:22:57.641 INFO:tasks.workunit.client.0.vm04.stdout:4/410: truncate d2/d32/d5c/f4b 407533 0 2026-03-10T06:22:57.651 INFO:tasks.workunit.client.0.vm04.stdout:4/411: dwrite d2/d8/f35 [0,4194304] 0 2026-03-10T06:22:57.656 INFO:tasks.workunit.client.0.vm04.stdout:2/440: sync 2026-03-10T06:22:57.656 INFO:tasks.workunit.client.0.vm04.stdout:6/450: unlink d2/d3a/c67 0 2026-03-10T06:22:57.657 INFO:tasks.workunit.client.0.vm04.stdout:5/403: mkdir d4/d6/d80/d8e 0 2026-03-10T06:22:57.659 INFO:tasks.workunit.client.0.vm04.stdout:4/412: rmdir d2/d16/d31/d42 39 2026-03-10T06:22:57.664 INFO:tasks.workunit.client.0.vm04.stdout:6/451: write d2/d43/d2d/d30/d1f/f3f [456836,82167] 0 2026-03-10T06:22:57.664 INFO:tasks.workunit.client.0.vm04.stdout:4/413: chown d2/d46/f61 548 1 2026-03-10T06:22:57.664 INFO:tasks.workunit.client.0.vm04.stdout:4/414: chown d2 76 1 2026-03-10T06:22:57.664 INFO:tasks.workunit.client.0.vm04.stdout:3/452: rename d4/da/df/d11/d4a to d4/d6/d99 0 
2026-03-10T06:22:57.667 INFO:tasks.workunit.client.0.vm04.stdout:6/452: creat d2/d43/d2d/d30/d1f/d3c/d75/f92 x:0 0 0 2026-03-10T06:22:57.669 INFO:tasks.workunit.client.0.vm04.stdout:5/404: rmdir d4/d6/d80/d8e 0 2026-03-10T06:22:57.669 INFO:tasks.workunit.client.0.vm04.stdout:0/437: rename d0/l2c to d0/d5/l93 0 2026-03-10T06:22:57.669 INFO:tasks.workunit.client.0.vm04.stdout:4/415: getdents d2/d32/d5c/d76 0 2026-03-10T06:22:57.672 INFO:tasks.workunit.client.0.vm04.stdout:2/441: rename d1/l41 to d1/db/d69/d74/d87/l8a 0 2026-03-10T06:22:57.672 INFO:tasks.workunit.client.0.vm04.stdout:2/442: fdatasync d1/df/d2c/f58 0 2026-03-10T06:22:57.673 INFO:tasks.workunit.client.0.vm04.stdout:9/470: dwrite d2/d23/d24/f37 [0,4194304] 0 2026-03-10T06:22:57.673 INFO:tasks.workunit.client.0.vm04.stdout:4/416: chown d2/d16/d2c/d6b 3 1 2026-03-10T06:22:57.673 INFO:tasks.workunit.client.0.vm04.stdout:6/453: read d2/d43/d2d/d30/f4a [1325242,53245] 0 2026-03-10T06:22:57.674 INFO:tasks.workunit.client.0.vm04.stdout:6/454: chown d2/l1b 8322633 1 2026-03-10T06:22:57.677 INFO:tasks.workunit.client.0.vm04.stdout:3/453: truncate d4/da/df/d11/d62/f69 1340979 0 2026-03-10T06:22:57.684 INFO:tasks.workunit.client.0.vm04.stdout:8/469: sync 2026-03-10T06:22:57.684 INFO:tasks.workunit.client.0.vm04.stdout:1/442: sync 2026-03-10T06:22:57.694 INFO:tasks.workunit.client.0.vm04.stdout:3/454: dread d4/d6/f12 [8388608,4194304] 0 2026-03-10T06:22:57.702 INFO:tasks.workunit.client.0.vm04.stdout:8/470: dwrite df/d20/d25/d30/f4e [4194304,4194304] 0 2026-03-10T06:22:57.702 INFO:tasks.workunit.client.0.vm04.stdout:8/471: truncate df/d20/d25/d30/d65/f80 919226 0 2026-03-10T06:22:57.705 INFO:tasks.workunit.client.0.vm04.stdout:9/471: rename d2/d8/d14/d6c to d2/d23/d24/da9 0 2026-03-10T06:22:57.706 INFO:tasks.workunit.client.0.vm04.stdout:4/417: write d2/d16/d31/f50 [704346,56765] 0 2026-03-10T06:22:57.710 INFO:tasks.workunit.client.0.vm04.stdout:1/443: symlink d0/d3/d41/la1 0 2026-03-10T06:22:57.711 
INFO:tasks.workunit.client.0.vm04.stdout:1/444: read - d0/d8/d46/f82 zero size 2026-03-10T06:22:57.714 INFO:tasks.workunit.client.0.vm04.stdout:8/472: rmdir df/d20/d25/d30/d55 39 2026-03-10T06:22:57.722 INFO:tasks.workunit.client.0.vm04.stdout:1/445: mknod d0/d3/d80/ca2 0 2026-03-10T06:22:57.726 INFO:tasks.workunit.client.0.vm04.stdout:4/418: creat d2/d32/f82 x:0 0 0 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:8/473: creat df/d20/d25/d30/d70/f90 x:0 0 0 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:4/419: chown d2/d32 476 1 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:0/438: creat d0/d5/d25/dd/f94 x:0 0 0 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:4/420: write d2/d16/d56/f7e [48579,98847] 0 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:0/439: chown d0/d5/d25/dd/d5c/c6c 3554635 1 2026-03-10T06:22:57.727 INFO:tasks.workunit.client.0.vm04.stdout:4/421: fsync d2/d16/d31/f66 0 2026-03-10T06:22:57.729 INFO:tasks.workunit.client.0.vm04.stdout:6/455: getdents d2/d43/d2d/d30 0 2026-03-10T06:22:57.731 INFO:tasks.workunit.client.0.vm04.stdout:3/455: creat d4/d6/d99/d7b/d21/f9a x:0 0 0 2026-03-10T06:22:57.731 INFO:tasks.workunit.client.0.vm04.stdout:1/446: creat d0/d3/d41/fa3 x:0 0 0 2026-03-10T06:22:57.731 INFO:tasks.workunit.client.0.vm04.stdout:8/474: fsync df/d15/d2b/f4d 0 2026-03-10T06:22:57.735 INFO:tasks.workunit.client.0.vm04.stdout:0/440: dread d0/d5/d25/dd/d3a/d56/f88 [0,4194304] 0 2026-03-10T06:22:57.739 INFO:tasks.workunit.client.0.vm04.stdout:6/456: dwrite d2/d43/d2d/d30/f60 [0,4194304] 0 2026-03-10T06:22:57.741 INFO:tasks.workunit.client.0.vm04.stdout:4/422: rename d2/c22 to d2/d16/d2c/d6b/c83 0 2026-03-10T06:22:57.753 INFO:tasks.workunit.client.0.vm04.stdout:9/472: dwrite d2/d8/d5d/f7c [0,4194304] 0 2026-03-10T06:22:57.753 INFO:tasks.workunit.client.0.vm04.stdout:4/423: rename d2/d8/l5a to d2/d8/l84 0 2026-03-10T06:22:57.758 
INFO:tasks.workunit.client.0.vm04.stdout:1/447: dwrite d0/d3/d41/d4b/d5b/f66 [0,4194304] 0 2026-03-10T06:22:57.780 INFO:tasks.workunit.client.0.vm04.stdout:3/456: link d4/d6/l74 d4/d6/d92/l9b 0 2026-03-10T06:22:57.782 INFO:tasks.workunit.client.0.vm04.stdout:1/448: stat d0/d3/f44 0 2026-03-10T06:22:57.782 INFO:tasks.workunit.client.0.vm04.stdout:8/475: link df/d53/d67/l7c df/d20/d25/d30/d65/l91 0 2026-03-10T06:22:57.783 INFO:tasks.workunit.client.0.vm04.stdout:1/449: chown d0/l7 1 1 2026-03-10T06:22:57.786 INFO:tasks.workunit.client.0.vm04.stdout:1/450: write d0/f83 [604870,74748] 0 2026-03-10T06:22:57.788 INFO:tasks.workunit.client.0.vm04.stdout:7/438: dwrite d4/df/f29 [4194304,4194304] 0 2026-03-10T06:22:57.789 INFO:tasks.workunit.client.0.vm04.stdout:3/457: creat d4/d6/d99/d7b/d21/d32/d39/d64/f9c x:0 0 0 2026-03-10T06:22:57.797 INFO:tasks.workunit.client.0.vm04.stdout:8/476: dwrite df/d20/d25/d30/d55/f8d [0,4194304] 0 2026-03-10T06:22:57.799 INFO:tasks.workunit.client.0.vm04.stdout:8/477: stat df/d15/d2b/f33 0 2026-03-10T06:22:57.799 INFO:tasks.workunit.client.0.vm04.stdout:8/478: readlink l0 0 2026-03-10T06:22:57.800 INFO:tasks.workunit.client.0.vm04.stdout:0/441: link d0/d5/d25/dd/d5c/d73/l6d d0/d1a/d20/d38/d31/d47/d8a/l95 0 2026-03-10T06:22:57.821 INFO:tasks.workunit.client.0.vm04.stdout:6/457: link d2/d43/d2d/d30/d1f/d3c/f27 d2/d43/d2d/d30/f93 0 2026-03-10T06:22:57.822 INFO:tasks.workunit.client.0.vm04.stdout:3/458: mknod d4/d6/d99/d7b/d21/d2c/c9d 0 2026-03-10T06:22:57.823 INFO:tasks.workunit.client.0.vm04.stdout:3/459: fdatasync d4/da/df/d11/d5a/d5b/f98 0 2026-03-10T06:22:57.825 INFO:tasks.workunit.client.0.vm04.stdout:7/439: chown d4/df/d12/d13/d25/d28/d3a/d58/c6a 1 1 2026-03-10T06:22:57.829 INFO:tasks.workunit.client.0.vm04.stdout:4/424: creat d2/d32/d5c/d4f/f85 x:0 0 0 2026-03-10T06:22:57.829 INFO:tasks.workunit.client.0.vm04.stdout:8/479: creat df/d53/d67/f92 x:0 0 0 2026-03-10T06:22:57.834 INFO:tasks.workunit.client.0.vm04.stdout:0/442: unlink 
d0/d5/d25/dd/d3a/f60 0 2026-03-10T06:22:57.836 INFO:tasks.workunit.client.0.vm04.stdout:1/451: mknod d0/d3/d41/d99/ca4 0 2026-03-10T06:22:57.839 INFO:tasks.workunit.client.0.vm04.stdout:9/473: rename d2/d8/d5d to d2/d8/d22/daa 0 2026-03-10T06:22:57.843 INFO:tasks.workunit.client.0.vm04.stdout:0/443: dwrite d0/d5/d25/dd/f94 [0,4194304] 0 2026-03-10T06:22:57.846 INFO:tasks.workunit.client.0.vm04.stdout:0/444: read - d0/d5/d25/dd/d3a/f6a zero size 2026-03-10T06:22:57.853 INFO:tasks.workunit.client.0.vm04.stdout:3/460: rmdir d4/d6 39 2026-03-10T06:22:57.854 INFO:tasks.workunit.client.0.vm04.stdout:7/440: rmdir d4/df/d12/d13/d25/d30 39 2026-03-10T06:22:57.854 INFO:tasks.workunit.client.0.vm04.stdout:0/445: dread d0/d1a/d20/f85 [0,4194304] 0 2026-03-10T06:22:57.855 INFO:tasks.workunit.client.0.vm04.stdout:4/425: fdatasync d2/f4 0 2026-03-10T06:22:57.864 INFO:tasks.workunit.client.0.vm04.stdout:8/480: creat df/d20/d25/f93 x:0 0 0 2026-03-10T06:22:57.874 INFO:tasks.workunit.client.0.vm04.stdout:0/446: fsync d0/d5/d25/dd/d5c/f8f 0 2026-03-10T06:22:57.875 INFO:tasks.workunit.client.0.vm04.stdout:7/441: truncate d4/df/d12/f4c 1889126 0 2026-03-10T06:22:57.876 INFO:tasks.workunit.client.0.vm04.stdout:0/447: truncate d0/d1a/d20/d38/d31/d47/f7b 389169 0 2026-03-10T06:22:57.877 INFO:tasks.workunit.client.0.vm04.stdout:0/448: readlink d0/d5/d25/dd/d5c/d73/l68 0 2026-03-10T06:22:57.881 INFO:tasks.workunit.client.0.vm04.stdout:4/426: symlink d2/d32/d5c/d4f/d51/l86 0 2026-03-10T06:22:57.882 INFO:tasks.workunit.client.0.vm04.stdout:9/474: symlink d2/d3/d18/lab 0 2026-03-10T06:22:57.890 INFO:tasks.workunit.client.0.vm04.stdout:0/449: unlink d0/l91 0 2026-03-10T06:22:57.891 INFO:tasks.workunit.client.0.vm04.stdout:0/450: chown d0/d5/d25/dd/d5c 3474 1 2026-03-10T06:22:57.893 INFO:tasks.workunit.client.0.vm04.stdout:8/481: dwrite df/d15/f45 [0,4194304] 0 2026-03-10T06:22:57.896 INFO:tasks.workunit.client.0.vm04.stdout:9/475: creat d2/d3/d18/d39/d46/fac x:0 0 0 2026-03-10T06:22:57.902 
INFO:tasks.workunit.client.0.vm04.stdout:9/476: truncate d2/d8/f99 644974 0 2026-03-10T06:22:57.902 INFO:tasks.workunit.client.0.vm04.stdout:0/451: dwrite d0/f1b [4194304,4194304] 0 2026-03-10T06:22:57.911 INFO:tasks.workunit.client.0.vm04.stdout:7/442: link d4/df/f60 d4/df/d12/d13/d8b/fa5 0 2026-03-10T06:22:57.915 INFO:tasks.workunit.client.0.vm04.stdout:1/452: rename d0/d3/d41/d4b/c6e to d0/d8/ca5 0 2026-03-10T06:22:57.920 INFO:tasks.workunit.client.0.vm04.stdout:8/482: creat df/d20/d25/d30/d65/f94 x:0 0 0 2026-03-10T06:22:57.922 INFO:tasks.workunit.client.0.vm04.stdout:9/477: mknod d2/d8/d14/d1d/d64/d73/cad 0 2026-03-10T06:22:57.925 INFO:tasks.workunit.client.0.vm04.stdout:7/443: creat d4/df/d12/d13/fa6 x:0 0 0 2026-03-10T06:22:57.929 INFO:tasks.workunit.client.0.vm04.stdout:1/453: dread d0/f23 [0,4194304] 0 2026-03-10T06:22:57.934 INFO:tasks.workunit.client.0.vm04.stdout:0/452: link d0/d1a/d20/d38/d31/d79/l7c d0/d5/l96 0 2026-03-10T06:22:57.935 INFO:tasks.workunit.client.0.vm04.stdout:9/478: symlink d2/d23/d24/da2/lae 0 2026-03-10T06:22:57.937 INFO:tasks.workunit.client.0.vm04.stdout:8/483: dwrite df/d20/d25/d30/f6b [0,4194304] 0 2026-03-10T06:22:57.943 INFO:tasks.workunit.client.0.vm04.stdout:9/479: truncate d2/d23/d24/f29 8674479 0 2026-03-10T06:22:57.946 INFO:tasks.workunit.client.0.vm04.stdout:7/444: creat d4/fa7 x:0 0 0 2026-03-10T06:22:57.952 INFO:tasks.workunit.client.0.vm04.stdout:8/484: creat df/d20/d25/d30/d55/f95 x:0 0 0 2026-03-10T06:22:57.952 INFO:tasks.workunit.client.0.vm04.stdout:9/480: mknod d2/d3/caf 0 2026-03-10T06:22:57.953 INFO:tasks.workunit.client.0.vm04.stdout:9/481: fdatasync d2/d8/d14/d1d/d64/fa1 0 2026-03-10T06:22:57.953 INFO:tasks.workunit.client.0.vm04.stdout:9/482: fdatasync d2/d3/f4 0 2026-03-10T06:22:57.953 INFO:tasks.workunit.client.0.vm04.stdout:7/445: write d4/df/d12/d21/f2a [195795,25284] 0 2026-03-10T06:22:57.954 INFO:tasks.workunit.client.0.vm04.stdout:1/454: dread d0/d3/d41/d4b/d5b/f66 [0,4194304] 0 2026-03-10T06:22:57.956 
INFO:tasks.workunit.client.0.vm04.stdout:9/483: truncate d2/d8/f99 824438 0 2026-03-10T06:22:57.956 INFO:tasks.workunit.client.0.vm04.stdout:0/453: dread d0/d5/d25/f3c [0,4194304] 0 2026-03-10T06:22:57.957 INFO:tasks.workunit.client.0.vm04.stdout:0/454: fdatasync d0/d5/f70 0 2026-03-10T06:22:57.959 INFO:tasks.workunit.client.0.vm04.stdout:7/446: truncate d4/df/f56 424402 0 2026-03-10T06:22:57.973 INFO:tasks.workunit.client.0.vm04.stdout:9/484: mknod d2/d8/d14/da3/cb0 0 2026-03-10T06:22:57.973 INFO:tasks.workunit.client.0.vm04.stdout:9/485: write d2/d23/d24/f83 [84112,94750] 0 2026-03-10T06:22:57.973 INFO:tasks.workunit.client.0.vm04.stdout:7/447: mknod d4/df/d12/d13/d8b/ca8 0 2026-03-10T06:22:57.973 INFO:tasks.workunit.client.0.vm04.stdout:9/486: creat d2/d3/d18/d39/d11/d42/fb1 x:0 0 0 2026-03-10T06:22:57.973 INFO:tasks.workunit.client.0.vm04.stdout:0/455: dwrite d0/d1a/d20/d38/d31/d47/f89 [0,4194304] 0 2026-03-10T06:22:57.978 INFO:tasks.workunit.client.0.vm04.stdout:7/448: mknod d4/df/d12/d13/d25/d28/ca9 0 2026-03-10T06:22:57.979 INFO:tasks.workunit.client.0.vm04.stdout:7/449: chown d4/df/d12/d13/d25/d28/d3a/d58/c71 0 1 2026-03-10T06:22:57.980 INFO:tasks.workunit.client.0.vm04.stdout:7/450: chown d4/df/d12/d13/d25/d28/d3a/d58/c6a 2 1 2026-03-10T06:22:57.980 INFO:tasks.workunit.client.0.vm04.stdout:0/456: dread d0/d1a/f27 [0,4194304] 0 2026-03-10T06:22:57.982 INFO:tasks.workunit.client.0.vm04.stdout:0/457: write d0/d1a/f66 [691659,23377] 0 2026-03-10T06:22:57.985 INFO:tasks.workunit.client.0.vm04.stdout:7/451: creat d4/df/d12/d13/d25/d30/d40/d79/faa x:0 0 0 2026-03-10T06:22:57.985 INFO:tasks.workunit.client.0.vm04.stdout:0/458: mkdir d0/d5/d97 0 2026-03-10T06:22:57.987 INFO:tasks.workunit.client.0.vm04.stdout:7/452: creat d4/df/d12/d13/d25/d30/d40/d79/fab x:0 0 0 2026-03-10T06:22:57.997 INFO:tasks.workunit.client.0.vm04.stdout:0/459: creat d0/d5/d25/dd/d1d/d59/f98 x:0 0 0 2026-03-10T06:22:57.997 INFO:tasks.workunit.client.0.vm04.stdout:7/453: creat 
d4/df/d12/d13/fac x:0 0 0 2026-03-10T06:22:57.997 INFO:tasks.workunit.client.0.vm04.stdout:0/460: dread - d0/d5/d25/dd/d3a/d81/f90 zero size 2026-03-10T06:22:57.997 INFO:tasks.workunit.client.0.vm04.stdout:0/461: mknod d0/d5/d25/dd/d3a/c99 0 2026-03-10T06:22:57.997 INFO:tasks.workunit.client.0.vm04.stdout:7/454: link d4/df/d12/d21/c74 d4/df/d12/d13/d25/d8f/cad 0 2026-03-10T06:22:57.999 INFO:tasks.workunit.client.0.vm04.stdout:7/455: unlink d4/df/d12/d21/f2a 0 2026-03-10T06:22:58.000 INFO:tasks.workunit.client.0.vm04.stdout:7/456: chown d4/df/d12/d13/d25/f95 1984 1 2026-03-10T06:22:58.003 INFO:tasks.workunit.client.0.vm04.stdout:0/462: dwrite d0/f16 [0,4194304] 0 2026-03-10T06:22:58.018 INFO:tasks.workunit.client.0.vm04.stdout:3/461: rename d4/da/df/d11/d62/f69 to d4/d6/d99/d7b/f9e 0 2026-03-10T06:22:58.021 INFO:tasks.workunit.client.0.vm04.stdout:5/405: dwrite d4/d6/d37/f39 [0,4194304] 0 2026-03-10T06:22:58.023 INFO:tasks.workunit.client.0.vm04.stdout:5/406: chown d4/d11/d7d/d52/l58 2094855 1 2026-03-10T06:22:58.024 INFO:tasks.workunit.client.0.vm04.stdout:5/407: write d4/d6/f87 [1042402,61301] 0 2026-03-10T06:22:58.024 INFO:tasks.workunit.client.0.vm04.stdout:4/427: rename d2/d16/d2c/f55 to d2/d46/f87 0 2026-03-10T06:22:58.025 INFO:tasks.workunit.client.0.vm04.stdout:5/408: stat d4/d11/d7d/d38/l8a 0 2026-03-10T06:22:58.030 INFO:tasks.workunit.client.0.vm04.stdout:8/485: rename df/d15/d2b/l83 to df/d20/d25/d87/l96 0 2026-03-10T06:22:58.030 INFO:tasks.workunit.client.0.vm04.stdout:5/409: creat d4/d11/d7d/d52/f8f x:0 0 0 2026-03-10T06:22:58.032 INFO:tasks.workunit.client.0.vm04.stdout:5/410: write d4/d6/d50/f59 [870157,83299] 0 2026-03-10T06:22:58.032 INFO:tasks.workunit.client.0.vm04.stdout:4/428: symlink d2/d16/d31/d42/l88 0 2026-03-10T06:22:58.032 INFO:tasks.workunit.client.0.vm04.stdout:1/455: rename d0/d3/f61 to d0/d8/d46/d7a/d95/fa6 0 2026-03-10T06:22:58.034 INFO:tasks.workunit.client.0.vm04.stdout:5/411: chown d4/d11/d7d/d38/d51/f70 8982571 1 
2026-03-10T06:22:58.044 INFO:tasks.workunit.client.0.vm04.stdout:4/429: chown d2/l2d 224920936 1 2026-03-10T06:22:58.050 INFO:tasks.workunit.client.0.vm04.stdout:9/487: rename d2/d3/f4 to d2/d8/d22/d4f/fb2 0 2026-03-10T06:22:58.056 INFO:tasks.workunit.client.0.vm04.stdout:8/486: getdents df/d53 0 2026-03-10T06:22:58.056 INFO:tasks.workunit.client.0.vm04.stdout:8/487: stat lb 0 2026-03-10T06:22:58.056 INFO:tasks.workunit.client.0.vm04.stdout:4/430: dwrite d2/d32/f7c [0,4194304] 0 2026-03-10T06:22:58.059 INFO:tasks.workunit.client.0.vm04.stdout:5/412: creat d4/d11/d7d/f90 x:0 0 0 2026-03-10T06:22:58.071 INFO:tasks.workunit.client.0.vm04.stdout:0/463: rename d0/d5/d25/dd/d1d/d59/f98 to d0/d5/d25/dd/d5c/f9a 0 2026-03-10T06:22:58.073 INFO:tasks.workunit.client.0.vm04.stdout:0/464: stat d0/d1a/d20/d38/d31/d47/f7b 0 2026-03-10T06:22:58.073 INFO:tasks.workunit.client.0.vm04.stdout:5/413: rename d4/d6/d48 to d4/d11/d7d/d38/d91 0 2026-03-10T06:22:58.076 INFO:tasks.workunit.client.0.vm04.stdout:5/414: creat d4/d11/d7d/d38/f92 x:0 0 0 2026-03-10T06:22:58.078 INFO:tasks.workunit.client.0.vm04.stdout:0/465: link d0/d5/d25/l5b d0/l9b 0 2026-03-10T06:22:58.079 INFO:tasks.workunit.client.0.vm04.stdout:0/466: stat d0/d5/d25/dd/d1d/c4a 0 2026-03-10T06:22:58.081 INFO:tasks.workunit.client.0.vm04.stdout:5/415: link d4/d6/f87 d4/d6/f93 0 2026-03-10T06:22:58.082 INFO:tasks.workunit.client.0.vm04.stdout:2/443: dwrite d1/db/d72/f83 [0,4194304] 0 2026-03-10T06:22:58.086 INFO:tasks.workunit.client.0.vm04.stdout:0/467: mkdir d0/d5/d25/dd/d1d/d9c 0 2026-03-10T06:22:58.092 INFO:tasks.workunit.client.0.vm04.stdout:2/444: chown d1/df/f88 38023 1 2026-03-10T06:22:58.103 INFO:tasks.workunit.client.0.vm04.stdout:5/416: fdatasync d4/d11/d7d/d38/d51/f70 0 2026-03-10T06:22:58.103 INFO:tasks.workunit.client.0.vm04.stdout:0/468: read d0/d1a/f3b [3628885,128835] 0 2026-03-10T06:22:58.103 INFO:tasks.workunit.client.0.vm04.stdout:5/417: write d4/d11/f34 [3041488,26343] 0 2026-03-10T06:22:58.103 
INFO:tasks.workunit.client.0.vm04.stdout:0/469: symlink d0/d5/d25/dd/d5c/d73/d82/l9d 0 2026-03-10T06:22:58.103 INFO:tasks.workunit.client.0.vm04.stdout:5/418: link d4/d11/d7d/d38/d51/c56 d4/d11/d7d/d38/d91/d55/c94 0 2026-03-10T06:22:58.103 INFO:tasks.workunit.client.0.vm04.stdout:2/445: dwrite d1/db/d72/f83 [0,4194304] 0 2026-03-10T06:22:58.105 INFO:tasks.workunit.client.0.vm04.stdout:5/419: read - d4/d11/d7d/d38/d91/f5e zero size 2026-03-10T06:22:58.110 INFO:tasks.workunit.client.0.vm04.stdout:0/470: rename d0/d5/d25/dd/d1d/d59/c49 to d0/d5/d25/dd/d5c/d73/c9e 0 2026-03-10T06:22:58.115 INFO:tasks.workunit.client.0.vm04.stdout:2/446: rmdir d1/df/d11/d14/d4e 39 2026-03-10T06:22:58.115 INFO:tasks.workunit.client.0.vm04.stdout:5/420: creat d4/d11/d7d/d38/d91/d4c/f95 x:0 0 0 2026-03-10T06:22:58.115 INFO:tasks.workunit.client.0.vm04.stdout:0/471: creat d0/d5/d25/dd/d1d/d59/f9f x:0 0 0 2026-03-10T06:22:58.116 INFO:tasks.workunit.client.0.vm04.stdout:0/472: readlink d0/d1a/d20/d38/d31/d47/d8a/l95 0 2026-03-10T06:22:58.121 INFO:tasks.workunit.client.0.vm04.stdout:5/421: dread d4/d11/f18 [0,4194304] 0 2026-03-10T06:22:58.121 INFO:tasks.workunit.client.0.vm04.stdout:0/473: getdents d0/d1a/d20/d38 0 2026-03-10T06:22:58.124 INFO:tasks.workunit.client.0.vm04.stdout:5/422: write d4/d11/d7d/d38/d91/d55/f7a [162734,111338] 0 2026-03-10T06:22:58.129 INFO:tasks.workunit.client.0.vm04.stdout:5/423: link d4/d11/d7d/f5b d4/d11/d7d/d52/f96 0 2026-03-10T06:22:58.132 INFO:tasks.workunit.client.0.vm04.stdout:5/424: write d4/d11/d7d/d38/d91/d4c/f83 [79227,97368] 0 2026-03-10T06:22:58.165 INFO:tasks.workunit.client.0.vm04.stdout:0/474: dwrite d0/d1a/d20/f8c [0,4194304] 0 2026-03-10T06:22:58.165 INFO:tasks.workunit.client.0.vm04.stdout:5/425: dwrite d4/d6/f33 [0,4194304] 0 2026-03-10T06:22:58.169 INFO:tasks.workunit.client.0.vm04.stdout:5/426: getdents d4/d6/d80/d84 0 2026-03-10T06:22:58.173 INFO:tasks.workunit.client.0.vm04.stdout:0/475: dread d0/d1a/f3b [0,4194304] 0 2026-03-10T06:22:58.174 
INFO:tasks.workunit.client.0.vm04.stdout:3/462: sync 2026-03-10T06:22:58.174 INFO:tasks.workunit.client.0.vm04.stdout:4/431: sync 2026-03-10T06:22:58.174 INFO:tasks.workunit.client.0.vm04.stdout:2/447: sync 2026-03-10T06:22:58.176 INFO:tasks.workunit.client.0.vm04.stdout:3/463: fdatasync d4/d6/d99/d7b/f77 0 2026-03-10T06:22:58.176 INFO:tasks.workunit.client.0.vm04.stdout:4/432: dread - d2/d16/d31/f66 zero size 2026-03-10T06:22:58.177 INFO:tasks.workunit.client.0.vm04.stdout:2/448: creat d1/df/d2c/d37/d59/f8b x:0 0 0 2026-03-10T06:22:58.178 INFO:tasks.workunit.client.0.vm04.stdout:4/433: dread - d2/d46/f61 zero size 2026-03-10T06:22:58.180 INFO:tasks.workunit.client.0.vm04.stdout:0/476: dwrite d0/d5/f70 [0,4194304] 0 2026-03-10T06:22:58.181 INFO:tasks.workunit.client.0.vm04.stdout:4/434: creat d2/d8/f89 x:0 0 0 2026-03-10T06:22:58.183 INFO:tasks.workunit.client.0.vm04.stdout:3/464: creat d4/da/df/d11/f9f x:0 0 0 2026-03-10T06:22:58.197 INFO:tasks.workunit.client.0.vm04.stdout:4/435: stat d2/d16/d56/c68 0 2026-03-10T06:22:58.202 INFO:tasks.workunit.client.0.vm04.stdout:4/436: readlink d2/d32/l7b 0 2026-03-10T06:22:58.202 INFO:tasks.workunit.client.0.vm04.stdout:4/437: dread d2/f12 [0,4194304] 0 2026-03-10T06:22:58.202 INFO:tasks.workunit.client.0.vm04.stdout:4/438: symlink d2/l8a 0 2026-03-10T06:22:58.202 INFO:tasks.workunit.client.0.vm04.stdout:4/439: symlink d2/d16/d2c/l8b 0 2026-03-10T06:22:58.203 INFO:tasks.workunit.client.0.vm04.stdout:4/440: symlink d2/d32/d5c/d76/l8c 0 2026-03-10T06:22:58.204 INFO:tasks.workunit.client.0.vm04.stdout:4/441: creat d2/d16/d2c/f8d x:0 0 0 2026-03-10T06:22:58.251 INFO:tasks.workunit.client.0.vm04.stdout:4/442: sync 2026-03-10T06:22:58.262 INFO:tasks.workunit.client.0.vm04.stdout:4/443: symlink d2/d32/d5c/d76/l8e 0 2026-03-10T06:22:58.263 INFO:tasks.workunit.client.0.vm04.stdout:4/444: chown d2/f4 125471437 1 2026-03-10T06:22:58.266 INFO:tasks.workunit.client.0.vm04.stdout:4/445: dread - d2/d32/d5c/d4f/f60 zero size 
2026-03-10T06:22:58.269 INFO:tasks.workunit.client.0.vm04.stdout:4/446: write d2/d8/f35 [4955530,1955] 0 2026-03-10T06:22:58.272 INFO:tasks.workunit.client.0.vm04.stdout:4/447: stat d2/l4a 0 2026-03-10T06:22:58.275 INFO:tasks.workunit.client.0.vm04.stdout:4/448: write d2/d16/d2c/f6f [189276,85336] 0 2026-03-10T06:22:58.296 INFO:tasks.workunit.client.0.vm04.stdout:8/488: getdents df/d53/d67 0 2026-03-10T06:22:58.297 INFO:tasks.workunit.client.0.vm04.stdout:8/489: dread - df/f6e zero size 2026-03-10T06:22:58.303 INFO:tasks.workunit.client.0.vm04.stdout:8/490: rename df/d53 to df/d20/d25/d30/d70/d97 0 2026-03-10T06:22:58.307 INFO:tasks.workunit.client.0.vm04.stdout:8/491: rename df/f6e to df/d20/d25/d73/f98 0 2026-03-10T06:22:58.338 INFO:tasks.workunit.client.0.vm04.stdout:7/457: dwrite d4/fb [0,4194304] 0 2026-03-10T06:22:58.347 INFO:tasks.workunit.client.0.vm04.stdout:8/492: rmdir df/d15 39 2026-03-10T06:22:58.347 INFO:tasks.workunit.client.0.vm04.stdout:7/458: rmdir d4/df/d12/d13/d25/d28/d36 39 2026-03-10T06:22:58.348 INFO:tasks.workunit.client.0.vm04.stdout:1/456: write d0/d3/f3b [609585,73464] 0 2026-03-10T06:22:58.349 INFO:tasks.workunit.client.0.vm04.stdout:9/488: write d2/d23/f93 [992005,53153] 0 2026-03-10T06:22:58.350 INFO:tasks.workunit.client.0.vm04.stdout:5/427: fsync d4/d11/d7d/f90 0 2026-03-10T06:22:58.357 INFO:tasks.workunit.client.0.vm04.stdout:5/428: dwrite d4/d11/d7d/d38/d91/d4c/f88 [0,4194304] 0 2026-03-10T06:22:58.363 INFO:tasks.workunit.client.0.vm04.stdout:8/493: write df/d15/d2b/f7e [712720,54426] 0 2026-03-10T06:22:58.365 INFO:tasks.workunit.client.0.vm04.stdout:1/457: creat d0/d8/d46/d7a/d95/fa7 x:0 0 0 2026-03-10T06:22:58.365 INFO:tasks.workunit.client.0.vm04.stdout:9/489: mknod d2/d8/d14/d1d/d64/d73/cb3 0 2026-03-10T06:22:58.370 INFO:tasks.workunit.client.0.vm04.stdout:5/429: creat d4/d6/d80/f97 x:0 0 0 2026-03-10T06:22:58.372 INFO:tasks.workunit.client.0.vm04.stdout:5/430: read - d4/d11/d7d/f90 zero size 2026-03-10T06:22:58.377 
INFO:tasks.workunit.client.0.vm04.stdout:1/458: creat d0/d8/d46/d7a/fa8 x:0 0 0 2026-03-10T06:22:58.382 INFO:tasks.workunit.client.0.vm04.stdout:1/459: fsync d0/d3/d41/f47 0 2026-03-10T06:22:58.383 INFO:tasks.workunit.client.0.vm04.stdout:5/431: mkdir d4/d11/d7d/d38/d91/d4c/d98 0 2026-03-10T06:22:58.385 INFO:tasks.workunit.client.0.vm04.stdout:9/490: creat d2/fb4 x:0 0 0 2026-03-10T06:22:58.387 INFO:tasks.workunit.client.0.vm04.stdout:9/491: dread - d2/d8/d14/d1d/d64/d73/f9a zero size 2026-03-10T06:22:58.398 INFO:tasks.workunit.client.0.vm04.stdout:3/465: dwrite d4/d6/d99/d7b/f27 [0,4194304] 0 2026-03-10T06:22:58.404 INFO:tasks.workunit.client.0.vm04.stdout:0/477: dwrite d0/d5/f1f [0,4194304] 0 2026-03-10T06:22:58.406 INFO:tasks.workunit.client.0.vm04.stdout:1/460: dread d0/f6a [0,4194304] 0 2026-03-10T06:22:58.406 INFO:tasks.workunit.client.0.vm04.stdout:1/461: chown d0/d3/d41/d4b/d5b/l88 0 1 2026-03-10T06:22:58.407 INFO:tasks.workunit.client.0.vm04.stdout:2/449: dwrite d1/db/f36 [0,4194304] 0 2026-03-10T06:22:58.437 INFO:tasks.workunit.client.0.vm04.stdout:2/450: symlink d1/df/d2c/d37/d59/l8c 0 2026-03-10T06:22:58.444 INFO:tasks.workunit.client.0.vm04.stdout:1/462: rename d0/d3/c17 to d0/d8/d46/d7a/ca9 0 2026-03-10T06:22:58.444 INFO:tasks.workunit.client.0.vm04.stdout:4/449: write d2/f47 [2151579,57539] 0 2026-03-10T06:22:58.444 INFO:tasks.workunit.client.0.vm04.stdout:2/451: write d1/df/f24 [120115,91400] 0 2026-03-10T06:22:58.444 INFO:tasks.workunit.client.0.vm04.stdout:9/492: getdents d2/d8/d22 0 2026-03-10T06:22:58.444 INFO:tasks.workunit.client.0.vm04.stdout:0/478: mknod d0/d5/d25/dd/d1d/d9c/ca0 0 2026-03-10T06:22:58.445 INFO:tasks.workunit.client.0.vm04.stdout:0/479: write d0/d5/d25/dd/d3a/f57 [869406,82743] 0 2026-03-10T06:22:58.449 INFO:tasks.workunit.client.0.vm04.stdout:2/452: dwrite d1/df/f63 [0,4194304] 0 2026-03-10T06:22:58.451 INFO:tasks.workunit.client.0.vm04.stdout:2/453: rename d1/df/d11/d18/d35 to d1/df/d11/d18/d35/d8d 22 2026-03-10T06:22:58.457 
INFO:tasks.workunit.client.0.vm04.stdout:2/454: write d1/df/d2c/f58 [1225090,14209] 0 2026-03-10T06:22:58.457 INFO:tasks.workunit.client.0.vm04.stdout:0/480: dread d0/d5/f70 [0,4194304] 0 2026-03-10T06:22:58.457 INFO:tasks.workunit.client.0.vm04.stdout:1/463: unlink d0/d3/d80/f81 0 2026-03-10T06:22:58.458 INFO:tasks.workunit.client.0.vm04.stdout:5/432: sync 2026-03-10T06:22:58.467 INFO:tasks.workunit.client.0.vm04.stdout:5/433: read d4/f21 [3735924,109563] 0 2026-03-10T06:22:58.467 INFO:tasks.workunit.client.0.vm04.stdout:5/434: readlink d4/d11/d7d/l49 0 2026-03-10T06:22:58.475 INFO:tasks.workunit.client.0.vm04.stdout:6/458: write d2/d43/d2d/d30/d34/f4d [298361,45573] 0 2026-03-10T06:22:58.482 INFO:tasks.workunit.client.0.vm04.stdout:4/450: dread d2/d32/d5c/f41 [0,4194304] 0 2026-03-10T06:22:58.484 INFO:tasks.workunit.client.0.vm04.stdout:4/451: fdatasync d2/d8/f35 0 2026-03-10T06:22:58.490 INFO:tasks.workunit.client.0.vm04.stdout:2/455: dread d1/db/fe [0,4194304] 0 2026-03-10T06:22:58.490 INFO:tasks.workunit.client.0.vm04.stdout:5/435: dread d4/d11/d7d/f31 [0,4194304] 0 2026-03-10T06:22:58.514 INFO:tasks.workunit.client.0.vm04.stdout:0/481: mknod d0/d1a/d20/d38/d31/d47/d8a/d8d/ca1 0 2026-03-10T06:22:58.516 INFO:tasks.workunit.client.0.vm04.stdout:0/482: read - d0/d5/d25/dd/d3a/d56/f84 zero size 2026-03-10T06:22:58.516 INFO:tasks.workunit.client.0.vm04.stdout:8/494: write df/d15/d29/f3e [304699,101789] 0 2026-03-10T06:22:58.517 INFO:tasks.workunit.client.0.vm04.stdout:0/483: stat d0/d5/d25/dd/d5c/d73/d82/l9d 0 2026-03-10T06:22:58.525 INFO:tasks.workunit.client.0.vm04.stdout:7/459: dwrite d4/df/f56 [0,4194304] 0 2026-03-10T06:22:58.530 INFO:tasks.workunit.client.0.vm04.stdout:7/460: write d4/df/d12/d13/f27 [2706271,52788] 0 2026-03-10T06:22:58.539 INFO:tasks.workunit.client.0.vm04.stdout:9/493: link d2/d8/f66 d2/d23/d94/fb5 0 2026-03-10T06:22:58.539 INFO:tasks.workunit.client.0.vm04.stdout:9/494: fdatasync d2/d3/d18/d39/d11/d42/f5e 0 2026-03-10T06:22:58.547 
INFO:tasks.workunit.client.0.vm04.stdout:5/436: truncate d4/d6/d37/f62 5167998 0 2026-03-10T06:22:58.547 INFO:tasks.workunit.client.0.vm04.stdout:5/437: chown d4/d11/d7d/d38/d91 103033 1 2026-03-10T06:22:58.558 INFO:tasks.workunit.client.0.vm04.stdout:6/459: creat d2/d43/d2d/d30/d34/d76/d8a/f94 x:0 0 0 2026-03-10T06:22:58.563 INFO:tasks.workunit.client.0.vm04.stdout:1/464: link d0/d3/d41/l79 d0/d3/d80/laa 0 2026-03-10T06:22:58.574 INFO:tasks.workunit.client.0.vm04.stdout:0/484: creat d0/d5/d25/dd/d1d/fa2 x:0 0 0 2026-03-10T06:22:58.574 INFO:tasks.workunit.client.0.vm04.stdout:4/452: creat d2/d16/d31/d3f/f8f x:0 0 0 2026-03-10T06:22:58.577 INFO:tasks.workunit.client.0.vm04.stdout:4/453: dread - d2/d16/d31/d3f/f52 zero size 2026-03-10T06:22:58.577 INFO:tasks.workunit.client.0.vm04.stdout:3/466: dwrite d4/f49 [0,4194304] 0 2026-03-10T06:22:58.580 INFO:tasks.workunit.client.0.vm04.stdout:3/467: dread - d4/d6/d38/f78 zero size 2026-03-10T06:22:58.587 INFO:tasks.workunit.client.0.vm04.stdout:4/454: dwrite d2/d32/d5c/f6d [0,4194304] 0 2026-03-10T06:22:58.588 INFO:tasks.workunit.client.0.vm04.stdout:9/495: rename d2/d8/d14/l72 to d2/d23/d24/da9/lb6 0 2026-03-10T06:22:58.589 INFO:tasks.workunit.client.0.vm04.stdout:2/456: creat d1/d76/f8e x:0 0 0 2026-03-10T06:22:58.612 INFO:tasks.workunit.client.0.vm04.stdout:7/461: mkdir d4/df/d12/d13/d25/d28/dae 0 2026-03-10T06:22:58.619 INFO:tasks.workunit.client.0.vm04.stdout:7/462: dwrite d4/df/d12/d13/d25/d30/d40/d50/f62 [0,4194304] 0 2026-03-10T06:22:58.619 INFO:tasks.workunit.client.0.vm04.stdout:0/485: creat d0/d5/d25/dd/d1d/d59/d63/fa3 x:0 0 0 2026-03-10T06:22:58.624 INFO:tasks.workunit.client.0.vm04.stdout:3/468: dread d4/f2d [0,4194304] 0 2026-03-10T06:22:58.640 INFO:tasks.workunit.client.0.vm04.stdout:4/455: symlink d2/d16/d56/l90 0 2026-03-10T06:22:58.643 INFO:tasks.workunit.client.0.vm04.stdout:9/496: mknod d2/d23/d94/cb7 0 2026-03-10T06:22:58.647 INFO:tasks.workunit.client.0.vm04.stdout:4/456: dread d2/f4 [0,4194304] 0 
2026-03-10T06:22:58.647 INFO:tasks.workunit.client.0.vm04.stdout:2/457: unlink d1/db/d20/d75/l89 0 2026-03-10T06:22:58.659 INFO:tasks.workunit.client.0.vm04.stdout:5/438: mkdir d4/d6/d80/d84/d99 0 2026-03-10T06:22:58.659 INFO:tasks.workunit.client.0.vm04.stdout:5/439: chown d4/d11/d7d/d52/f96 185 1 2026-03-10T06:22:58.667 INFO:tasks.workunit.client.0.vm04.stdout:1/465: truncate d0/f4 10565 0 2026-03-10T06:22:58.670 INFO:tasks.workunit.client.0.vm04.stdout:0/486: readlink d0/d5/l93 0 2026-03-10T06:22:58.671 INFO:tasks.workunit.client.0.vm04.stdout:0/487: read d0/d5/d25/dd/f43 [1722793,27979] 0 2026-03-10T06:22:58.671 INFO:tasks.workunit.client.0.vm04.stdout:0/488: fdatasync d0/d1a/d20/f8c 0 2026-03-10T06:22:58.680 INFO:tasks.workunit.client.0.vm04.stdout:5/440: write d4/d11/d7d/d38/d91/f74 [5195444,29783] 0 2026-03-10T06:22:58.686 INFO:tasks.workunit.client.0.vm04.stdout:5/441: dwrite d4/d6/d37/f7e [0,4194304] 0 2026-03-10T06:22:58.709 INFO:tasks.workunit.client.0.vm04.stdout:8/495: truncate df/d15/d29/f3e 350949 0 2026-03-10T06:22:58.716 INFO:tasks.workunit.client.0.vm04.stdout:7/463: creat d4/df/d12/faf x:0 0 0 2026-03-10T06:22:58.720 INFO:tasks.workunit.client.0.vm04.stdout:0/489: creat d0/d5/d25/dd/d92/fa4 x:0 0 0 2026-03-10T06:22:58.723 INFO:tasks.workunit.client.0.vm04.stdout:6/460: truncate d2/d43/d2d/d30/d34/f6d 1099391 0 2026-03-10T06:22:58.740 INFO:tasks.workunit.client.0.vm04.stdout:5/442: creat d4/d11/d7d/d52/f9a x:0 0 0 2026-03-10T06:22:58.745 INFO:tasks.workunit.client.0.vm04.stdout:9/497: creat d2/d8/fb8 x:0 0 0 2026-03-10T06:22:58.745 INFO:tasks.workunit.client.0.vm04.stdout:5/443: read - d4/d11/d7d/d38/d91/d55/f68 zero size 2026-03-10T06:22:58.751 INFO:tasks.workunit.client.0.vm04.stdout:8/496: fdatasync df/d20/d25/f44 0 2026-03-10T06:22:58.759 INFO:tasks.workunit.client.0.vm04.stdout:5/444: dwrite d4/d6/f33 [0,4194304] 0 2026-03-10T06:22:58.765 INFO:tasks.workunit.client.0.vm04.stdout:1/466: creat d0/d8/fab x:0 0 0 2026-03-10T06:22:58.777 
INFO:tasks.workunit.client.0.vm04.stdout:8/497: dread df/d15/f5d [0,4194304] 0 2026-03-10T06:22:58.780 INFO:tasks.workunit.client.0.vm04.stdout:7/464: mkdir d4/df/d12/d13/d25/d28/d3a/db0 0 2026-03-10T06:22:58.791 INFO:tasks.workunit.client.0.vm04.stdout:3/469: getdents d4/d6/d99/d7b/d21/d32/d8e 0 2026-03-10T06:22:58.802 INFO:tasks.workunit.client.0.vm04.stdout:4/457: rename d2/l3b to d2/d16/d31/l91 0 2026-03-10T06:22:58.809 INFO:tasks.workunit.client.0.vm04.stdout:5/445: creat d4/d3b/f9b x:0 0 0 2026-03-10T06:22:58.810 INFO:tasks.workunit.client.0.vm04.stdout:1/467: write d0/d3/d41/d4b/d5b/f66 [2309968,81710] 0 2026-03-10T06:22:58.811 INFO:tasks.workunit.client.0.vm04.stdout:1/468: dread - d0/d8/d46/f82 zero size 2026-03-10T06:22:58.811 INFO:tasks.workunit.client.0.vm04.stdout:1/469: fdatasync d0/d3/f19 0 2026-03-10T06:22:58.821 INFO:tasks.workunit.client.0.vm04.stdout:8/498: dwrite df/d20/d25/f44 [0,4194304] 0 2026-03-10T06:22:58.835 INFO:tasks.workunit.client.0.vm04.stdout:8/499: dread df/d20/d25/d30/d65/f80 [0,4194304] 0 2026-03-10T06:22:58.836 INFO:tasks.workunit.client.0.vm04.stdout:9/498: symlink d2/d3/d18/lb9 0 2026-03-10T06:22:58.837 INFO:tasks.workunit.client.0.vm04.stdout:2/458: rename d1/df/d11/d18 to d1/db/d20/d8f 0 2026-03-10T06:22:58.839 INFO:tasks.workunit.client.0.vm04.stdout:2/459: read d1/db/f36 [1776155,61006] 0 2026-03-10T06:22:58.839 INFO:tasks.workunit.client.0.vm04.stdout:2/460: write d1/db/d20/f49 [986907,116831] 0 2026-03-10T06:22:58.846 INFO:tasks.workunit.client.0.vm04.stdout:2/461: dwrite d1/df/d11/d14/f1d [0,4194304] 0 2026-03-10T06:22:58.848 INFO:tasks.workunit.client.0.vm04.stdout:5/446: unlink d4/d11/d7d/d38/l40 0 2026-03-10T06:22:58.852 INFO:tasks.workunit.client.0.vm04.stdout:2/462: dread - d1/db/d20/f86 zero size 2026-03-10T06:22:58.855 INFO:tasks.workunit.client.0.vm04.stdout:1/470: symlink d0/d3/d41/d4b/lac 0 2026-03-10T06:22:58.859 INFO:tasks.workunit.client.0.vm04.stdout:2/463: dread d1/db/fe [0,4194304] 0 
2026-03-10T06:22:58.860 INFO:tasks.workunit.client.0.vm04.stdout:7/465: mkdir d4/df/d12/d13/d25/d28/d36/d9c/db1 0 2026-03-10T06:22:58.866 INFO:tasks.workunit.client.0.vm04.stdout:8/500: creat df/d20/d25/d30/d55/f99 x:0 0 0 2026-03-10T06:22:58.867 INFO:tasks.workunit.client.0.vm04.stdout:8/501: read fe [3912947,93681] 0 2026-03-10T06:22:58.870 INFO:tasks.workunit.client.0.vm04.stdout:6/461: rename d2/d43/f4b to d2/d37/d6e/f95 0 2026-03-10T06:22:58.873 INFO:tasks.workunit.client.0.vm04.stdout:9/499: creat d2/d8/d53/d6e/d89/fba x:0 0 0 2026-03-10T06:22:58.881 INFO:tasks.workunit.client.0.vm04.stdout:4/458: symlink d2/d16/d31/l92 0 2026-03-10T06:22:58.885 INFO:tasks.workunit.client.0.vm04.stdout:5/447: dwrite d4/d6/fa [0,4194304] 0 2026-03-10T06:22:58.885 INFO:tasks.workunit.client.0.vm04.stdout:2/464: mknod d1/df/d2c/d37/c90 0 2026-03-10T06:22:58.889 INFO:tasks.workunit.client.0.vm04.stdout:0/490: getdents d0 0 2026-03-10T06:22:58.891 INFO:tasks.workunit.client.0.vm04.stdout:8/502: dread df/d15/d29/f6f [0,4194304] 0 2026-03-10T06:22:58.894 INFO:tasks.workunit.client.0.vm04.stdout:7/466: truncate d4/df/d12/f18 4990294 0 2026-03-10T06:22:58.895 INFO:tasks.workunit.client.0.vm04.stdout:8/503: fsync df/f4f 0 2026-03-10T06:22:58.896 INFO:tasks.workunit.client.0.vm04.stdout:3/470: creat d4/d6/d99/d7b/d21/fa0 x:0 0 0 2026-03-10T06:22:58.897 INFO:tasks.workunit.client.0.vm04.stdout:3/471: write d4/da/df/d11/d5a/d5b/f98 [404808,31146] 0 2026-03-10T06:22:58.905 INFO:tasks.workunit.client.0.vm04.stdout:2/465: dwrite d1/df/d2c/d37/f52 [0,4194304] 0 2026-03-10T06:22:58.906 INFO:tasks.workunit.client.0.vm04.stdout:2/466: read - d1/d76/f8e zero size 2026-03-10T06:22:58.917 INFO:tasks.workunit.client.0.vm04.stdout:8/504: dread df/d15/d2b/f4a [0,4194304] 0 2026-03-10T06:22:58.918 INFO:tasks.workunit.client.0.vm04.stdout:2/467: dread d1/df/d2c/d37/f52 [0,4194304] 0 2026-03-10T06:22:58.922 INFO:tasks.workunit.client.0.vm04.stdout:2/468: stat d1/db/d20/d8f/d35/d54 0 
2026-03-10T06:22:58.922 INFO:tasks.workunit.client.0.vm04.stdout:8/505: chown df/d15/d2b/f4a 1 1 2026-03-10T06:22:58.932 INFO:tasks.workunit.client.0.vm04.stdout:1/471: symlink d0/d3/lad 0 2026-03-10T06:22:58.936 INFO:tasks.workunit.client.0.vm04.stdout:4/459: mkdir d2/d16/d31/d3f/d93 0 2026-03-10T06:22:58.940 INFO:tasks.workunit.client.0.vm04.stdout:4/460: fsync d2/d16/d2c/f6f 0 2026-03-10T06:22:58.940 INFO:tasks.workunit.client.0.vm04.stdout:0/491: rename d0/d1a/f83 to d0/d5/d25/dd/d5c/d73/fa5 0 2026-03-10T06:22:58.940 INFO:tasks.workunit.client.0.vm04.stdout:4/461: chown d2/d32/d5c/c5f 15356 1 2026-03-10T06:22:58.940 INFO:tasks.workunit.client.0.vm04.stdout:4/462: write d2/d32/d5c/f6d [4759084,67408] 0 2026-03-10T06:22:58.945 INFO:tasks.workunit.client.0.vm04.stdout:9/500: write d2/d3/f12 [3894725,16228] 0 2026-03-10T06:22:58.953 INFO:tasks.workunit.client.0.vm04.stdout:6/462: symlink d2/d43/d86/l96 0 2026-03-10T06:22:58.958 INFO:tasks.workunit.client.0.vm04.stdout:1/472: symlink d0/d3/d41/d99/lae 0 2026-03-10T06:22:58.960 INFO:tasks.workunit.client.0.vm04.stdout:0/492: mknod d0/ca6 0 2026-03-10T06:22:58.961 INFO:tasks.workunit.client.0.vm04.stdout:0/493: fsync d0/f14 0 2026-03-10T06:22:58.963 INFO:tasks.workunit.client.0.vm04.stdout:5/448: rename d4/d11/d7d/d38/d91/d55/f6e to d4/d6/d80/d84/f9c 0 2026-03-10T06:22:58.966 INFO:tasks.workunit.client.0.vm04.stdout:4/463: mkdir d2/d32/d94 0 2026-03-10T06:22:58.967 INFO:tasks.workunit.client.0.vm04.stdout:0/494: dread d0/d5/f1f [0,4194304] 0 2026-03-10T06:22:58.967 INFO:tasks.workunit.client.0.vm04.stdout:9/501: creat d2/d8/d53/d6e/d89/fbb x:0 0 0 2026-03-10T06:22:58.968 INFO:tasks.workunit.client.0.vm04.stdout:3/472: mkdir d4/d6/d91/da1 0 2026-03-10T06:22:58.970 INFO:tasks.workunit.client.0.vm04.stdout:8/506: mkdir df/d15/d2b/d81/d9a 0 2026-03-10T06:22:58.977 INFO:tasks.workunit.client.0.vm04.stdout:1/473: rename d0/d8/le to d0/d3/d41/d4b/laf 0 2026-03-10T06:22:58.977 INFO:tasks.workunit.client.0.vm04.stdout:5/449: 
readlink d4/l15 0 2026-03-10T06:22:58.977 INFO:tasks.workunit.client.0.vm04.stdout:7/467: creat d4/fb2 x:0 0 0 2026-03-10T06:22:58.979 INFO:tasks.workunit.client.0.vm04.stdout:4/464: creat d2/d32/d5c/d76/f95 x:0 0 0 2026-03-10T06:22:58.980 INFO:tasks.workunit.client.0.vm04.stdout:0/495: creat d0/d5/d25/dd/d3a/d56/fa7 x:0 0 0 2026-03-10T06:22:58.982 INFO:tasks.workunit.client.0.vm04.stdout:9/502: creat d2/d3/d18/d39/d46/fbc x:0 0 0 2026-03-10T06:22:58.982 INFO:tasks.workunit.client.0.vm04.stdout:3/473: symlink d4/d6/d99/la2 0 2026-03-10T06:22:58.982 INFO:tasks.workunit.client.0.vm04.stdout:3/474: chown d4/d6/d99/d7b 115 1 2026-03-10T06:22:58.982 INFO:tasks.workunit.client.0.vm04.stdout:6/463: mkdir d2/d97 0 2026-03-10T06:22:58.989 INFO:tasks.workunit.client.0.vm04.stdout:1/474: mknod d0/d8/d46/d7a/cb0 0 2026-03-10T06:22:58.989 INFO:tasks.workunit.client.0.vm04.stdout:5/450: mkdir d4/d11/d7d/d38/d91/d4c/d9d 0 2026-03-10T06:22:58.989 INFO:tasks.workunit.client.0.vm04.stdout:7/468: truncate d4/df/d12/d34/f46 73293 0 2026-03-10T06:22:58.989 INFO:tasks.workunit.client.0.vm04.stdout:4/465: rename d2/d16/d2c/f75 to d2/d16/d2c/d6b/f96 0 2026-03-10T06:22:58.989 INFO:tasks.workunit.client.0.vm04.stdout:0/496: mknod d0/d1a/d20/d38/d31/d79/ca8 0 2026-03-10T06:22:58.990 INFO:tasks.workunit.client.0.vm04.stdout:9/503: mknod d2/d3/d18/d39/cbd 0 2026-03-10T06:22:58.994 INFO:tasks.workunit.client.0.vm04.stdout:1/475: write d0/f7c [945563,70318] 0 2026-03-10T06:22:58.995 INFO:tasks.workunit.client.0.vm04.stdout:4/466: readlink d2/d16/d2c/l74 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:0/497: dread d0/d5/d25/dd/d3a/f57 [0,4194304] 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:6/464: chown d2/d43/d2d/d30/d1f/c25 36 1 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:0/498: stat d0/d1a/d20/d38/d31/d47/d8a/d8d/ca1 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:4/467: write d2/d46/f61 [762925,35361] 0 
2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:4/468: dread - d2/d16/d31/d3f/f52 zero size 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:6/465: mknod d2/d37/d83/c98 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:9/504: dwrite d2/d8/d14/d1d/d64/fa1 [0,4194304] 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:7/469: truncate d4/df/d12/d13/d25/d30/d40/f52 1814026 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:4/469: write d2/d16/d31/d3f/f8f [590229,68806] 0 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:6/466: chown d2/d43/d2d/d30/f32 7043 1 2026-03-10T06:22:59.016 INFO:tasks.workunit.client.0.vm04.stdout:4/470: dread - d2/d16/d31/d3f/f52 zero size 2026-03-10T06:22:59.023 INFO:tasks.workunit.client.0.vm04.stdout:4/471: dread d2/d32/d5c/d4f/d51/f6e [0,4194304] 0 2026-03-10T06:22:59.024 INFO:tasks.workunit.client.0.vm04.stdout:0/499: dread d0/f14 [0,4194304] 0 2026-03-10T06:22:59.024 INFO:tasks.workunit.client.0.vm04.stdout:9/505: creat d2/d23/d24/fbe x:0 0 0 2026-03-10T06:22:59.024 INFO:tasks.workunit.client.0.vm04.stdout:7/470: dwrite d4/df/d12/d13/d25/d30/d40/d79/f89 [0,4194304] 0 2026-03-10T06:22:59.032 INFO:tasks.workunit.client.0.vm04.stdout:0/500: dread d0/f1b [4194304,4194304] 0 2026-03-10T06:22:59.032 INFO:tasks.workunit.client.0.vm04.stdout:0/501: fsync d0/f16 0 2026-03-10T06:22:59.041 INFO:tasks.workunit.client.0.vm04.stdout:9/506: symlink d2/d23/d24/lbf 0 2026-03-10T06:22:59.043 INFO:tasks.workunit.client.0.vm04.stdout:7/471: mkdir d4/df/d12/d13/db3 0 2026-03-10T06:22:59.054 INFO:tasks.workunit.client.0.vm04.stdout:7/472: readlink d4/df/d12/d21/l8c 0 2026-03-10T06:22:59.054 INFO:tasks.workunit.client.0.vm04.stdout:0/502: symlink d0/d5/d25/dd/d3a/la9 0 2026-03-10T06:22:59.054 INFO:tasks.workunit.client.0.vm04.stdout:4/472: getdents d2/d32/d94 0 2026-03-10T06:22:59.054 INFO:tasks.workunit.client.0.vm04.stdout:9/507: symlink d2/d3/d18/d39/d46/d55/lc0 
0 2026-03-10T06:22:59.058 INFO:tasks.workunit.client.0.vm04.stdout:4/473: dwrite d2/d46/f61 [0,4194304] 0 2026-03-10T06:22:59.059 INFO:tasks.workunit.client.0.vm04.stdout:7/473: mknod d4/df/d12/d13/d25/d30/cb4 0 2026-03-10T06:22:59.064 INFO:tasks.workunit.client.0.vm04.stdout:4/474: dwrite d2/d32/d5c/f4b [0,4194304] 0 2026-03-10T06:22:59.066 INFO:tasks.workunit.client.0.vm04.stdout:6/467: getdents d2/d37 0 2026-03-10T06:22:59.067 INFO:tasks.workunit.client.0.vm04.stdout:7/474: dread d4/df/d12/d13/d25/d28/d36/f41 [0,4194304] 0 2026-03-10T06:22:59.075 INFO:tasks.workunit.client.0.vm04.stdout:2/469: sync 2026-03-10T06:22:59.075 INFO:tasks.workunit.client.0.vm04.stdout:2/470: stat d1/df/d2c/d37/d40/f64 0 2026-03-10T06:22:59.081 INFO:tasks.workunit.client.0.vm04.stdout:9/508: truncate d2/d8/d22/daa/f69 3516658 0 2026-03-10T06:22:59.082 INFO:tasks.workunit.client.0.vm04.stdout:2/471: dwrite d1/db/d69/f77 [0,4194304] 0 2026-03-10T06:22:59.111 INFO:tasks.workunit.client.0.vm04.stdout:4/475: creat d2/d32/d5c/f97 x:0 0 0 2026-03-10T06:22:59.114 INFO:tasks.workunit.client.0.vm04.stdout:7/475: creat d4/df/d12/d13/fb5 x:0 0 0 2026-03-10T06:22:59.119 INFO:tasks.workunit.client.0.vm04.stdout:7/476: stat d4/df/d12/d13/d25/l2b 0 2026-03-10T06:22:59.119 INFO:tasks.workunit.client.0.vm04.stdout:7/477: fsync d4/df/d12/f7f 0 2026-03-10T06:22:59.119 INFO:tasks.workunit.client.0.vm04.stdout:2/472: readlink d1/db/d69/d74/d87/l8a 0 2026-03-10T06:22:59.124 INFO:tasks.workunit.client.0.vm04.stdout:6/468: creat d2/d3a/d5e/f99 x:0 0 0 2026-03-10T06:22:59.126 INFO:tasks.workunit.client.0.vm04.stdout:4/476: mkdir d2/d32/d5c/d98 0 2026-03-10T06:22:59.126 INFO:tasks.workunit.client.0.vm04.stdout:7/478: creat d4/df/d12/d13/d25/d28/d3a/d58/fb6 x:0 0 0 2026-03-10T06:22:59.132 INFO:tasks.workunit.client.0.vm04.stdout:9/509: rename d2/d8/d14/d1d/d64/fa1 to d2/d3/d18/fc1 0 2026-03-10T06:22:59.133 INFO:tasks.workunit.client.0.vm04.stdout:4/477: dwrite d2/d32/d5c/d4f/f85 [0,4194304] 0 
2026-03-10T06:22:59.137 INFO:tasks.workunit.client.0.vm04.stdout:3/475: getdents d4/d6/d99 0 2026-03-10T06:22:59.138 INFO:tasks.workunit.client.0.vm04.stdout:4/478: readlink d2/d32/d5c/d76/l8c 0 2026-03-10T06:22:59.144 INFO:tasks.workunit.client.0.vm04.stdout:6/469: mkdir d2/d3a/d5e/d9a 0 2026-03-10T06:22:59.144 INFO:tasks.workunit.client.0.vm04.stdout:6/470: chown d2/d43/d2d/f42 6714 1 2026-03-10T06:22:59.150 INFO:tasks.workunit.client.0.vm04.stdout:8/507: dwrite df/f3f [0,4194304] 0 2026-03-10T06:22:59.156 INFO:tasks.workunit.client.0.vm04.stdout:9/510: readlink d2/l38 0 2026-03-10T06:22:59.176 INFO:tasks.workunit.client.0.vm04.stdout:5/451: write d4/d11/d7d/d52/f96 [1437280,41191] 0 2026-03-10T06:22:59.176 INFO:tasks.workunit.client.0.vm04.stdout:3/476: creat d4/da/df/d11/d5a/d5b/fa3 x:0 0 0 2026-03-10T06:22:59.176 INFO:tasks.workunit.client.0.vm04.stdout:3/477: stat d4/d6/d54/f81 0 2026-03-10T06:22:59.179 INFO:tasks.workunit.client.0.vm04.stdout:6/471: fsync d2/d43/d2d/d30/d34/d76/d7e/f81 0 2026-03-10T06:22:59.183 INFO:tasks.workunit.client.0.vm04.stdout:1/476: truncate d0/d3/d41/d4b/d5b/f5c 103996 0 2026-03-10T06:22:59.187 INFO:tasks.workunit.client.0.vm04.stdout:0/503: write d0/d5/d25/dd/f43 [1447018,37901] 0 2026-03-10T06:22:59.188 INFO:tasks.workunit.client.0.vm04.stdout:1/477: dwrite d0/d3/d41/fa3 [0,4194304] 0 2026-03-10T06:22:59.193 INFO:tasks.workunit.client.0.vm04.stdout:1/478: readlink d0/l3e 0 2026-03-10T06:22:59.197 INFO:tasks.workunit.client.0.vm04.stdout:8/508: symlink df/d20/d25/l9b 0 2026-03-10T06:22:59.198 INFO:tasks.workunit.client.0.vm04.stdout:8/509: chown df/d20/d25/d30/f4e 722947 1 2026-03-10T06:22:59.218 INFO:tasks.workunit.client.0.vm04.stdout:2/473: getdents d1/db/d20/d8f 0 2026-03-10T06:22:59.221 INFO:tasks.workunit.client.0.vm04.stdout:4/479: mkdir d2/d32/d94/d99 0 2026-03-10T06:22:59.224 INFO:tasks.workunit.client.0.vm04.stdout:5/452: creat d4/d11/d7d/d38/d91/d55/f9e x:0 0 0 2026-03-10T06:22:59.225 
INFO:tasks.workunit.client.0.vm04.stdout:3/478: readlink d4/d6/l74 0 2026-03-10T06:22:59.228 INFO:tasks.workunit.client.0.vm04.stdout:6/472: mkdir d2/d43/d9b 0 2026-03-10T06:22:59.233 INFO:tasks.workunit.client.0.vm04.stdout:6/473: dwrite d2/d8/d78/f79 [0,4194304] 0 2026-03-10T06:22:59.234 INFO:tasks.workunit.client.0.vm04.stdout:7/479: creat d4/df/d12/d13/d25/d28/fb7 x:0 0 0 2026-03-10T06:22:59.235 INFO:tasks.workunit.client.0.vm04.stdout:6/474: chown d2/d43/d2d/d30/d34/f52 31 1 2026-03-10T06:22:59.242 INFO:tasks.workunit.client.0.vm04.stdout:1/479: chown d0/d3/d41/l79 0 1 2026-03-10T06:22:59.243 INFO:tasks.workunit.client.0.vm04.stdout:8/510: creat df/d20/d25/d30/d70/d97/f9c x:0 0 0 2026-03-10T06:22:59.244 INFO:tasks.workunit.client.0.vm04.stdout:9/511: link d2/d23/d24/fbe d2/d23/d24/da9/fc2 0 2026-03-10T06:22:59.248 INFO:tasks.workunit.client.0.vm04.stdout:4/480: mkdir d2/d16/d2c/d9a 0 2026-03-10T06:22:59.248 INFO:tasks.workunit.client.0.vm04.stdout:9/512: readlink d2/d3/d18/d39/d11/l62 0 2026-03-10T06:22:59.258 INFO:tasks.workunit.client.0.vm04.stdout:7/480: dread d4/df/d12/f1c [4194304,4194304] 0 2026-03-10T06:22:59.260 INFO:tasks.workunit.client.0.vm04.stdout:8/511: dread df/d20/d25/d30/f4e [0,4194304] 0 2026-03-10T06:22:59.266 INFO:tasks.workunit.client.0.vm04.stdout:8/512: truncate df/d20/d25/d30/d70/d97/f6c 426635 0 2026-03-10T06:22:59.278 INFO:tasks.workunit.client.0.vm04.stdout:3/479: write d4/d6/dc/f22 [1557624,95203] 0 2026-03-10T06:22:59.279 INFO:tasks.workunit.client.0.vm04.stdout:5/453: dwrite d4/d11/d7d/f2c [0,4194304] 0 2026-03-10T06:22:59.283 INFO:tasks.workunit.client.0.vm04.stdout:1/480: unlink d0/d3/f34 0 2026-03-10T06:22:59.286 INFO:tasks.workunit.client.0.vm04.stdout:9/513: mkdir d2/d3/d18/d39/d46/d55/dc3 0 2026-03-10T06:22:59.289 INFO:tasks.workunit.client.0.vm04.stdout:6/475: mkdir d2/d3a/d9c 0 2026-03-10T06:22:59.291 INFO:tasks.workunit.client.0.vm04.stdout:3/480: fsync d4/da/df/d11/d5a/d5b/f98 0 2026-03-10T06:22:59.291 
INFO:tasks.workunit.client.0.vm04.stdout:7/481: write d4/df/d12/f1c [1891985,3880] 0 2026-03-10T06:22:59.294 INFO:tasks.workunit.client.0.vm04.stdout:5/454: dread d4/f79 [0,4194304] 0 2026-03-10T06:22:59.298 INFO:tasks.workunit.client.0.vm04.stdout:7/482: dread d4/df/d12/d13/d25/d30/d40/d79/f89 [0,4194304] 0 2026-03-10T06:22:59.299 INFO:tasks.workunit.client.0.vm04.stdout:7/483: chown d4/df/d12/d13/d25/d28/l8e 6098627 1 2026-03-10T06:22:59.302 INFO:tasks.workunit.client.0.vm04.stdout:0/504: rename d0/d1a/c33 to d0/d5/d25/dd/d5c/caa 0 2026-03-10T06:22:59.312 INFO:tasks.workunit.client.0.vm04.stdout:5/455: symlink d4/d11/d7d/d38/d91/l9f 0 2026-03-10T06:22:59.325 INFO:tasks.workunit.client.0.vm04.stdout:9/514: link d2/fb4 d2/d3/d18/d39/d46/fc4 0 2026-03-10T06:22:59.325 INFO:tasks.workunit.client.0.vm04.stdout:3/481: symlink d4/d6/d99/d7b/la4 0 2026-03-10T06:22:59.325 INFO:tasks.workunit.client.0.vm04.stdout:5/456: creat d4/d3b/fa0 x:0 0 0 2026-03-10T06:22:59.326 INFO:tasks.workunit.client.0.vm04.stdout:7/484: symlink d4/df/d12/d13/d25/d28/d36/d9c/db1/lb8 0 2026-03-10T06:22:59.327 INFO:tasks.workunit.client.0.vm04.stdout:0/505: creat d0/d5/d97/fab x:0 0 0 2026-03-10T06:22:59.329 INFO:tasks.workunit.client.0.vm04.stdout:0/506: write d0/d5/d25/dd/d92/fa4 [901757,5919] 0 2026-03-10T06:22:59.329 INFO:tasks.workunit.client.0.vm04.stdout:9/515: creat d2/d23/d24/da2/fc5 x:0 0 0 2026-03-10T06:22:59.330 INFO:tasks.workunit.client.0.vm04.stdout:0/507: stat d0/d5/d25/dd/d5c/f7a 0 2026-03-10T06:22:59.331 INFO:tasks.workunit.client.0.vm04.stdout:3/482: dread - d4/d6/d99/d7b/d21/d32/d4e/f73 zero size 2026-03-10T06:22:59.331 INFO:tasks.workunit.client.0.vm04.stdout:7/485: symlink d4/df/d12/d34/d63/lb9 0 2026-03-10T06:22:59.332 INFO:tasks.workunit.client.0.vm04.stdout:7/486: dread - d4/df/d12/d13/d25/d28/f9e zero size 2026-03-10T06:22:59.333 INFO:tasks.workunit.client.0.vm04.stdout:5/457: creat d4/d11/d7d/d38/d91/d4c/fa1 x:0 0 0 2026-03-10T06:22:59.336 
INFO:tasks.workunit.client.0.vm04.stdout:2/474: rename d1/df/f88 to d1/f91 0 2026-03-10T06:22:59.337 INFO:tasks.workunit.client.0.vm04.stdout:9/516: mknod d2/d3/d18/d39/d11/cc6 0 2026-03-10T06:22:59.340 INFO:tasks.workunit.client.0.vm04.stdout:3/483: mknod d4/d6/d99/d7b/d21/d32/d39/d64/ca5 0 2026-03-10T06:22:59.345 INFO:tasks.workunit.client.0.vm04.stdout:9/517: creat d2/d3/d18/d34/fc7 x:0 0 0 2026-03-10T06:22:59.346 INFO:tasks.workunit.client.0.vm04.stdout:3/484: rmdir d4/da/df/d11/d5a 39 2026-03-10T06:22:59.348 INFO:tasks.workunit.client.0.vm04.stdout:9/518: dread d2/d8/f99 [0,4194304] 0 2026-03-10T06:22:59.348 INFO:tasks.workunit.client.0.vm04.stdout:3/485: write d4/d6/d99/d7b/d21/d32/d8e/f95 [239473,44285] 0 2026-03-10T06:22:59.351 INFO:tasks.workunit.client.0.vm04.stdout:2/475: dwrite d1/df/d11/d14/d4e/f5c [0,4194304] 0 2026-03-10T06:22:59.363 INFO:tasks.workunit.client.0.vm04.stdout:1/481: rename d0/d3/d41/d4b/d5b/f8e to d0/d3/d41/fb1 0 2026-03-10T06:22:59.370 INFO:tasks.workunit.client.0.vm04.stdout:1/482: chown d0/d8/d46/f57 7332 1 2026-03-10T06:22:59.371 INFO:tasks.workunit.client.0.vm04.stdout:3/486: symlink d4/da/la6 0 2026-03-10T06:22:59.372 INFO:tasks.workunit.client.0.vm04.stdout:7/487: getdents d4/df/d12/d13/d25/d28/d36/d9c 0 2026-03-10T06:22:59.374 INFO:tasks.workunit.client.0.vm04.stdout:7/488: chown d4/df/d12/d13/d25/d30/d40/d50/f62 1624 1 2026-03-10T06:22:59.375 INFO:tasks.workunit.client.0.vm04.stdout:9/519: link d2/d8/f4a d2/d8/d53/fc8 0 2026-03-10T06:22:59.376 INFO:tasks.workunit.client.0.vm04.stdout:7/489: write d4/df/d12/d13/d25/d28/d3a/d58/f97 [694217,99576] 0 2026-03-10T06:22:59.379 INFO:tasks.workunit.client.0.vm04.stdout:1/483: dwrite d0/d8/d46/d7a/d95/fa6 [8388608,4194304] 0 2026-03-10T06:22:59.383 INFO:tasks.workunit.client.0.vm04.stdout:8/513: sync 2026-03-10T06:22:59.383 INFO:tasks.workunit.client.0.vm04.stdout:6/476: sync 2026-03-10T06:22:59.383 INFO:tasks.workunit.client.0.vm04.stdout:5/458: sync 2026-03-10T06:22:59.384 
INFO:tasks.workunit.client.0.vm04.stdout:1/484: stat d0/c2f 0
2026-03-10T06:22:59.384 INFO:tasks.workunit.client.0.vm04.stdout:6/477: chown d2/d8/c71 512879 1
2026-03-10T06:22:59.385 INFO:tasks.workunit.client.0.vm04.stdout:6/478: stat d2/d37/d83/c98 0
2026-03-10T06:22:59.386 INFO:tasks.workunit.client.0.vm04.stdout:2/476: dread d1/db/d20/d8f/f25 [0,4194304] 0
2026-03-10T06:22:59.390 INFO:tasks.workunit.client.0.vm04.stdout:2/477: dread - d1/db/d20/f86 zero size
2026-03-10T06:22:59.391 INFO:tasks.workunit.client.0.vm04.stdout:9/520: mknod d2/d8/d53/d6e/d89/cc9 0
2026-03-10T06:22:59.391 INFO:tasks.workunit.client.0.vm04.stdout:7/490: mknod d4/df/d12/d13/d25/d28/d3a/d58/cba 0
2026-03-10T06:22:59.393 INFO:tasks.workunit.client.0.vm04.stdout:7/491: dread - d4/df/d12/d13/d25/d28/f7d zero size
2026-03-10T06:22:59.394 INFO:tasks.workunit.client.0.vm04.stdout:7/492: chown d4/df/d12/d34/d63/f9a 137915643 1
2026-03-10T06:22:59.398 INFO:tasks.workunit.client.0.vm04.stdout:4/481: write d2/d32/d5c/f6a [182014,12853] 0
2026-03-10T06:22:59.401 INFO:tasks.workunit.client.0.vm04.stdout:6/479: fsync d2/d43/d2d/d30/f60 0
2026-03-10T06:22:59.402 INFO:tasks.workunit.client.0.vm04.stdout:9/521: symlink d2/d3/d18/d39/lca 0
2026-03-10T06:22:59.403 INFO:tasks.workunit.client.0.vm04.stdout:6/480: readlink d2/d43/d2d/d30/d1f/d3c/d75/l4e 0
2026-03-10T06:22:59.411 INFO:tasks.workunit.client.0.vm04.stdout:6/481: mknod d2/d8/d78/c9d 0
2026-03-10T06:22:59.412 INFO:tasks.workunit.client.0.vm04.stdout:6/482: read d2/d43/d2d/d30/f60 [4049927,230] 0
2026-03-10T06:22:59.413 INFO:tasks.workunit.client.0.vm04.stdout:8/514: dread df/f1d [0,4194304] 0
2026-03-10T06:22:59.421 INFO:tasks.workunit.client.0.vm04.stdout:0/508: write d0/d5/f70 [486771,108779] 0
2026-03-10T06:22:59.421 INFO:tasks.workunit.client.0.vm04.stdout:6/483: dread d2/d43/f31 [0,4194304] 0
2026-03-10T06:22:59.426 INFO:tasks.workunit.client.0.vm04.stdout:9/522: dwrite d2/d8/d14/f40 [0,4194304] 0
2026-03-10T06:22:59.428 INFO:tasks.workunit.client.0.vm04.stdout:8/515: creat df/d15/d2b/d81/f9d x:0 0 0
2026-03-10T06:22:59.429 INFO:tasks.workunit.client.0.vm04.stdout:7/493: link d4/df/d12/d21/c44 d4/df/d12/d13/d25/d28/d36/cbb 0
2026-03-10T06:22:59.429 INFO:tasks.workunit.client.0.vm04.stdout:9/523: write d2/d3/d18/d34/fc7 [295881,79332] 0
2026-03-10T06:22:59.430 INFO:tasks.workunit.client.0.vm04.stdout:9/524: stat d2/d3/d18/d39/d11 0
2026-03-10T06:22:59.432 INFO:tasks.workunit.client.0.vm04.stdout:9/525: readlink d2/d3/d18/l2f 0
2026-03-10T06:22:59.435 INFO:tasks.workunit.client.0.vm04.stdout:3/487: write d4/d6/d99/d7b/f2e [3357929,128028] 0
2026-03-10T06:22:59.435 INFO:tasks.workunit.client.0.vm04.stdout:2/478: write d1/db/f12 [7712208,42156] 0
2026-03-10T06:22:59.435 INFO:tasks.workunit.client.0.vm04.stdout:1/485: write d0/d8/f43 [697133,29309] 0
2026-03-10T06:22:59.444 INFO:tasks.workunit.client.0.vm04.stdout:8/516: chown df/d20/f42 4 1
2026-03-10T06:22:59.444 INFO:tasks.workunit.client.0.vm04.stdout:3/488: dread - d4/d6/d99/d7b/d21/d32/d39/d64/f67 zero size
2026-03-10T06:22:59.445 INFO:tasks.workunit.client.0.vm04.stdout:6/484: rename d2/d43/d2d/d30/f5a to d2/d43/d2d/d30/d34/d76/d8a/f9e 0
2026-03-10T06:22:59.446 INFO:tasks.workunit.client.0.vm04.stdout:4/482: dwrite d2/d16/f20 [4194304,4194304] 0
2026-03-10T06:22:59.447 INFO:tasks.workunit.client.0.vm04.stdout:2/479: creat d1/db/d20/d8f/d48/d67/f92 x:0 0 0
2026-03-10T06:22:59.448 INFO:tasks.workunit.client.0.vm04.stdout:8/517: write df/d15/d2b/f4d [2088334,68606] 0
2026-03-10T06:22:59.450 INFO:tasks.workunit.client.0.vm04.stdout:5/459: dwrite d4/d11/d7d/d38/d91/d55/f5a [0,4194304] 0
2026-03-10T06:22:59.454 INFO:tasks.workunit.client.0.vm04.stdout:8/518: truncate df/d15/d29/f7a 37110 0
2026-03-10T06:22:59.464 INFO:tasks.workunit.client.0.vm04.stdout:4/483: dwrite d2/f14 [4194304,4194304] 0
2026-03-10T06:22:59.464 INFO:tasks.workunit.client.0.vm04.stdout:1/486: link d0/f83 d0/d3/fb2 0
2026-03-10T06:22:59.464 INFO:tasks.workunit.client.0.vm04.stdout:0/509: getdents d0/d1a/d20/d38/d31/d47 0
2026-03-10T06:22:59.464 INFO:tasks.workunit.client.0.vm04.stdout:2/480: creat d1/db/d20/d8f/d35/d54/d5d/f93 x:0 0 0
2026-03-10T06:22:59.464 INFO:tasks.workunit.client.0.vm04.stdout:6/485: chown d2/d37/d83/l88 58331 1
2026-03-10T06:22:59.469 INFO:tasks.workunit.client.0.vm04.stdout:9/526: sync
2026-03-10T06:22:59.474 INFO:tasks.workunit.client.0.vm04.stdout:6/486: dwrite d2/d3a/f57 [0,4194304] 0
2026-03-10T06:22:59.478 INFO:tasks.workunit.client.0.vm04.stdout:5/460: fsync d4/d6/d80/d84/f9c 0
2026-03-10T06:22:59.486 INFO:tasks.workunit.client.0.vm04.stdout:0/510: dwrite d0/d5/d25/dd/d3a/d56/f84 [0,4194304] 0
2026-03-10T06:22:59.486 INFO:tasks.workunit.client.0.vm04.stdout:1/487: dwrite d0/d3/d41/fa3 [0,4194304] 0
2026-03-10T06:22:59.489 INFO:tasks.workunit.client.0.vm04.stdout:6/487: truncate d2/d37/d6e/f82 1193567 0
2026-03-10T06:22:59.497 INFO:tasks.workunit.client.0.vm04.stdout:2/481: dwrite d1/db/d20/f49 [0,4194304] 0
2026-03-10T06:22:59.506 INFO:tasks.workunit.client.0.vm04.stdout:4/484: dwrite d2/f47 [4194304,4194304] 0
2026-03-10T06:22:59.506 INFO:tasks.workunit.client.0.vm04.stdout:9/527: mkdir d2/d8/d3a/dcb 0
2026-03-10T06:22:59.511 INFO:tasks.workunit.client.0.vm04.stdout:9/528: write d2/d23/f31 [3234727,66756] 0
2026-03-10T06:22:59.517 INFO:tasks.workunit.client.0.vm04.stdout:0/511: truncate d0/d1a/d20/d38/f78 201366 0
2026-03-10T06:22:59.527 INFO:tasks.workunit.client.0.vm04.stdout:1/488: mkdir d0/d8/d46/db3 0
2026-03-10T06:22:59.527 INFO:tasks.workunit.client.0.vm04.stdout:4/485: creat d2/d16/d2c/f9b x:0 0 0
2026-03-10T06:22:59.527 INFO:tasks.workunit.client.0.vm04.stdout:5/461: symlink d4/d6/la2 0
2026-03-10T06:22:59.527 INFO:tasks.workunit.client.0.vm04.stdout:1/489: mkdir d0/d8/d46/db3/db4 0
2026-03-10T06:22:59.528 INFO:tasks.workunit.client.0.vm04.stdout:4/486: chown d2/d46/c2f 1 1
2026-03-10T06:22:59.528 INFO:tasks.workunit.client.0.vm04.stdout:5/462: rename d4/d11/d7d/f3d to d4/d11/d7d/d38/d91/d4c/fa3 0
2026-03-10T06:22:59.528 INFO:tasks.workunit.client.0.vm04.stdout:9/529: link d2/d3/caf d2/d23/d24/ccc 0
2026-03-10T06:22:59.531 INFO:tasks.workunit.client.0.vm04.stdout:4/487: dwrite d2/d16/d31/f66 [0,4194304] 0
2026-03-10T06:22:59.536 INFO:tasks.workunit.client.0.vm04.stdout:0/512: link d0/d5/d25/l86 d0/d5/d25/dd/d1d/d9c/lac 0
2026-03-10T06:22:59.545 INFO:tasks.workunit.client.0.vm04.stdout:9/530: symlink d2/lcd 0
2026-03-10T06:22:59.551 INFO:tasks.workunit.client.0.vm04.stdout:5/463: fsync d4/d11/f1f 0
2026-03-10T06:22:59.556 INFO:tasks.workunit.client.0.vm04.stdout:0/513: creat d0/d1a/d20/d38/d31/d47/d8a/d8d/fad x:0 0 0
2026-03-10T06:22:59.556 INFO:tasks.workunit.client.0.vm04.stdout:4/488: creat d2/d16/d31/d3f/d93/f9c x:0 0 0
2026-03-10T06:22:59.556 INFO:tasks.workunit.client.0.vm04.stdout:9/531: mknod d2/d8/cce 0
2026-03-10T06:22:59.556 INFO:tasks.workunit.client.0.vm04.stdout:5/464: mknod d4/d11/d7d/d38/d91/d4c/ca4 0
2026-03-10T06:22:59.556 INFO:tasks.workunit.client.0.vm04.stdout:4/489: chown d2/d46/f3c 17091 1
2026-03-10T06:22:59.571 INFO:tasks.workunit.client.0.vm04.stdout:7/494: truncate d4/df/d12/d13/d25/d28/d3a/d58/f77 2949335 0
2026-03-10T06:22:59.577 INFO:tasks.workunit.client.0.vm04.stdout:3/489: dwrite d4/d6/d99/d7b/d21/f83 [0,4194304] 0
2026-03-10T06:22:59.577 INFO:tasks.workunit.client.0.vm04.stdout:5/465: creat d4/d6/d80/d84/fa5 x:0 0 0
2026-03-10T06:22:59.579 INFO:tasks.workunit.client.0.vm04.stdout:7/495: fsync d4/df/d12/d34/f57 0
2026-03-10T06:22:59.584 INFO:tasks.workunit.client.0.vm04.stdout:3/490: rmdir d4/da 39
2026-03-10T06:22:59.585 INFO:tasks.workunit.client.0.vm04.stdout:5/466: dwrite d4/d11/d7d/f2c [0,4194304] 0
2026-03-10T06:22:59.588 INFO:tasks.workunit.client.0.vm04.stdout:5/467: fdatasync d4/d11/d7d/d38/d91/d4c/f88 0
2026-03-10T06:22:59.595 INFO:tasks.workunit.client.0.vm04.stdout:4/490: link d2/d8/f35 d2/d16/f9d 0
2026-03-10T06:22:59.599 INFO:tasks.workunit.client.0.vm04.stdout:7/496: rmdir d4/df/d90 0
2026-03-10T06:22:59.603 INFO:tasks.workunit.client.0.vm04.stdout:4/491: creat d2/d16/d2c/d6b/f9e x:0 0 0
2026-03-10T06:22:59.604 INFO:tasks.workunit.client.0.vm04.stdout:4/492: write d2/d32/d5c/f6a [273489,8167] 0
2026-03-10T06:22:59.607 INFO:tasks.workunit.client.0.vm04.stdout:5/468: creat d4/d11/d7d/fa6 x:0 0 0
2026-03-10T06:22:59.609 INFO:tasks.workunit.client.0.vm04.stdout:4/493: creat d2/d8/f9f x:0 0 0
2026-03-10T06:22:59.610 INFO:tasks.workunit.client.0.vm04.stdout:4/494: write d2/d32/d5c/d4f/f85 [5030432,109806] 0
2026-03-10T06:22:59.614 INFO:tasks.workunit.client.0.vm04.stdout:5/469: link d4/d11/f32 d4/d6/d80/d84/fa7 0
2026-03-10T06:22:59.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:22:59 vm06.local ceph-mon[58974]: pgmap v28: 65 pgs: 65 active+clean; 2.6 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 24 MiB/s rd, 89 MiB/s wr, 189 op/s
2026-03-10T06:22:59.622 INFO:tasks.workunit.client.0.vm04.stdout:5/470: mkdir d4/d3b/da8 0
2026-03-10T06:22:59.640 INFO:tasks.workunit.client.0.vm04.stdout:0/514: dread d0/d5/d25/f3c [0,4194304] 0
2026-03-10T06:22:59.644 INFO:tasks.workunit.client.0.vm04.stdout:0/515: creat d0/d5/d25/dd/d5c/d73/fae x:0 0 0
2026-03-10T06:22:59.647 INFO:tasks.workunit.client.0.vm04.stdout:0/516: link d0/d5/d97/fab d0/d5/d25/dd/d5c/d73/d82/faf 0
2026-03-10T06:22:59.649 INFO:tasks.workunit.client.0.vm04.stdout:9/532: sync
2026-03-10T06:22:59.649 INFO:tasks.workunit.client.0.vm04.stdout:3/491: sync
2026-03-10T06:22:59.651 INFO:tasks.workunit.client.0.vm04.stdout:0/517: creat d0/fb0 x:0 0 0
2026-03-10T06:22:59.653 INFO:tasks.workunit.client.0.vm04.stdout:0/518: mknod d0/d5/d25/cb1 0
2026-03-10T06:22:59.656 INFO:tasks.workunit.client.0.vm04.stdout:3/492: creat d4/fa7 x:0 0 0
2026-03-10T06:22:59.665 INFO:tasks.workunit.client.0.vm04.stdout:3/493: creat d4/d6/d54/fa8 x:0 0 0
2026-03-10T06:22:59.665 INFO:tasks.workunit.client.0.vm04.stdout:0/519: creat d0/d5/d25/dd/d5c/fb2 x:0 0 0
2026-03-10T06:22:59.667 INFO:tasks.workunit.client.0.vm04.stdout:7/497: dread d4/df/d12/d13/d25/d28/d36/f64 [0,4194304] 0
2026-03-10T06:22:59.667 INFO:tasks.workunit.client.0.vm04.stdout:3/494: chown d4/d6/d99/d7b/d21/d32/f58 24798 1
2026-03-10T06:22:59.669 INFO:tasks.workunit.client.0.vm04.stdout:3/495: chown d4/da/df/l90 2 1
2026-03-10T06:22:59.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:22:59 vm04.local ceph-mon[51058]: pgmap v28: 65 pgs: 65 active+clean; 2.6 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 24 MiB/s rd, 89 MiB/s wr, 189 op/s
2026-03-10T06:22:59.677 INFO:tasks.workunit.client.0.vm04.stdout:0/520: rename d0/d1a/d20/d38/d31/d47/d8a/l95 to d0/d5/d25/lb3 0
2026-03-10T06:22:59.677 INFO:tasks.workunit.client.0.vm04.stdout:7/498: mknod d4/df/d12/d13/d25/d28/d3a/db0/cbc 0
2026-03-10T06:22:59.679 INFO:tasks.workunit.client.0.vm04.stdout:0/521: fdatasync d0/d5/d25/dd/d5c/d73/fae 0
2026-03-10T06:22:59.686 INFO:tasks.workunit.client.0.vm04.stdout:2/482: dread d1/db/d20/d8f/f19 [4194304,4194304] 0
2026-03-10T06:22:59.688 INFO:tasks.workunit.client.0.vm04.stdout:2/483: truncate d1/db/d20/d8f/d35/d54/d5d/f93 13495 0
2026-03-10T06:22:59.690 INFO:tasks.workunit.client.0.vm04.stdout:8/519: write df/d15/d29/f3e [1298846,25432] 0
2026-03-10T06:22:59.692 INFO:tasks.workunit.client.0.vm04.stdout:8/520: chown df/d15/d2b/f4c 1222 1
2026-03-10T06:22:59.693 INFO:tasks.workunit.client.0.vm04.stdout:7/499: dread d4/fb [0,4194304] 0
2026-03-10T06:22:59.695 INFO:tasks.workunit.client.0.vm04.stdout:7/500: fsync d4/df/d12/d13/d25/f2f 0
2026-03-10T06:22:59.695 INFO:tasks.workunit.client.0.vm04.stdout:2/484: write d1/df/d11/f16 [4984995,33305] 0
2026-03-10T06:22:59.701 INFO:tasks.workunit.client.0.vm04.stdout:2/485: dread d1/df/d2c/f58 [0,4194304] 0
2026-03-10T06:22:59.706 INFO:tasks.workunit.client.0.vm04.stdout:2/486: dwrite d1/db/d20/f49 [0,4194304] 0
2026-03-10T06:22:59.708 INFO:tasks.workunit.client.0.vm04.stdout:8/521: dread df/d15/d2b/f33 [0,4194304] 0
2026-03-10T06:22:59.715 INFO:tasks.workunit.client.0.vm04.stdout:7/501: mkdir d4/df/d12/d34/dbd 0
2026-03-10T06:22:59.726 INFO:tasks.workunit.client.0.vm04.stdout:6/488: truncate d2/d43/f24 2358669 0
2026-03-10T06:22:59.732 INFO:tasks.workunit.client.0.vm04.stdout:1/490: dwrite d0/d3/d41/fb1 [0,4194304] 0
2026-03-10T06:22:59.743 INFO:tasks.workunit.client.0.vm04.stdout:2/487: mkdir d1/db/d72/d94 0
2026-03-10T06:22:59.743 INFO:tasks.workunit.client.0.vm04.stdout:4/495: rmdir d2/d16/d2c/d6b 39
2026-03-10T06:22:59.744 INFO:tasks.workunit.client.0.vm04.stdout:5/471: write d4/d11/d7d/f36 [1342265,18916] 0
2026-03-10T06:22:59.744 INFO:tasks.workunit.client.0.vm04.stdout:4/496: chown d2/d32/f7c 2558451 1
2026-03-10T06:22:59.745 INFO:tasks.workunit.client.0.vm04.stdout:2/488: truncate d1/f91 7202 0
2026-03-10T06:22:59.746 INFO:tasks.workunit.client.0.vm04.stdout:2/489: chown d1/df/d2c/d37/d59 284 1
2026-03-10T06:22:59.752 INFO:tasks.workunit.client.0.vm04.stdout:7/502: creat d4/df/d12/d13/d25/d28/d36/d9c/db1/fbe x:0 0 0
2026-03-10T06:22:59.752 INFO:tasks.workunit.client.0.vm04.stdout:9/533: write d2/d3/d18/d39/d11/f35 [831134,41063] 0
2026-03-10T06:22:59.752 INFO:tasks.workunit.client.0.vm04.stdout:4/497: dwrite d2/d32/d5c/d76/f95 [0,4194304] 0
2026-03-10T06:22:59.752 INFO:tasks.workunit.client.0.vm04.stdout:4/498: readlink d2/d16/d2c/l8b 0
2026-03-10T06:22:59.760 INFO:tasks.workunit.client.0.vm04.stdout:4/499: dwrite d2/d16/d31/d3f/f52 [0,4194304] 0
2026-03-10T06:22:59.777 INFO:tasks.workunit.client.0.vm04.stdout:3/496: rmdir d4/d6/d54 39
2026-03-10T06:22:59.799 INFO:tasks.workunit.client.0.vm04.stdout:6/489: creat d2/d43/d86/f9f x:0 0 0
2026-03-10T06:22:59.799 INFO:tasks.workunit.client.0.vm04.stdout:6/490: chown d2/d43/lc 140 1
2026-03-10T06:22:59.801 INFO:tasks.workunit.client.0.vm04.stdout:0/522: write d0/d5/f3e [1577546,97284] 0
2026-03-10T06:22:59.802 INFO:tasks.workunit.client.0.vm04.stdout:0/523: chown d0/d5/d25/f5f 1909413 1
2026-03-10T06:22:59.814 INFO:tasks.workunit.client.0.vm04.stdout:1/491: creat d0/d8/d46/fb5 x:0 0 0
2026-03-10T06:22:59.815 INFO:tasks.workunit.client.0.vm04.stdout:1/492: write d0/f59 [3320334,90492] 0
2026-03-10T06:22:59.816 INFO:tasks.workunit.client.0.vm04.stdout:5/472: chown d4/d11/d7d/c53 1588 1
2026-03-10T06:22:59.817 INFO:tasks.workunit.client.0.vm04.stdout:1/493: dread - d0/d3/d80/f91 zero size
2026-03-10T06:22:59.819 INFO:tasks.workunit.client.0.vm04.stdout:1/494: fdatasync d0/d3/f20 0
2026-03-10T06:22:59.840 INFO:tasks.workunit.client.0.vm04.stdout:2/490: symlink d1/db/d69/l95 0
2026-03-10T06:22:59.850 INFO:tasks.workunit.client.0.vm04.stdout:9/534: creat d2/d8/d22/fcf x:0 0 0
2026-03-10T06:22:59.852 INFO:tasks.workunit.client.0.vm04.stdout:4/500: creat d2/d16/d31/d42/fa0 x:0 0 0
2026-03-10T06:22:59.855 INFO:tasks.workunit.client.0.vm04.stdout:4/501: write d2/d16/f73 [89677,76304] 0
2026-03-10T06:22:59.857 INFO:tasks.workunit.client.0.vm04.stdout:3/497: dread d4/da/df/d11/f57 [0,4194304] 0
2026-03-10T06:22:59.857 INFO:tasks.workunit.client.0.vm04.stdout:4/502: write d2/d16/d31/d3f/d93/f9c [55703,30551] 0
2026-03-10T06:22:59.857 INFO:tasks.workunit.client.0.vm04.stdout:3/498: write d4/d6/f12 [12180549,49025] 0
2026-03-10T06:22:59.876 INFO:tasks.workunit.client.0.vm04.stdout:0/524: creat d0/d1a/d20/d38/fb4 x:0 0 0
2026-03-10T06:22:59.876 INFO:tasks.workunit.client.0.vm04.stdout:5/473: unlink d4/d3b/f9b 0
2026-03-10T06:22:59.877 INFO:tasks.workunit.client.0.vm04.stdout:0/525: chown d0/d1a/d20/d38/d31/d79/ca8 1 1
2026-03-10T06:22:59.885 INFO:tasks.workunit.client.0.vm04.stdout:1/495: creat d0/d3/d41/d4b/d5b/fb6 x:0 0 0
2026-03-10T06:22:59.887 INFO:tasks.workunit.client.0.vm04.stdout:1/496: chown d0/d3/d41/d4b/d5b/c70 60 1
2026-03-10T06:22:59.891 INFO:tasks.workunit.client.0.vm04.stdout:9/535: readlink d2/d23/d24/da9/lb6 0
2026-03-10T06:22:59.891 INFO:tasks.workunit.client.0.vm04.stdout:4/503: mkdir d2/d16/d31/d3f/da1 0
2026-03-10T06:22:59.893 INFO:tasks.workunit.client.0.vm04.stdout:8/522: dwrite df/d15/d2b/f2f [4194304,4194304] 0
2026-03-10T06:22:59.904 INFO:tasks.workunit.client.0.vm04.stdout:5/474: truncate d4/d11/d7d/f44 1167514 0
2026-03-10T06:22:59.905 INFO:tasks.workunit.client.0.vm04.stdout:5/475: read d4/d6/fa [656864,116684] 0
2026-03-10T06:22:59.906 INFO:tasks.workunit.client.0.vm04.stdout:1/497: read d0/d3/d41/d4b/d5b/f6f [5421562,60643] 0
2026-03-10T06:22:59.908 INFO:tasks.workunit.client.0.vm04.stdout:4/504: truncate d2/d46/f3c 479179 0
2026-03-10T06:22:59.909 INFO:tasks.workunit.client.0.vm04.stdout:7/503: link d4/df/d12/d13/d25/d28/d36/f64 d4/fbf 0
2026-03-10T06:22:59.912 INFO:tasks.workunit.client.0.vm04.stdout:3/499: rename d4/f10 to d4/da/df/d11/d50/fa9 0
2026-03-10T06:22:59.914 INFO:tasks.workunit.client.0.vm04.stdout:9/536: rename d2 to d2/d8/d53/dd0 22
2026-03-10T06:22:59.923 INFO:tasks.workunit.client.0.vm04.stdout:0/526: dwrite d0/d5/d25/dd/d5c/d73/fa5 [0,4194304] 0
2026-03-10T06:22:59.923 INFO:tasks.workunit.client.0.vm04.stdout:2/491: dread d1/df/d11/f7e [0,4194304] 0
2026-03-10T06:22:59.924 INFO:tasks.workunit.client.0.vm04.stdout:1/498: dread d0/d3/d41/d4b/f6b [0,4194304] 0
2026-03-10T06:22:59.924 INFO:tasks.workunit.client.0.vm04.stdout:5/476: truncate d4/f79 902145 0
2026-03-10T06:22:59.925 INFO:tasks.workunit.client.0.vm04.stdout:5/477: stat d4/d11/d7d/f31 0
2026-03-10T06:22:59.925 INFO:tasks.workunit.client.0.vm04.stdout:3/500: dwrite d4/d6/d99/d7b/f27 [0,4194304] 0
2026-03-10T06:22:59.925 INFO:tasks.workunit.client.0.vm04.stdout:5/478: write d4/d11/f7b [231366,77898] 0
2026-03-10T06:22:59.932 INFO:tasks.workunit.client.0.vm04.stdout:4/505: fsync d2/d16/d2c/d6b/f9e 0
2026-03-10T06:22:59.963 INFO:tasks.workunit.client.0.vm04.stdout:7/504: dwrite d4/df/d12/d13/d25/d28/d3a/d58/f5a [0,4194304] 0
2026-03-10T06:22:59.964 INFO:tasks.workunit.client.0.vm04.stdout:6/491: link d2/d43/d2d/d30/d1f/d3c/f27 d2/fa0 0
2026-03-10T06:22:59.965 INFO:tasks.workunit.client.0.vm04.stdout:8/523: link df/d20/c75 df/d20/d25/d73/c9e 0
2026-03-10T06:22:59.974 INFO:tasks.workunit.client.0.vm04.stdout:5/479: mknod d4/d11/d7d/d38/d91/ca9 0
2026-03-10T06:22:59.974 INFO:tasks.workunit.client.0.vm04.stdout:1/499: creat d0/d8/d46/fb7 x:0 0 0
2026-03-10T06:22:59.975 INFO:tasks.workunit.client.0.vm04.stdout:5/480: dread - d4/d11/d7d/f90 zero size
2026-03-10T06:22:59.977 INFO:tasks.workunit.client.0.vm04.stdout:5/481: chown d4/c46 542693 1
2026-03-10T06:22:59.979 INFO:tasks.workunit.client.0.vm04.stdout:6/492: write d2/d43/d2d/d30/f93 [2091508,110954] 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:5/482: write d4/d6/d50/f61 [4841737,74792] 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:8/524: creat df/d20/d25/d30/d65/f9f x:0 0 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:1/500: creat d0/d3/d41/fb8 x:0 0 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:8/525: truncate df/d20/d25/d30/d55/f95 53982 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:9/537: creat d2/d8/d22/fd1 x:0 0 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:1/501: mknod d0/d8/d46/d7a/d95/cb9 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:7/505: creat d4/df/d12/d13/d25/d28/fc0 x:0 0 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:6/493: unlink d2/d43/d2d/c7b 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:8/526: mknod df/d20/d25/d30/d70/d97/d67/ca0 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:8/527: readlink df/d20/d25/d30/d55/l5a 0
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:6/494: chown d2/d43/d2d/d30/d1f 5127 1
2026-03-10T06:22:59.999 INFO:tasks.workunit.client.0.vm04.stdout:0/527: getdents d0/d5/d25/dd 0
2026-03-10T06:23:00.004 INFO:tasks.workunit.client.0.vm04.stdout:7/506: creat d4/df/d12/d13/d25/d28/d36/d9c/fc1 x:0 0 0
2026-03-10T06:23:00.007 INFO:tasks.workunit.client.0.vm04.stdout:8/528: dread df/d20/d25/f44 [0,4194304] 0
2026-03-10T06:23:00.013 INFO:tasks.workunit.client.0.vm04.stdout:6/495: dwrite d2/d37/d6e/f95 [0,4194304] 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:1/502: link d0/cb d0/d3/d41/cba 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:1/503: dread - d0/d8/d46/fb7 zero size
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:6/496: mknod d2/d8/ca1 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:9/538: getdents d2/d3/d18/d39/d11/d42 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:7/507: link d4/df/d12/d34/l7e d4/df/d12/d13/db3/lc2 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:6/497: symlink d2/d43/d86/la2 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:0/528: rename d0/d1a/f3b to d0/d5/d25/dd/d5c/fb5 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:6/498: readlink d2/d43/d2d/d30/l33 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:0/529: dwrite d0/d5/d25/f23 [0,4194304] 0
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:0/530: dread - d0/d5/d25/f6f zero size
2026-03-10T06:23:00.040 INFO:tasks.workunit.client.0.vm04.stdout:6/499: symlink d2/d3a/la3 0
2026-03-10T06:23:00.045 INFO:tasks.workunit.client.0.vm04.stdout:0/531: rmdir d0/d5/d97 39
2026-03-10T06:23:00.046 INFO:tasks.workunit.client.0.vm04.stdout:0/532: read - d0/d5/d25/dd/d1d/fa2 zero size
2026-03-10T06:23:00.050 INFO:tasks.workunit.client.0.vm04.stdout:8/529: rename df/d20/d25/d30/d70/c71 to df/d15/d2b/ca1 0
2026-03-10T06:23:00.051 INFO:tasks.workunit.client.0.vm04.stdout:6/500: creat d2/d3a/d5e/fa4 x:0 0 0
2026-03-10T06:23:00.051 INFO:tasks.workunit.client.0.vm04.stdout:6/501: readlink d2/d3a/la3 0
2026-03-10T06:23:00.052 INFO:tasks.workunit.client.0.vm04.stdout:6/502: write d2/d37/d6e/f77 [1004331,120149] 0
2026-03-10T06:23:00.056 INFO:tasks.workunit.client.0.vm04.stdout:6/503: dwrite d2/d37/d6e/f70 [0,4194304] 0
2026-03-10T06:23:00.058 INFO:tasks.workunit.client.0.vm04.stdout:4/506: sync
2026-03-10T06:23:00.058 INFO:tasks.workunit.client.0.vm04.stdout:3/501: sync
2026-03-10T06:23:00.063 INFO:tasks.workunit.client.0.vm04.stdout:4/507: write d2/d8/f89 [917406,101549] 0
2026-03-10T06:23:00.063 INFO:tasks.workunit.client.0.vm04.stdout:4/508: dread - d2/d16/d31/d42/fa0 zero size
2026-03-10T06:23:00.063 INFO:tasks.workunit.client.0.vm04.stdout:4/509: dread d2/d46/f26 [4194304,4194304] 0
2026-03-10T06:23:00.066 INFO:tasks.workunit.client.0.vm04.stdout:1/504: rename d0/d8/l31 to d0/d3/d41/d99/lbb 0
2026-03-10T06:23:00.073 INFO:tasks.workunit.client.0.vm04.stdout:4/510: chown d2/d32/d5c/f4b 5 1
2026-03-10T06:23:00.073 INFO:tasks.workunit.client.0.vm04.stdout:0/533: symlink d0/d1a/d20/lb6 0
2026-03-10T06:23:00.073 INFO:tasks.workunit.client.0.vm04.stdout:4/511: chown d2/d16/d56 577891 1
2026-03-10T06:23:00.075 INFO:tasks.workunit.client.0.vm04.stdout:0/534: read d0/f16 [169611,50035] 0
2026-03-10T06:23:00.075 INFO:tasks.workunit.client.0.vm04.stdout:0/535: read - d0/d5/d25/dd/d5c/fb2 zero size
2026-03-10T06:23:00.076 INFO:tasks.workunit.client.0.vm04.stdout:9/539: rename d2/d23/c59 to d2/d8/d53/d6e/cd2 0
2026-03-10T06:23:00.079 INFO:tasks.workunit.client.0.vm04.stdout:3/502: symlink d4/da/df/d11/d50/laa 0
2026-03-10T06:23:00.098 INFO:tasks.workunit.client.0.vm04.stdout:9/540: write f0 [5974018,87821] 0
2026-03-10T06:23:00.098 INFO:tasks.workunit.client.0.vm04.stdout:4/512: link d2/d16/d31/d3f/f52 d2/d16/d31/d3f/d93/fa2 0
2026-03-10T06:23:00.098 INFO:tasks.workunit.client.0.vm04.stdout:4/513: chown d2/c19 1690871 1
2026-03-10T06:23:00.098 INFO:tasks.workunit.client.0.vm04.stdout:4/514: chown d2/d32/d5c/d4f/d51/l59 257 1
2026-03-10T06:23:00.098 INFO:tasks.workunit.client.0.vm04.stdout:9/541: creat d2/d8/d53/fd3 x:0 0 0
2026-03-10T06:23:00.099 INFO:tasks.workunit.client.0.vm04.stdout:7/508: rename d4/df/d12/d34/d63/c76 to d4/df/d12/d13/d25/cc3 0
2026-03-10T06:23:00.104 INFO:tasks.workunit.client.0.vm04.stdout:9/542: mkdir d2/d23/d24/dd4 0
2026-03-10T06:23:00.104 INFO:tasks.workunit.client.0.vm04.stdout:4/515: mkdir d2/d16/da3 0
2026-03-10T06:23:00.105 INFO:tasks.workunit.client.0.vm04.stdout:4/516: write d2/d8/f9f [590489,128779] 0
2026-03-10T06:23:00.110 INFO:tasks.workunit.client.0.vm04.stdout:7/509: dread d4/df/d12/d13/d25/d28/d3a/d58/d68/f7c [0,4194304] 0
2026-03-10T06:23:00.111 INFO:tasks.workunit.client.0.vm04.stdout:1/505: sync
2026-03-10T06:23:00.112 INFO:tasks.workunit.client.0.vm04.stdout:4/517: dwrite d2/d16/f20 [4194304,4194304] 0
2026-03-10T06:23:00.132 INFO:tasks.workunit.client.0.vm04.stdout:2/492: write d1/df/d2c/f3d [2575275,24329] 0
2026-03-10T06:23:00.138 INFO:tasks.workunit.client.0.vm04.stdout:5/483: truncate d4/ff 4233299 0
2026-03-10T06:23:00.141 INFO:tasks.workunit.client.0.vm04.stdout:9/543: creat d2/d3/d18/d39/d46/d84/fd5 x:0 0 0
2026-03-10T06:23:00.142 INFO:tasks.workunit.client.0.vm04.stdout:1/506: creat d0/d8/d46/d7a/fbc x:0 0 0
2026-03-10T06:23:00.146 INFO:tasks.workunit.client.0.vm04.stdout:7/510: mkdir d4/df/d12/d13/d25/d28/d36/d9c/db1/dc4 0
2026-03-10T06:23:00.152 INFO:tasks.workunit.client.0.vm04.stdout:6/504: dread d2/d37/d6e/f77 [0,4194304] 0
2026-03-10T06:23:00.155 INFO:tasks.workunit.client.0.vm04.stdout:6/505: chown d2/d3a/f90 56362496 1
2026-03-10T06:23:00.161 INFO:tasks.workunit.client.0.vm04.stdout:2/493: dwrite d1/df/d2c/f58 [0,4194304] 0
2026-03-10T06:23:00.166 INFO:tasks.workunit.client.0.vm04.stdout:7/511: mknod d4/df/d12/d13/d25/d28/d36/d9c/db1/cc5 0
2026-03-10T06:23:00.167 INFO:tasks.workunit.client.0.vm04.stdout:7/512: readlink d4/df/d12/l3e 0
2026-03-10T06:23:00.174 INFO:tasks.workunit.client.0.vm04.stdout:5/484: mknod d4/d6/d80/d84/d99/caa 0
2026-03-10T06:23:00.175 INFO:tasks.workunit.client.0.vm04.stdout:0/536: dread d0/d5/d25/dd/d5c/d73/f4f [0,4194304] 0
2026-03-10T06:23:00.175 INFO:tasks.workunit.client.0.vm04.stdout:0/537: stat d0/d1a/d4d/c87 0
2026-03-10T06:23:00.184 INFO:tasks.workunit.client.0.vm04.stdout:9/544: creat d2/d3/d18/d39/d11/da5/fd6 x:0 0 0
2026-03-10T06:23:00.185 INFO:tasks.workunit.client.0.vm04.stdout:5/485: dwrite d4/d3b/f6d [0,4194304] 0
2026-03-10T06:23:00.192 INFO:tasks.workunit.client.0.vm04.stdout:0/538: read d0/d5/d25/dd/d3a/f57 [699017,17757] 0
2026-03-10T06:23:00.216 INFO:tasks.workunit.client.0.vm04.stdout:6/506: dread d2/d43/f3b [0,4194304] 0
2026-03-10T06:23:00.216 INFO:tasks.workunit.client.0.vm04.stdout:6/507: stat d2/d43/d2d/c6c 0
2026-03-10T06:23:00.235 INFO:tasks.workunit.client.0.vm04.stdout:6/508: sync
2026-03-10T06:23:00.242 INFO:tasks.workunit.client.0.vm04.stdout:6/509: chown d2/d43/d86/f9f 95883552 1
2026-03-10T06:23:00.272 INFO:tasks.workunit.client.0.vm04.stdout:7/513: fsync d4/f7a 0
2026-03-10T06:23:00.278 INFO:tasks.workunit.client.0.vm04.stdout:8/530: unlink df/d15/d2b/ca1 0
2026-03-10T06:23:00.286 INFO:tasks.workunit.client.0.vm04.stdout:0/539: mknod d0/d5/d25/dd/d1d/d9c/cb7 0
2026-03-10T06:23:00.296 INFO:tasks.workunit.client.0.vm04.stdout:9/545: dread d2/d3/d18/d39/d11/f71 [0,4194304] 0
2026-03-10T06:23:00.301 INFO:tasks.workunit.client.0.vm04.stdout:3/503: dwrite d4/d6/d99/d7b/f45 [0,4194304] 0
2026-03-10T06:23:00.307 INFO:tasks.workunit.client.0.vm04.stdout:9/546: sync
2026-03-10T06:23:00.308 INFO:tasks.workunit.client.0.vm04.stdout:9/547: sync
2026-03-10T06:23:00.309 INFO:tasks.workunit.client.0.vm04.stdout:9/548: fsync d2/d8/d14/f40 0
2026-03-10T06:23:00.312 INFO:tasks.workunit.client.0.vm04.stdout:9/549: dwrite d2/d8/d14/f27 [4194304,4194304] 0
2026-03-10T06:23:00.355 INFO:tasks.workunit.client.0.vm04.stdout:1/507: write d0/f64 [770881,57738] 0
2026-03-10T06:23:00.355 INFO:tasks.workunit.client.0.vm04.stdout:1/508: dread - d0/d8/fab zero size
2026-03-10T06:23:00.357 INFO:tasks.workunit.client.0.vm04.stdout:7/514: rename d4/df/d12/d13/c82 to d4/df/d12/d13/d25/d28/d3a/db0/cc6 0
2026-03-10T06:23:00.358 INFO:tasks.workunit.client.0.vm04.stdout:4/518: truncate d2/d32/d5c/f4b 2718770 0
2026-03-10T06:23:00.366 INFO:tasks.workunit.client.0.vm04.stdout:2/494: rmdir d1/db/d20/d75 0
2026-03-10T06:23:00.367 INFO:tasks.workunit.client.0.vm04.stdout:5/486: mkdir d4/d11/d7d/dab 0
2026-03-10T06:23:00.391 INFO:tasks.workunit.client.0.vm04.stdout:1/509: symlink d0/d3/d80/lbd 0
2026-03-10T06:23:00.409 INFO:tasks.workunit.client.0.vm04.stdout:2/495: symlink d1/db/d72/l96 0
2026-03-10T06:23:00.409 INFO:tasks.workunit.client.0.vm04.stdout:2/496: read - d1/df/d2c/f5f zero size
2026-03-10T06:23:00.418 INFO:tasks.workunit.client.0.vm04.stdout:1/510: write d0/f6a [4433298,3639] 0
2026-03-10T06:23:00.419 INFO:tasks.workunit.client.0.vm04.stdout:4/519: creat d2/d16/da3/fa4 x:0 0 0
2026-03-10T06:23:00.422 INFO:tasks.workunit.client.0.vm04.stdout:4/520: dwrite d2/d32/d5c/d76/f95 [0,4194304] 0
2026-03-10T06:23:00.428 INFO:tasks.workunit.client.0.vm04.stdout:8/531: link df/d20/f64 df/d20/d25/d30/d65/d8f/fa2 0
2026-03-10T06:23:00.460 INFO:tasks.workunit.client.0.vm04.stdout:0/540: truncate d0/d5/d25/dd/d5c/f8f 4203707 0
2026-03-10T06:23:00.460 INFO:tasks.workunit.client.0.vm04.stdout:1/511: mknod d0/d8/d46/d7a/cbe 0
2026-03-10T06:23:00.461 INFO:tasks.workunit.client.0.vm04.stdout:4/521: unlink d2/d16/d2c/f6f 0
2026-03-10T06:23:00.462 INFO:tasks.workunit.client.0.vm04.stdout:2/497: creat d1/db/d72/d94/f97 x:0 0 0
2026-03-10T06:23:00.477 INFO:tasks.workunit.client.0.vm04.stdout:0/541: dread d0/f4 [0,4194304] 0
2026-03-10T06:23:00.488 INFO:tasks.workunit.client.0.vm04.stdout:4/522: creat d2/d46/fa5 x:0 0 0
2026-03-10T06:23:00.488 INFO:tasks.workunit.client.0.vm04.stdout:4/523: fsync d2/d46/f61 0
2026-03-10T06:23:00.489 INFO:tasks.workunit.client.0.vm04.stdout:8/532: mkdir df/d15/d29/da3 0
2026-03-10T06:23:00.491 INFO:tasks.workunit.client.0.vm04.stdout:4/524: dwrite d2/d16/f20 [4194304,4194304] 0
2026-03-10T06:23:00.493 INFO:tasks.workunit.client.0.vm04.stdout:0/542: sync
2026-03-10T06:23:00.496 INFO:tasks.workunit.client.0.vm04.stdout:4/525: dread d2/d8/f9f [0,4194304] 0
2026-03-10T06:23:00.496 INFO:tasks.workunit.client.0.vm04.stdout:4/526: chown d2/d16/d31/f50 524 1
2026-03-10T06:23:00.514 INFO:tasks.workunit.client.0.vm04.stdout:8/533: creat df/d20/d25/d30/d70/d97/d67/fa4 x:0 0 0
2026-03-10T06:23:00.522 INFO:tasks.workunit.client.0.vm04.stdout:0/543: mkdir d0/d1a/db8 0
2026-03-10T06:23:00.523 INFO:tasks.workunit.client.0.vm04.stdout:4/527: fdatasync d2/f12 0
2026-03-10T06:23:00.532 INFO:tasks.workunit.client.0.vm04.stdout:4/528: mknod d2/d16/d31/d3f/ca6 0
2026-03-10T06:23:00.534 INFO:tasks.workunit.client.0.vm04.stdout:8/534: getdents df/d20/d25/d30/d70/d97 0
2026-03-10T06:23:00.534 INFO:tasks.workunit.client.0.vm04.stdout:4/529: dread - d2/d16/d31/d3f/f64 zero size
2026-03-10T06:23:00.547 INFO:tasks.workunit.client.0.vm04.stdout:4/530: dwrite d2/d16/f20 [0,4194304] 0
2026-03-10T06:23:00.556 INFO:tasks.workunit.client.0.vm04.stdout:7/515: rmdir d4/df/d12/d13/d25/d28 39
2026-03-10T06:23:00.561 INFO:tasks.workunit.client.0.vm04.stdout:4/531: rename d2/d16/d31/f50 to d2/d16/d56/fa7 0
2026-03-10T06:23:00.565 INFO:tasks.workunit.client.0.vm04.stdout:4/532: dread d2/d32/d5c/d76/f95 [0,4194304] 0
2026-03-10T06:23:00.574 INFO:tasks.workunit.client.0.vm04.stdout:7/516: dwrite d4/fa2 [0,4194304] 0
2026-03-10T06:23:00.596 INFO:tasks.workunit.client.0.vm04.stdout:7/517: rename d4/df/d12/d13/d25/d28/d36/f88 to d4/df/d12/d13/fc7 0
2026-03-10T06:23:00.609 INFO:tasks.workunit.client.0.vm04.stdout:7/518: chown d4/df/d12/d21/c6e 34 1
2026-03-10T06:23:00.614 INFO:tasks.workunit.client.0.vm04.stdout:7/519: symlink d4/df/d12/d34/dbd/lc8 0
2026-03-10T06:23:00.615 INFO:tasks.workunit.client.0.vm04.stdout:7/520: fsync d4/df/d12/d21/f94 0
2026-03-10T06:23:00.623 INFO:tasks.workunit.client.0.vm04.stdout:7/521: getdents d4/df/d12/d13/d25/d28/d36/d9c/db1/dc4 0
2026-03-10T06:23:00.638 INFO:tasks.workunit.client.0.vm04.stdout:7/522: fsync d4/df/d12/f18 0
2026-03-10T06:23:00.643 INFO:tasks.workunit.client.0.vm04.stdout:3/504: write d4/d6/d54/f81 [793681,7845] 0
2026-03-10T06:23:00.644 INFO:tasks.workunit.client.0.vm04.stdout:6/510: symlink d2/d43/la5 0
2026-03-10T06:23:00.648 INFO:tasks.workunit.client.0.vm04.stdout:6/511: mknod d2/d8/ca6 0
2026-03-10T06:23:00.654 INFO:tasks.workunit.client.0.vm04.stdout:9/550: unlink d2/d8/d14/c1b 0
2026-03-10T06:23:00.661 INFO:tasks.workunit.client.0.vm04.stdout:5/487: mknod d4/d11/d7d/d38/cac 0
2026-03-10T06:23:00.662 INFO:tasks.workunit.client.0.vm04.stdout:9/551: symlink d2/d23/d24/dd4/ld7 0
2026-03-10T06:23:00.667 INFO:tasks.workunit.client.0.vm04.stdout:5/488: fdatasync d4/d11/d7d/d52/f96 0
2026-03-10T06:23:00.672 INFO:tasks.workunit.client.0.vm04.stdout:7/523: dread d4/df/d12/d13/f1e [4194304,4194304] 0
2026-03-10T06:23:00.674 INFO:tasks.workunit.client.0.vm04.stdout:9/552: dwrite d2/d23/d24/da2/fc5 [0,4194304] 0
2026-03-10T06:23:00.675 INFO:tasks.workunit.client.0.vm04.stdout:5/489: dwrite d4/d11/d7d/d38/d91/d4c/fa1 [0,4194304] 0
2026-03-10T06:23:00.689 INFO:tasks.workunit.client.0.vm04.stdout:6/512: rename d2/d43/c20 to d2/d43/d2d/d30/d34/ca7 0
2026-03-10T06:23:00.691 INFO:tasks.workunit.client.0.vm04.stdout:9/553: creat d2/d8/d3a/fd8 x:0 0 0
2026-03-10T06:23:00.693 INFO:tasks.workunit.client.0.vm04.stdout:5/490: mknod d4/d6/d50/cad 0
2026-03-10T06:23:00.694 INFO:tasks.workunit.client.0.vm04.stdout:5/491: chown d4/d6/c1c 3 1
2026-03-10T06:23:00.694 INFO:tasks.workunit.client.0.vm04.stdout:1/512: write d0/d8/f67 [1015514,48192] 0
2026-03-10T06:23:00.695 INFO:tasks.workunit.client.0.vm04.stdout:5/492: stat d4/d6/d80/d84/d99 0
2026-03-10T06:23:00.698 INFO:tasks.workunit.client.0.vm04.stdout:6/513: mkdir d2/d43/d2d/d30/d34/da8 0
2026-03-10T06:23:00.698 INFO:tasks.workunit.client.0.vm04.stdout:7/524: dread d4/df/d12/d13/d25/d28/d36/f4d [0,4194304] 0
2026-03-10T06:23:00.698 INFO:tasks.workunit.client.0.vm04.stdout:6/514: readlink d2/l36 0
2026-03-10T06:23:00.699 INFO:tasks.workunit.client.0.vm04.stdout:9/554: mknod d2/d8/d53/d6e/d8d/cd9 0
2026-03-10T06:23:00.701 INFO:tasks.workunit.client.0.vm04.stdout:1/513: unlink d0/d3/d41/d99/lae 0
2026-03-10T06:23:00.701 INFO:tasks.workunit.client.0.vm04.stdout:1/514: write d0/d8/fab [1002339,92046] 0
2026-03-10T06:23:00.708 INFO:tasks.workunit.client.0.vm04.stdout:7/525: dwrite d4/f5 [4194304,4194304] 0
2026-03-10T06:23:00.713 INFO:tasks.workunit.client.0.vm04.stdout:9/555: stat d2/d23/d24/c96 0
2026-03-10T06:23:00.713 INFO:tasks.workunit.client.0.vm04.stdout:5/493: mkdir d4/d11/d7d/dae 0
2026-03-10T06:23:00.713 INFO:tasks.workunit.client.0.vm04.stdout:2/498: truncate d1/db/d69/f77 1465759 0
2026-03-10T06:23:00.730 INFO:tasks.workunit.client.0.vm04.stdout:1/515: symlink d0/d8/d46/db3/db4/lbf 0
2026-03-10T06:23:00.734 INFO:tasks.workunit.client.0.vm04.stdout:1/516: dwrite d0/d3/d41/d4b/d5b/fb6 [0,4194304] 0
2026-03-10T06:23:00.736 INFO:tasks.workunit.client.0.vm04.stdout:0/544: write d0/f4 [2879707,77955] 0
2026-03-10T06:23:00.738 INFO:tasks.workunit.client.0.vm04.stdout:5/494: fsync d4/d6/f47 0
2026-03-10T06:23:00.738 INFO:tasks.workunit.client.0.vm04.stdout:0/545: dread - d0/d5/d25/dd/d3a/d81/f90 zero size
2026-03-10T06:23:00.738 INFO:tasks.workunit.client.0.vm04.stdout:7/526: symlink d4/df/d12/d13/db3/lc9 0
2026-03-10T06:23:00.745 INFO:tasks.workunit.client.0.vm04.stdout:8/535: write fe [1458149,109106] 0
2026-03-10T06:23:00.746 INFO:tasks.workunit.client.0.vm04.stdout:8/536: write df/d20/f5e [796352,79482] 0
2026-03-10T06:23:00.748 INFO:tasks.workunit.client.0.vm04.stdout:6/515: getdents d2/d37 0
2026-03-10T06:23:00.750 INFO:tasks.workunit.client.0.vm04.stdout:5/495: unlink d4/d6/la2 0
2026-03-10T06:23:00.750 INFO:tasks.workunit.client.0.vm04.stdout:8/537: dread df/f40 [0,4194304] 0
2026-03-10T06:23:00.750 INFO:tasks.workunit.client.0.vm04.stdout:8/538: readlink df/d20/d25/l85 0
2026-03-10T06:23:00.751 INFO:tasks.workunit.client.0.vm04.stdout:8/539: chown df/d20/d25/d30/f4e 2329915 1
2026-03-10T06:23:00.753 INFO:tasks.workunit.client.0.vm04.stdout:7/527: symlink d4/df/d12/d13/d25/d28/d3a/db0/lca 0
2026-03-10T06:23:00.758 INFO:tasks.workunit.client.0.vm04.stdout:6/516: fdatasync d2/f10 0
2026-03-10T06:23:00.758 INFO:tasks.workunit.client.0.vm04.stdout:6/517: fdatasync d2/d3a/d5e/f99 0
2026-03-10T06:23:00.769 INFO:tasks.workunit.client.0.vm04.stdout:0/546: dread d0/d1a/d20/d38/d31/d47/f89 [0,4194304] 0
2026-03-10T06:23:00.770 INFO:tasks.workunit.client.0.vm04.stdout:1/517: rename d0/d8/c3d to d0/d3/cc0 0
2026-03-10T06:23:00.771 INFO:tasks.workunit.client.0.vm04.stdout:1/518: chown d0/d3/d41/c4c 253 1
2026-03-10T06:23:00.771 INFO:tasks.workunit.client.0.vm04.stdout:1/519: write d0/d8/d46/d7a/fbc [539297,118511] 0
2026-03-10T06:23:00.774 INFO:tasks.workunit.client.0.vm04.stdout:6/518: creat d2/d37/d6e/fa9 x:0 0 0
2026-03-10T06:23:00.777 INFO:tasks.workunit.client.0.vm04.stdout:8/540: symlink df/d15/d29/da3/la5 0
2026-03-10T06:23:00.780 INFO:tasks.workunit.client.0.vm04.stdout:8/541: dwrite df/d15/f45 [0,4194304] 0
2026-03-10T06:23:00.781 INFO:tasks.workunit.client.0.vm04.stdout:8/542: fdatasync df/d15/d2b/f2f 0
2026-03-10T06:23:00.782 INFO:tasks.workunit.client.0.vm04.stdout:7/528: sync
2026-03-10T06:23:00.782 INFO:tasks.workunit.client.0.vm04.stdout:1/520: sync
2026-03-10T06:23:00.786 INFO:tasks.workunit.client.0.vm04.stdout:1/521: dwrite d0/d8/f38 [0,4194304] 0
2026-03-10T06:23:00.790 INFO:tasks.workunit.client.0.vm04.stdout:6/519: mkdir d2/d43/d2d/d7c/daa 0
2026-03-10T06:23:00.796 INFO:tasks.workunit.client.0.vm04.stdout:6/520: dwrite d2/d43/d2d/d30/d34/f4d [0,4194304] 0
2026-03-10T06:23:00.798 INFO:tasks.workunit.client.0.vm04.stdout:3/505: dwrite d4/d6/d99/d7b/f47 [0,4194304] 0
2026-03-10T06:23:00.822 INFO:tasks.workunit.client.0.vm04.stdout:1/522: symlink d0/d8/d46/db3/db4/lc1 0
2026-03-10T06:23:00.836 INFO:tasks.workunit.client.0.vm04.stdout:6/521: rmdir d2 39
2026-03-10T06:23:00.838 INFO:tasks.workunit.client.0.vm04.stdout:3/506: mknod d4/d6/d99/d7b/d21/d2c/cab 0
2026-03-10T06:23:00.839 INFO:tasks.workunit.client.0.vm04.stdout:9/556: dwrite d2/d8/f4a [0,4194304] 0
2026-03-10T06:23:00.842 INFO:tasks.workunit.client.0.vm04.stdout:9/557: chown d2/d8/d14/d1d/f78 699605898 1
2026-03-10T06:23:00.843 INFO:tasks.workunit.client.0.vm04.stdout:7/529: mkdir d4/df/d12/d13/d25/dcb 0
2026-03-10T06:23:00.844 INFO:tasks.workunit.client.0.vm04.stdout:0/547: truncate d0/d5/f41 3372083 0
2026-03-10T06:23:00.845 INFO:tasks.workunit.client.0.vm04.stdout:9/558: dread - d2/fb4 zero size
2026-03-10T06:23:00.845 INFO:tasks.workunit.client.0.vm04.stdout:2/499: truncate d1/df/d2c/f58 1233276 0
2026-03-10T06:23:00.850 INFO:tasks.workunit.client.0.vm04.stdout:3/507: dread d4/d6/d99/d7b/f27 [0,4194304] 0
2026-03-10T06:23:00.850 INFO:tasks.workunit.client.0.vm04.stdout:0/548: chown d0/d1a/d20/f85 2 1
2026-03-10T06:23:00.851 INFO:tasks.workunit.client.0.vm04.stdout:0/549: chown d0/d1a/d20/d38/d31/d47/f89 268978 1
2026-03-10T06:23:00.851 INFO:tasks.workunit.client.0.vm04.stdout:1/523: mkdir d0/d3/d41/dc2 0
2026-03-10T06:23:00.864 INFO:tasks.workunit.client.0.vm04.stdout:4/533: write d2/d32/d5c/f4b [2666365,78658] 0
2026-03-10T06:23:00.877 INFO:tasks.workunit.client.0.vm04.stdout:9/559: mknod d2/d3/d18/d39/d11/cda 0
2026-03-10T06:23:00.887 INFO:tasks.workunit.client.0.vm04.stdout:9/560: chown d2/d3/d18/d34/f5f 779 1
2026-03-10T06:23:00.887 INFO:tasks.workunit.client.0.vm04.stdout:3/508: creat d4/d6/d99/d7b/d21/d2c/fac x:0 0 0
2026-03-10T06:23:00.887 INFO:tasks.workunit.client.0.vm04.stdout:3/509: stat d4/d6/d54 0
2026-03-10T06:23:00.888 INFO:tasks.workunit.client.0.vm04.stdout:7/530: creat d4/df/d12/d13/d25/d28/d3a/d58/fcc x:0 0 0
2026-03-10T06:23:00.888 INFO:tasks.workunit.client.0.vm04.stdout:3/510: dwrite d4/d6/d99/d7b/d21/f9a [0,4194304] 0
2026-03-10T06:23:00.892 INFO:tasks.workunit.client.0.vm04.stdout:6/522: creat d2/d43/d2d/d30/d34/d76/d8a/fab x:0 0 0
2026-03-10T06:23:00.906 INFO:tasks.workunit.client.0.vm04.stdout:5/496: write d4/d11/f32 [428505,11862] 0
2026-03-10T06:23:00.925 INFO:tasks.workunit.client.0.vm04.stdout:8/543: getdents df/d20 0
2026-03-10T06:23:00.926 INFO:tasks.workunit.client.0.vm04.stdout:2/500: symlink d1/df/d11/l98 0
2026-03-10T06:23:00.926 INFO:tasks.workunit.client.0.vm04.stdout:2/501: chown d1/df/d11/d14/d4e/l85 427274964 1
2026-03-10T06:23:00.927 INFO:tasks.workunit.client.0.vm04.stdout:9/561: unlink d2/d3/d18/d39/d46/d55/f88 0
2026-03-10T06:23:00.937 INFO:tasks.workunit.client.0.vm04.stdout:4/534: link d2/d16/d2c/d6b/f9e d2/d46/fa8 0
2026-03-10T06:23:00.943 INFO:tasks.workunit.client.0.vm04.stdout:4/535: dread d2/f4c [0,4194304] 0
2026-03-10T06:23:00.953 INFO:tasks.workunit.client.0.vm04.stdout:5/497: fdatasync d4/d6/f47 0
2026-03-10T06:23:00.954 INFO:tasks.workunit.client.0.vm04.stdout:5/498: chown d4/d11/d7d/fa6 39869 1
2026-03-10T06:23:00.956 INFO:tasks.workunit.client.0.vm04.stdout:5/499: dread d4/d11/d7d/d38/d91/d4c/f83 [0,4194304] 0
2026-03-10T06:23:00.960 INFO:tasks.workunit.client.0.vm04.stdout:3/511: dwrite d4/d6/d99/f76 [0,4194304] 0
2026-03-10T06:23:00.968 INFO:tasks.workunit.client.0.vm04.stdout:8/544: rename df/d20/d25/l85 to df/d20/d25/d30/d70/la6 0
2026-03-10T06:23:00.973 INFO:tasks.workunit.client.0.vm04.stdout:9/562: truncate d2/d3/d18/d34/f47 2242632 0
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:0/550: creat d0/d1a/d20/fb9 x:0 0 0
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:0/551: fdatasync d0/f16 0
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:1/524: link d0/d3/d41/d4b/d5b/c89 d0/d3/d41/cc3 0
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:1/525: chown d0/d8/d46/d7a/cbe 8473994 1
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:1/526: readlink d0/d3/d41/d4b/d5b/l7e 0
2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:4/536: rmdir d2/d32/d5c/d4f 39 2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:4/537: chown d2/d46/f15 209 1 2026-03-10T06:23:00.982 INFO:tasks.workunit.client.0.vm04.stdout:6/523: symlink d2/d43/d2d/d7c/daa/lac 0 2026-03-10T06:23:00.986 INFO:tasks.workunit.client.0.vm04.stdout:2/502: write d1/db/d20/d8f/f53 [189751,37271] 0 2026-03-10T06:23:00.990 INFO:tasks.workunit.client.0.vm04.stdout:7/531: dwrite d4/df/d12/d13/d25/d30/d40/f52 [0,4194304] 0 2026-03-10T06:23:00.994 INFO:tasks.workunit.client.0.vm04.stdout:3/512: creat d4/d6/d91/fad x:0 0 0 2026-03-10T06:23:00.996 INFO:tasks.workunit.client.0.vm04.stdout:3/513: dread d4/d6/d99/f76 [0,4194304] 0 2026-03-10T06:23:01.004 INFO:tasks.workunit.client.0.vm04.stdout:8/545: rename df/d20/d25/f93 to df/d15/d29/da3/fa7 0 2026-03-10T06:23:01.011 INFO:tasks.workunit.client.0.vm04.stdout:0/552: chown d0/d5/f41 36714364 1 2026-03-10T06:23:01.019 INFO:tasks.workunit.client.0.vm04.stdout:4/538: chown d2/d16/d2c/d6b/c83 286 1 2026-03-10T06:23:01.022 INFO:tasks.workunit.client.0.vm04.stdout:9/563: write d2/d8/f66 [4678916,32717] 0 2026-03-10T06:23:01.022 INFO:tasks.workunit.client.0.vm04.stdout:9/564: write d2/d8/d22/fd1 [476387,99417] 0 2026-03-10T06:23:01.023 INFO:tasks.workunit.client.0.vm04.stdout:1/527: write d0/d8/d46/d7a/f84 [587928,58984] 0 2026-03-10T06:23:01.032 INFO:tasks.workunit.client.0.vm04.stdout:6/524: dwrite d2/d43/d2d/d30/f39 [0,4194304] 0 2026-03-10T06:23:01.076 INFO:tasks.workunit.client.0.vm04.stdout:2/503: write d1/df/d2c/d37/f52 [1711717,59582] 0 2026-03-10T06:23:01.079 INFO:tasks.workunit.client.0.vm04.stdout:7/532: stat d4/df/d12/d34/l92 0 2026-03-10T06:23:01.088 INFO:tasks.workunit.client.0.vm04.stdout:5/500: creat d4/d11/d7d/d38/d91/d4c/d98/faf x:0 0 0 2026-03-10T06:23:01.088 INFO:tasks.workunit.client.0.vm04.stdout:5/501: read - d4/d11/d7d/d38/f8b zero size 2026-03-10T06:23:01.098 
INFO:tasks.workunit.client.0.vm04.stdout:3/514: write d4/da/df/d11/d5a/d5b/fa3 [593332,77392] 0 2026-03-10T06:23:01.106 INFO:tasks.workunit.client.0.vm04.stdout:8/546: rename l0 to df/d20/d25/d30/d65/d8f/la8 0 2026-03-10T06:23:01.120 INFO:tasks.workunit.client.0.vm04.stdout:4/539: dwrite d2/d16/f9d [0,4194304] 0 2026-03-10T06:23:01.124 INFO:tasks.workunit.client.0.vm04.stdout:4/540: dwrite d2/d46/f5d [0,4194304] 0 2026-03-10T06:23:01.131 INFO:tasks.workunit.client.0.vm04.stdout:9/565: symlink d2/d8/d14/da3/ldb 0 2026-03-10T06:23:01.132 INFO:tasks.workunit.client.0.vm04.stdout:9/566: truncate d2/d8/d14/d1d/d64/d73/f9a 230787 0 2026-03-10T06:23:01.134 INFO:tasks.workunit.client.0.vm04.stdout:4/541: dread d2/d16/d31/d3f/f8f [0,4194304] 0 2026-03-10T06:23:01.141 INFO:tasks.workunit.client.0.vm04.stdout:6/525: mknod d2/d8/cad 0 2026-03-10T06:23:01.145 INFO:tasks.workunit.client.0.vm04.stdout:2/504: dread - d1/df/d2c/f33 zero size 2026-03-10T06:23:01.149 INFO:tasks.workunit.client.0.vm04.stdout:8/547: creat df/d20/d25/d73/fa9 x:0 0 0 2026-03-10T06:23:01.149 INFO:tasks.workunit.client.0.vm04.stdout:8/548: fsync df/f17 0 2026-03-10T06:23:01.159 INFO:tasks.workunit.client.0.vm04.stdout:1/528: dwrite d0/d8/f6c [0,4194304] 0 2026-03-10T06:23:01.159 INFO:tasks.workunit.client.0.vm04.stdout:1/529: read - d0/d3/d41/fb8 zero size 2026-03-10T06:23:01.174 INFO:tasks.workunit.client.0.vm04.stdout:0/553: write d0/d5/d25/dd/d5c/fb5 [754635,58840] 0 2026-03-10T06:23:01.179 INFO:tasks.workunit.client.0.vm04.stdout:4/542: creat d2/d32/fa9 x:0 0 0 2026-03-10T06:23:01.182 INFO:tasks.workunit.client.0.vm04.stdout:6/526: mkdir d2/d43/d2d/d30/d34/dae 0 2026-03-10T06:23:01.182 INFO:tasks.workunit.client.0.vm04.stdout:6/527: write d2/d3a/f57 [2194852,33404] 0 2026-03-10T06:23:01.183 INFO:tasks.workunit.client.0.vm04.stdout:6/528: chown d2/d43/d2d/d30 2088406 1 2026-03-10T06:23:01.192 INFO:tasks.workunit.client.0.vm04.stdout:2/505: unlink d1/f2b 0 2026-03-10T06:23:01.200 
INFO:tasks.workunit.client.0.vm04.stdout:3/515: truncate d4/d6/d99/d7b/d21/f9a 2703732 0 2026-03-10T06:23:01.207 INFO:tasks.workunit.client.0.vm04.stdout:1/530: dread d0/d8/f11 [0,4194304] 0 2026-03-10T06:23:01.208 INFO:tasks.workunit.client.0.vm04.stdout:1/531: stat d0/d8/d46/d7a/fa8 0 2026-03-10T06:23:01.209 INFO:tasks.workunit.client.0.vm04.stdout:0/554: symlink d0/d1a/d20/d38/d31/d47/d8a/lba 0 2026-03-10T06:23:01.209 INFO:tasks.workunit.client.0.vm04.stdout:1/532: dread d0/d8/d46/f57 [0,4194304] 0 2026-03-10T06:23:01.210 INFO:tasks.workunit.client.0.vm04.stdout:9/567: symlink d2/d8/d22/d4f/ldc 0 2026-03-10T06:23:01.211 INFO:tasks.workunit.client.0.vm04.stdout:0/555: read d0/d5/d25/dd/f13 [21903,41789] 0 2026-03-10T06:23:01.213 INFO:tasks.workunit.client.0.vm04.stdout:6/529: symlink d2/d43/d2d/d30/d34/laf 0 2026-03-10T06:23:01.213 INFO:tasks.workunit.client.0.vm04.stdout:6/530: fsync d2/d43/f31 0 2026-03-10T06:23:01.214 INFO:tasks.workunit.client.0.vm04.stdout:7/533: creat d4/df/d12/fcd x:0 0 0 2026-03-10T06:23:01.215 INFO:tasks.workunit.client.0.vm04.stdout:5/502: rename d4/d11/d7d/d38/d91/d4c/f83 to d4/fb0 0 2026-03-10T06:23:01.217 INFO:tasks.workunit.client.0.vm04.stdout:8/549: link df/d15/d2b/d81/f9d df/d15/d29/da3/faa 0 2026-03-10T06:23:01.220 INFO:tasks.workunit.client.0.vm04.stdout:7/534: unlink d4/df/d12/d13/d25/f4b 0 2026-03-10T06:23:01.226 INFO:tasks.workunit.client.0.vm04.stdout:6/531: fsync d2/d43/f24 0 2026-03-10T06:23:01.231 INFO:tasks.workunit.client.0.vm04.stdout:2/506: rename d1/df/d2c/d37/l55 to d1/db/d69/d74/l99 0 2026-03-10T06:23:01.232 INFO:tasks.workunit.client.0.vm04.stdout:3/516: sync 2026-03-10T06:23:01.237 INFO:tasks.workunit.client.0.vm04.stdout:8/550: mkdir df/d15/d2b/d8a/dab 0 2026-03-10T06:23:01.240 INFO:tasks.workunit.client.0.vm04.stdout:1/533: link d0/d3/d41/ca0 d0/d3/d41/dc2/cc4 0 2026-03-10T06:23:01.243 INFO:tasks.workunit.client.0.vm04.stdout:4/543: getdents d2/d46 0 2026-03-10T06:23:01.248 
INFO:tasks.workunit.client.0.vm04.stdout:4/544: dwrite d2/d16/d2c/f9b [0,4194304] 0 2026-03-10T06:23:01.257 INFO:tasks.workunit.client.0.vm04.stdout:7/535: mknod d4/df/d12/d13/d25/d28/dae/cce 0 2026-03-10T06:23:01.260 INFO:tasks.workunit.client.0.vm04.stdout:7/536: dwrite d4/df/f56 [4194304,4194304] 0 2026-03-10T06:23:01.268 INFO:tasks.workunit.client.0.vm04.stdout:0/556: write d0/d5/f1f [4534823,114292] 0 2026-03-10T06:23:01.269 INFO:tasks.workunit.client.0.vm04.stdout:3/517: mknod d4/d6/d99/d7b/d89/cae 0 2026-03-10T06:23:01.270 INFO:tasks.workunit.client.0.vm04.stdout:3/518: chown d4/d6/d38/f78 2037 1 2026-03-10T06:23:01.276 INFO:tasks.workunit.client.0.vm04.stdout:5/503: dwrite d4/d11/f1f [4194304,4194304] 0 2026-03-10T06:23:01.276 INFO:tasks.workunit.client.0.vm04.stdout:3/519: stat d4/d6/d92/l9b 0 2026-03-10T06:23:01.294 INFO:tasks.workunit.client.0.vm04.stdout:0/557: fdatasync d0/f75 0 2026-03-10T06:23:01.297 INFO:tasks.workunit.client.0.vm04.stdout:3/520: dwrite d4/f42 [0,4194304] 0 2026-03-10T06:23:01.299 INFO:tasks.workunit.client.0.vm04.stdout:5/504: mkdir d4/d11/d7d/d38/d91/d55/db1 0 2026-03-10T06:23:01.303 INFO:tasks.workunit.client.0.vm04.stdout:9/568: rename d2/d3/d18/d39/d11/d42 to d2/d3/d18/ddd 0 2026-03-10T06:23:01.311 INFO:tasks.workunit.client.0.vm04.stdout:0/558: chown d0/d5/d25/dd/d5c/d73/c9e 936580622 1 2026-03-10T06:23:01.311 INFO:tasks.workunit.client.0.vm04.stdout:3/521: mknod d4/d6/dc/caf 0 2026-03-10T06:23:01.317 INFO:tasks.workunit.client.0.vm04.stdout:2/507: getdents d1/d76 0 2026-03-10T06:23:01.319 INFO:tasks.workunit.client.0.vm04.stdout:2/508: chown d1/df/d2c/d37/c90 26774493 1 2026-03-10T06:23:01.320 INFO:tasks.workunit.client.0.vm04.stdout:2/509: dwrite d1/f57 [0,4194304] 0 2026-03-10T06:23:01.329 INFO:tasks.workunit.client.0.vm04.stdout:0/559: unlink d0/d5/d25/c35 0 2026-03-10T06:23:01.332 INFO:tasks.workunit.client.0.vm04.stdout:4/545: getdents d2/d16/d31/d3f 0 2026-03-10T06:23:01.334 
INFO:tasks.workunit.client.0.vm04.stdout:2/510: creat d1/db/d20/d8f/d35/d54/f9a x:0 0 0 2026-03-10T06:23:01.337 INFO:tasks.workunit.client.0.vm04.stdout:2/511: dwrite d1/db/d20/d8f/f53 [0,4194304] 0 2026-03-10T06:23:01.342 INFO:tasks.workunit.client.0.vm04.stdout:0/560: unlink d0/d5/d25/dd/d3a/d56/c58 0 2026-03-10T06:23:01.346 INFO:tasks.workunit.client.0.vm04.stdout:0/561: fdatasync d0/d5/d25/dd/d3a/d56/f84 0 2026-03-10T06:23:01.347 INFO:tasks.workunit.client.0.vm04.stdout:5/505: link d4/f35 d4/d11/d7d/dae/fb2 0 2026-03-10T06:23:01.347 INFO:tasks.workunit.client.0.vm04.stdout:5/506: write d4/f19 [575000,41479] 0 2026-03-10T06:23:01.347 INFO:tasks.workunit.client.0.vm04.stdout:3/522: sync 2026-03-10T06:23:01.351 INFO:tasks.workunit.client.0.vm04.stdout:4/546: dread d2/d8/f9f [0,4194304] 0 2026-03-10T06:23:01.356 INFO:tasks.workunit.client.0.vm04.stdout:8/551: write df/d15/f1e [1221866,54590] 0 2026-03-10T06:23:01.359 INFO:tasks.workunit.client.0.vm04.stdout:1/534: dwrite d0/d8/d46/f57 [0,4194304] 0 2026-03-10T06:23:01.367 INFO:tasks.workunit.client.0.vm04.stdout:5/507: creat d4/d6/d80/d84/d99/fb3 x:0 0 0 2026-03-10T06:23:01.372 INFO:tasks.workunit.client.0.vm04.stdout:4/547: symlink d2/d32/d94/laa 0 2026-03-10T06:23:01.381 INFO:tasks.workunit.client.0.vm04.stdout:4/548: chown d2/d32/d5c/d76/l8c 261382 1 2026-03-10T06:23:01.382 INFO:tasks.workunit.client.0.vm04.stdout:9/569: getdents d2/d23 0 2026-03-10T06:23:01.382 INFO:tasks.workunit.client.0.vm04.stdout:2/512: unlink d1/df/d2c/l78 0 2026-03-10T06:23:01.382 INFO:tasks.workunit.client.0.vm04.stdout:2/513: chown d1/df/d11/d14/d6a 4 1 2026-03-10T06:23:01.382 INFO:tasks.workunit.client.0.vm04.stdout:8/552: rmdir df/d15/d2b 39 2026-03-10T06:23:01.384 INFO:tasks.workunit.client.0.vm04.stdout:0/562: mknod d0/d5/cbb 0 2026-03-10T06:23:01.385 INFO:tasks.workunit.client.0.vm04.stdout:0/563: stat d0/d5/d25/dd/d1d/d59 0 2026-03-10T06:23:01.386 INFO:tasks.workunit.client.0.vm04.stdout:0/564: fsync d0/d5/d25/dd/d5c/fb2 0 
2026-03-10T06:23:01.386 INFO:tasks.workunit.client.0.vm04.stdout:0/565: chown d0/c39 1784218 1 2026-03-10T06:23:01.395 INFO:tasks.workunit.client.0.vm04.stdout:4/549: write d2/d16/d2c/f9b [4275468,30980] 0 2026-03-10T06:23:01.395 INFO:tasks.workunit.client.0.vm04.stdout:4/550: chown d2/d46/f18 210861920 1 2026-03-10T06:23:01.395 INFO:tasks.workunit.client.0.vm04.stdout:6/532: write d2/d43/d2d/d30/d34/d76/d8a/f9e [4133416,84305] 0 2026-03-10T06:23:01.395 INFO:tasks.workunit.client.0.vm04.stdout:6/533: dread - d2/d43/d2d/d30/d1f/f87 zero size 2026-03-10T06:23:01.395 INFO:tasks.workunit.client.0.vm04.stdout:7/537: dwrite d4/df/d12/d13/d25/d28/d3a/f73 [0,4194304] 0 2026-03-10T06:23:01.396 INFO:tasks.workunit.client.0.vm04.stdout:9/570: sync 2026-03-10T06:23:01.397 INFO:tasks.workunit.client.0.vm04.stdout:9/571: readlink d2/d23/d24/da9/l7a 0 2026-03-10T06:23:01.397 INFO:tasks.workunit.client.0.vm04.stdout:9/572: write d2/d8/fb8 [1039370,119691] 0 2026-03-10T06:23:01.400 INFO:tasks.workunit.client.0.vm04.stdout:9/573: dwrite d2/d8/d22/fcf [0,4194304] 0 2026-03-10T06:23:01.402 INFO:tasks.workunit.client.0.vm04.stdout:9/574: dread - d2/d3/d18/d39/d46/fc4 zero size 2026-03-10T06:23:01.408 INFO:tasks.workunit.client.0.vm04.stdout:9/575: dwrite d2/d3/f12 [4194304,4194304] 0 2026-03-10T06:23:01.413 INFO:tasks.workunit.client.0.vm04.stdout:2/514: unlink d1/db/d20/d8f/d35/c81 0 2026-03-10T06:23:01.415 INFO:tasks.workunit.client.0.vm04.stdout:0/566: rename d0/d5/d25/dd/d5c/d73/c9e to d0/d5/d25/cbc 0 2026-03-10T06:23:01.416 INFO:tasks.workunit.client.0.vm04.stdout:6/534: mknod d2/d37/cb0 0 2026-03-10T06:23:01.421 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:01 vm04.local ceph-mon[51058]: pgmap v29: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 40 MiB/s rd, 131 MiB/s wr, 284 op/s 2026-03-10T06:23:01.423 INFO:tasks.workunit.client.0.vm04.stdout:5/508: creat d4/d11/d7d/d38/d91/d55/db1/fb4 x:0 0 0 2026-03-10T06:23:01.434 
INFO:tasks.workunit.client.0.vm04.stdout:7/538: creat d4/df/d12/d13/db3/fcf x:0 0 0 2026-03-10T06:23:01.449 INFO:tasks.workunit.client.0.vm04.stdout:9/576: mknod d2/d23/d24/da9/cde 0 2026-03-10T06:23:01.451 INFO:tasks.workunit.client.0.vm04.stdout:4/551: mknod d2/d32/d5c/d98/cab 0 2026-03-10T06:23:01.460 INFO:tasks.workunit.client.0.vm04.stdout:6/535: write d2/d43/d2d/d30/f60 [720761,123911] 0 2026-03-10T06:23:01.460 INFO:tasks.workunit.client.0.vm04.stdout:7/539: mknod d4/df/d12/d34/d63/cd0 0 2026-03-10T06:23:01.460 INFO:tasks.workunit.client.0.vm04.stdout:3/523: getdents d4/da 0 2026-03-10T06:23:01.461 INFO:tasks.workunit.client.0.vm04.stdout:9/577: creat d2/d8/d53/d6e/d8d/fdf x:0 0 0 2026-03-10T06:23:01.462 INFO:tasks.workunit.client.0.vm04.stdout:1/535: getdents d0/d8/d46/d7a/d95 0 2026-03-10T06:23:01.464 INFO:tasks.workunit.client.0.vm04.stdout:2/515: dread d1/df/d11/f29 [0,4194304] 0 2026-03-10T06:23:01.467 INFO:tasks.workunit.client.0.vm04.stdout:4/552: mknod d2/d32/d5c/cac 0 2026-03-10T06:23:01.467 INFO:tasks.workunit.client.0.vm04.stdout:6/536: mknod d2/d43/d9b/cb1 0 2026-03-10T06:23:01.468 INFO:tasks.workunit.client.0.vm04.stdout:1/536: mkdir d0/d8/d46/d7a/d95/dc5 0 2026-03-10T06:23:01.469 INFO:tasks.workunit.client.0.vm04.stdout:8/553: getdents df/d15/d2b 0 2026-03-10T06:23:01.473 INFO:tasks.workunit.client.0.vm04.stdout:4/553: mkdir d2/d32/dad 0 2026-03-10T06:23:01.474 INFO:tasks.workunit.client.0.vm04.stdout:1/537: creat d0/d8/d46/db3/fc6 x:0 0 0 2026-03-10T06:23:01.474 INFO:tasks.workunit.client.0.vm04.stdout:2/516: mkdir d1/db/d9b 0 2026-03-10T06:23:01.476 INFO:tasks.workunit.client.0.vm04.stdout:2/517: write d1/df/f63 [3645607,39849] 0 2026-03-10T06:23:01.476 INFO:tasks.workunit.client.0.vm04.stdout:8/554: readlink df/d20/d25/d30/d65/d8f/la8 0 2026-03-10T06:23:01.485 INFO:tasks.workunit.client.0.vm04.stdout:9/578: rename d2/d8/d14 to d2/de0 0 2026-03-10T06:23:01.488 INFO:tasks.workunit.client.0.vm04.stdout:9/579: dwrite d2/de0/f40 [0,4194304] 0 
2026-03-10T06:23:01.499 INFO:tasks.workunit.client.0.vm04.stdout:5/509: write d4/f21 [4739912,75060] 0 2026-03-10T06:23:01.514 INFO:tasks.workunit.client.0.vm04.stdout:3/524: getdents d4/da/df/d11 0 2026-03-10T06:23:01.522 INFO:tasks.workunit.client.0.vm04.stdout:7/540: rename d4/df/d12/d13/d25/d28/l7b to d4/df/d12/d13/d8b/ld1 0 2026-03-10T06:23:01.523 INFO:tasks.workunit.client.0.vm04.stdout:9/580: sync 2026-03-10T06:23:01.526 INFO:tasks.workunit.client.0.vm04.stdout:0/567: getdents d0/d5/d25/dd 0 2026-03-10T06:23:01.527 INFO:tasks.workunit.client.0.vm04.stdout:0/568: write d0/d1a/d20/d38/f78 [780043,77920] 0 2026-03-10T06:23:01.534 INFO:tasks.workunit.client.0.vm04.stdout:7/541: dread d4/df/d12/f1c [4194304,4194304] 0 2026-03-10T06:23:01.539 INFO:tasks.workunit.client.0.vm04.stdout:2/518: dwrite d1/df/d11/f7e [0,4194304] 0 2026-03-10T06:23:01.539 INFO:tasks.workunit.client.0.vm04.stdout:1/538: dwrite d0/d8/f32 [8388608,4194304] 0 2026-03-10T06:23:01.555 INFO:tasks.workunit.client.0.vm04.stdout:4/554: creat d2/d32/dad/fae x:0 0 0 2026-03-10T06:23:01.556 INFO:tasks.workunit.client.0.vm04.stdout:6/537: link d2/d43/d2d/d30/d1f/c66 d2/d43/d2d/d7c/cb2 0 2026-03-10T06:23:01.557 INFO:tasks.workunit.client.0.vm04.stdout:6/538: write d2/d43/d2d/d30/f39 [1375065,14958] 0 2026-03-10T06:23:01.557 INFO:tasks.workunit.client.0.vm04.stdout:6/539: truncate d2/d43/f31 2930667 0 2026-03-10T06:23:01.559 INFO:tasks.workunit.client.0.vm04.stdout:3/525: rmdir d4/da 39 2026-03-10T06:23:01.562 INFO:tasks.workunit.client.0.vm04.stdout:8/555: fdatasync df/d20/d25/d30/d65/d8f/fa2 0 2026-03-10T06:23:01.571 INFO:tasks.workunit.client.0.vm04.stdout:8/556: readlink df/d15/l3d 0 2026-03-10T06:23:01.571 INFO:tasks.workunit.client.0.vm04.stdout:5/510: mknod d4/cb5 0 2026-03-10T06:23:01.571 INFO:tasks.workunit.client.0.vm04.stdout:9/581: dread - d2/de0/d1d/d64/f91 zero size 2026-03-10T06:23:01.571 INFO:tasks.workunit.client.0.vm04.stdout:9/582: write d2/d8/f4a [46675,93661] 0 
2026-03-10T06:23:01.571 INFO:tasks.workunit.client.0.vm04.stdout:0/569: rmdir d0/d1a/d20/d38/d31/d47 39 2026-03-10T06:23:01.576 INFO:tasks.workunit.client.0.vm04.stdout:7/542: dwrite d4/df/d12/d13/d25/d30/d40/d50/f62 [0,4194304] 0 2026-03-10T06:23:01.583 INFO:tasks.workunit.client.0.vm04.stdout:2/519: unlink d1/df/d11/f29 0 2026-03-10T06:23:01.588 INFO:tasks.workunit.client.0.vm04.stdout:6/540: unlink d2/d43/d2d/d30/d1f/c21 0 2026-03-10T06:23:01.589 INFO:tasks.workunit.client.0.vm04.stdout:3/526: write d4/da/df/d11/d5a/d5b/f98 [74607,22512] 0 2026-03-10T06:23:01.596 INFO:tasks.workunit.client.0.vm04.stdout:0/570: mknod d0/d5/d25/dd/d3a/d81/cbd 0 2026-03-10T06:23:01.602 INFO:tasks.workunit.client.0.vm04.stdout:4/555: mknod d2/d32/d5c/d4f/d51/caf 0 2026-03-10T06:23:01.602 INFO:tasks.workunit.client.0.vm04.stdout:4/556: readlink d2/d16/d31/d3f/l48 0 2026-03-10T06:23:01.603 INFO:tasks.workunit.client.0.vm04.stdout:6/541: creat d2/d3a/d5e/fb3 x:0 0 0 2026-03-10T06:23:01.604 INFO:tasks.workunit.client.0.vm04.stdout:6/542: chown d2/d43/d2d/d30/d1f/d3c 10311 1 2026-03-10T06:23:01.607 INFO:tasks.workunit.client.0.vm04.stdout:3/527: rename d4/d6/d99/d7b/f27 to d4/d6/d99/d7b/d21/d32/d4e/d8f/fb0 0 2026-03-10T06:23:01.608 INFO:tasks.workunit.client.0.vm04.stdout:5/511: mkdir d4/d6/d81/db6 0 2026-03-10T06:23:01.611 INFO:tasks.workunit.client.0.vm04.stdout:9/583: mknod d2/d3/d18/ce1 0 2026-03-10T06:23:01.613 INFO:tasks.workunit.client.0.vm04.stdout:7/543: mkdir d4/df/d12/d13/d25/dcb/dd2 0 2026-03-10T06:23:01.616 INFO:tasks.workunit.client.0.vm04.stdout:4/557: rmdir d2/d16/d31/d3f/d93 39 2026-03-10T06:23:01.616 INFO:tasks.workunit.client.0.vm04.stdout:0/571: write d0/d5/d25/dd/d5c/f9a [475980,118748] 0 2026-03-10T06:23:01.616 INFO:tasks.workunit.client.0.vm04.stdout:4/558: chown d2/d16 1 1 2026-03-10T06:23:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:01 vm06.local ceph-mon[58974]: pgmap v29: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB 
avail; 40 MiB/s rd, 131 MiB/s wr, 284 op/s 2026-03-10T06:23:01.618 INFO:tasks.workunit.client.0.vm04.stdout:4/559: dread d2/d16/d2c/f9b [0,4194304] 0 2026-03-10T06:23:01.622 INFO:tasks.workunit.client.0.vm04.stdout:6/543: rename d2/d43/d2d/d30/d34/d76/d8a/f94 to d2/d43/d2d/d30/d34/da8/fb4 0 2026-03-10T06:23:01.623 INFO:tasks.workunit.client.0.vm04.stdout:6/544: chown d2/d43/d2d/d30/d34/dae 731052 1 2026-03-10T06:23:01.634 INFO:tasks.workunit.client.0.vm04.stdout:8/557: write df/d20/f28 [5458914,125490] 0 2026-03-10T06:23:01.641 INFO:tasks.workunit.client.0.vm04.stdout:1/539: truncate d0/d3/d41/d4b/d5b/f5c 724662 0 2026-03-10T06:23:01.641 INFO:tasks.workunit.client.0.vm04.stdout:1/540: chown d0/d8/f27 544506 1 2026-03-10T06:23:01.642 INFO:tasks.workunit.client.0.vm04.stdout:1/541: stat d0/l73 0 2026-03-10T06:23:01.642 INFO:tasks.workunit.client.0.vm04.stdout:1/542: fdatasync d0/d8/f38 0 2026-03-10T06:23:01.642 INFO:tasks.workunit.client.0.vm04.stdout:2/520: dread d1/df/f63 [0,4194304] 0 2026-03-10T06:23:01.644 INFO:tasks.workunit.client.0.vm04.stdout:9/584: creat d2/d3/d18/d34/fe2 x:0 0 0 2026-03-10T06:23:01.647 INFO:tasks.workunit.client.0.vm04.stdout:7/544: rmdir d4/df/d12/d13/d25/d8f 39 2026-03-10T06:23:01.653 INFO:tasks.workunit.client.0.vm04.stdout:0/572: fdatasync d0/d5/d25/dd/d5c/f7a 0 2026-03-10T06:23:01.658 INFO:tasks.workunit.client.0.vm04.stdout:0/573: dwrite d0/d5/d25/dd/d3a/d81/f90 [0,4194304] 0 2026-03-10T06:23:01.666 INFO:tasks.workunit.client.0.vm04.stdout:6/545: mkdir d2/d3a/d5e/db5 0 2026-03-10T06:23:01.669 INFO:tasks.workunit.client.0.vm04.stdout:8/558: mkdir df/d20/d25/d30/d70/dac 0 2026-03-10T06:23:01.672 INFO:tasks.workunit.client.0.vm04.stdout:1/543: mknod d0/d3/d41/cc7 0 2026-03-10T06:23:01.675 INFO:tasks.workunit.client.0.vm04.stdout:5/512: dwrite d4/d11/d7d/d38/d91/d55/f5d [0,4194304] 0 2026-03-10T06:23:01.681 INFO:tasks.workunit.client.0.vm04.stdout:6/546: dread d2/d43/d2d/d30/d34/d76/d7e/f81 [0,4194304] 0 2026-03-10T06:23:01.718 
INFO:tasks.workunit.client.0.vm04.stdout:0/574: chown d0/d1a/d20/d38/d31/d79/l7c 113567 1 2026-03-10T06:23:01.718 INFO:tasks.workunit.client.0.vm04.stdout:0/575: chown d0/d5/l93 3497 1 2026-03-10T06:23:01.722 INFO:tasks.workunit.client.0.vm04.stdout:0/576: dwrite d0/d5/d25/dd/d92/fa4 [0,4194304] 0 2026-03-10T06:23:01.723 INFO:tasks.workunit.client.0.vm04.stdout:0/577: stat d0/d5/d25/dd/d1d/fa2 0 2026-03-10T06:23:01.724 INFO:tasks.workunit.client.0.vm04.stdout:1/544: fsync d0/d3/d41/d4b/f6b 0 2026-03-10T06:23:01.725 INFO:tasks.workunit.client.0.vm04.stdout:0/578: write d0/d5/d25/dd/d5c/d73/fa5 [653620,23079] 0 2026-03-10T06:23:01.727 INFO:tasks.workunit.client.0.vm04.stdout:6/547: mkdir d2/d43/d2d/d30/d1f/db6 0 2026-03-10T06:23:01.731 INFO:tasks.workunit.client.0.vm04.stdout:0/579: dwrite d0/d1a/f27 [0,4194304] 0 2026-03-10T06:23:01.752 INFO:tasks.workunit.client.0.vm04.stdout:8/559: link df/d15/d29/f7a df/d20/d25/d30/d70/d97/d67/fad 0 2026-03-10T06:23:01.752 INFO:tasks.workunit.client.0.vm04.stdout:1/545: creat d0/d8/d46/d7a/d95/fc8 x:0 0 0 2026-03-10T06:23:01.752 INFO:tasks.workunit.client.0.vm04.stdout:8/560: chown df/d20/d25 1 1 2026-03-10T06:23:01.758 INFO:tasks.workunit.client.0.vm04.stdout:5/513: rename d4/d6/d37/l6b to d4/d11/d7d/d38/d91/d55/db1/lb7 0 2026-03-10T06:23:01.759 INFO:tasks.workunit.client.0.vm04.stdout:6/548: fdatasync d2/d3a/f50 0 2026-03-10T06:23:01.759 INFO:tasks.workunit.client.0.vm04.stdout:5/514: write d4/d6/d37/f39 [5188354,19233] 0 2026-03-10T06:23:01.760 INFO:tasks.workunit.client.0.vm04.stdout:6/549: write d2/fa0 [2185050,116972] 0 2026-03-10T06:23:01.764 INFO:tasks.workunit.client.0.vm04.stdout:5/515: dwrite d4/d11/f1f [0,4194304] 0 2026-03-10T06:23:01.772 INFO:tasks.workunit.client.0.vm04.stdout:1/546: dread d0/f5 [0,4194304] 0 2026-03-10T06:23:01.776 INFO:tasks.workunit.client.0.vm04.stdout:4/560: getdents d2/d16/d31/d3f 0 2026-03-10T06:23:01.785 INFO:tasks.workunit.client.0.vm04.stdout:8/561: symlink df/d15/d2b/d81/lae 0 
2026-03-10T06:23:01.795 INFO:tasks.workunit.client.0.vm04.stdout:8/562: stat df/d20/d25/d30/d55 0 2026-03-10T06:23:01.795 INFO:tasks.workunit.client.0.vm04.stdout:8/563: truncate df/f40 2492680 0 2026-03-10T06:23:01.795 INFO:tasks.workunit.client.0.vm04.stdout:1/547: rename d0/d8/d46/c5a to d0/d8/d46/d7a/d95/dc5/cc9 0 2026-03-10T06:23:01.795 INFO:tasks.workunit.client.0.vm04.stdout:1/548: write d0/d8/f32 [4004531,49853] 0 2026-03-10T06:23:01.800 INFO:tasks.workunit.client.0.vm04.stdout:0/580: creat d0/d1a/d20/d38/d31/d47/d8a/fbe x:0 0 0 2026-03-10T06:23:01.801 INFO:tasks.workunit.client.0.vm04.stdout:0/581: fdatasync d0/d1a/d20/f8c 0 2026-03-10T06:23:01.802 INFO:tasks.workunit.client.0.vm04.stdout:8/564: write df/d15/d29/da3/fa7 [368492,778] 0 2026-03-10T06:23:01.802 INFO:tasks.workunit.client.0.vm04.stdout:6/550: getdents d2/d3a/d5e/db5 0 2026-03-10T06:23:01.803 INFO:tasks.workunit.client.0.vm04.stdout:6/551: fdatasync d2/d37/d6e/f82 0 2026-03-10T06:23:01.805 INFO:tasks.workunit.client.0.vm04.stdout:5/516: creat d4/d11/d7d/dab/fb8 x:0 0 0 2026-03-10T06:23:01.805 INFO:tasks.workunit.client.0.vm04.stdout:5/517: dread - d4/d11/d7d/d38/f92 zero size 2026-03-10T06:23:01.806 INFO:tasks.workunit.client.0.vm04.stdout:5/518: write d4/d11/d7d/f90 [827736,126868] 0 2026-03-10T06:23:01.816 INFO:tasks.workunit.client.0.vm04.stdout:1/549: symlink d0/d8/d46/db3/lca 0 2026-03-10T06:23:01.816 INFO:tasks.workunit.client.0.vm04.stdout:3/528: write d4/d6/d99/d7b/f4b [601248,93735] 0 2026-03-10T06:23:01.822 INFO:tasks.workunit.client.0.vm04.stdout:0/582: mkdir d0/d5/d25/dd/d1d/d9c/dbf 0 2026-03-10T06:23:01.823 INFO:tasks.workunit.client.0.vm04.stdout:0/583: write d0/d5/d25/dd/f43 [183356,36216] 0 2026-03-10T06:23:01.823 INFO:tasks.workunit.client.0.vm04.stdout:8/565: creat df/d20/d25/faf x:0 0 0 2026-03-10T06:23:01.846 INFO:tasks.workunit.client.0.vm04.stdout:2/521: write d1/f10 [2107950,113140] 0 2026-03-10T06:23:01.847 INFO:tasks.workunit.client.0.vm04.stdout:2/522: write 
d1/db/d20/f86 [556643,4527] 0 2026-03-10T06:23:01.849 INFO:tasks.workunit.client.0.vm04.stdout:7/545: dwrite d4/df/d12/d13/d25/d30/d40/d50/f5b [0,4194304] 0 2026-03-10T06:23:01.851 INFO:tasks.workunit.client.0.vm04.stdout:6/552: creat d2/d43/d2d/d30/d1f/d3c/fb7 x:0 0 0 2026-03-10T06:23:01.851 INFO:tasks.workunit.client.0.vm04.stdout:7/546: chown d4/df/d12/d13/d8b/ca8 19 1 2026-03-10T06:23:01.852 INFO:tasks.workunit.client.0.vm04.stdout:6/553: write d2/d43/d2d/d30/d1f/f3f [1233244,27455] 0 2026-03-10T06:23:01.857 INFO:tasks.workunit.client.0.vm04.stdout:1/550: mkdir d0/d3/d41/dcb 0 2026-03-10T06:23:01.859 INFO:tasks.workunit.client.0.vm04.stdout:3/529: rmdir d4/d6/d99/d7b/d89 39 2026-03-10T06:23:01.863 INFO:tasks.workunit.client.0.vm04.stdout:3/530: dwrite d4/da/df/d11/d5a/d5b/f98 [0,4194304] 0 2026-03-10T06:23:01.874 INFO:tasks.workunit.client.0.vm04.stdout:8/566: truncate df/d20/d25/f2a 565186 0 2026-03-10T06:23:01.876 INFO:tasks.workunit.client.0.vm04.stdout:2/523: truncate d1/df/f5a 3993047 0 2026-03-10T06:23:01.889 INFO:tasks.workunit.client.0.vm04.stdout:9/585: truncate d2/d3/d18/ddd/f5b 160787 0 2026-03-10T06:23:01.892 INFO:tasks.workunit.client.0.vm04.stdout:9/586: dwrite d2/d3/d18/d34/f5f [4194304,4194304] 0 2026-03-10T06:23:01.893 INFO:tasks.workunit.client.0.vm04.stdout:9/587: dread - d2/d8/d53/d6e/d89/f9f zero size 2026-03-10T06:23:01.894 INFO:tasks.workunit.client.0.vm04.stdout:9/588: write d2/d8/d22/fd1 [1156621,9184] 0 2026-03-10T06:23:01.900 INFO:tasks.workunit.client.0.vm04.stdout:7/547: mkdir d4/df/d12/d34/d63/dd3 0 2026-03-10T06:23:01.909 INFO:tasks.workunit.client.0.vm04.stdout:1/551: mkdir d0/d8/d46/d7a/d95/dc5/dcc 0 2026-03-10T06:23:01.922 INFO:tasks.workunit.client.0.vm04.stdout:8/567: symlink df/d20/d25/d73/lb0 0 2026-03-10T06:23:01.924 INFO:tasks.workunit.client.0.vm04.stdout:8/568: dread df/d15/d29/da3/fa7 [0,4194304] 0 2026-03-10T06:23:01.927 INFO:tasks.workunit.client.0.vm04.stdout:8/569: dread df/d15/d29/d89/f8e [0,4194304] 0 
2026-03-10T06:23:01.931 INFO:tasks.workunit.client.0.vm04.stdout:4/561: write d2/f14 [6625748,23154] 0
2026-03-10T06:23:01.935 INFO:tasks.workunit.client.0.vm04.stdout:5/519: creat d4/d11/fb9 x:0 0 0
2026-03-10T06:23:01.938 INFO:tasks.workunit.client.0.vm04.stdout:9/589: stat d2/f49 0
2026-03-10T06:23:01.957 INFO:tasks.workunit.client.0.vm04.stdout:4/562: creat d2/d32/d5c/d4f/d51/fb0 x:0 0 0
2026-03-10T06:23:01.957 INFO:tasks.workunit.client.0.vm04.stdout:6/554: link d2/d43/c84 d2/d43/cb8 0
2026-03-10T06:23:01.958 INFO:tasks.workunit.client.0.vm04.stdout:1/552: symlink d0/d8/d46/d7a/d95/dc5/dcc/lcd 0
2026-03-10T06:23:01.958 INFO:tasks.workunit.client.0.vm04.stdout:0/584: getdents d0/d5/d25/dd/d5c/d73/d82 0
2026-03-10T06:23:01.961 INFO:tasks.workunit.client.0.vm04.stdout:0/585: dwrite d0/d5/f1f [0,4194304] 0
2026-03-10T06:23:01.970 INFO:tasks.workunit.client.0.vm04.stdout:2/524: creat d1/df/d11/d14/f9c x:0 0 0
2026-03-10T06:23:01.971 INFO:tasks.workunit.client.0.vm04.stdout:7/548: rename d4/df/d12/d13/d25/d28/d3a/d58/d68 to d4/df/d12/dd4 0
2026-03-10T06:23:01.972 INFO:tasks.workunit.client.0.vm04.stdout:7/549: chown d4/df/d12/d13/d25/f66 942 1
2026-03-10T06:23:01.973 INFO:tasks.workunit.client.0.vm04.stdout:4/563: symlink d2/d16/d31/d3f/d93/lb1 0
2026-03-10T06:23:01.974 INFO:tasks.workunit.client.0.vm04.stdout:4/564: write d2/d16/d31/f66 [5070434,56237] 0
2026-03-10T06:23:01.974 INFO:tasks.workunit.client.0.vm04.stdout:1/553: creat d0/d8/d46/d7a/fce x:0 0 0
2026-03-10T06:23:01.974 INFO:tasks.workunit.client.0.vm04.stdout:4/565: readlink d2/d46/l77 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:6/555: creat d2/d43/d2d/d30/d34/d76/d8a/fb9 x:0 0 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:8/570: link df/d15/d2b/d81/f9d df/d20/d25/d30/d70/d97/fb1 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:8/571: chown df/c13 3219 1
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:9/590: link d2/de0/f28 d2/d8/d3a/dcb/fe3 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:9/591: dread d2/d23/d24/f83 [0,4194304] 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:1/554: mkdir d0/d8/d46/dcf 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:1/555: chown d0/d3/f19 1256 1
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:2/525: link d1/db/f12 d1/df/d11/d14/d4e/f9d 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:8/572: symlink df/d20/lb2 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:8/573: read df/f1d [1140583,100420] 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:8/574: fdatasync df/d15/d2b/f4c 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:4/566: getdents d2/d16/d2c/d9a 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:6/556: creat d2/d3a/d9c/fba x:0 0 0
2026-03-10T06:23:01.991 INFO:tasks.workunit.client.0.vm04.stdout:9/592: dwrite d2/d3/d18/d39/d11/f35 [0,4194304] 0
2026-03-10T06:23:01.993 INFO:tasks.workunit.client.0.vm04.stdout:5/520: sync
2026-03-10T06:23:01.994 INFO:tasks.workunit.client.0.vm04.stdout:7/550: creat d4/df/d12/d13/d25/d28/fd5 x:0 0 0
2026-03-10T06:23:01.995 INFO:tasks.workunit.client.0.vm04.stdout:0/586: getdents d0/d1a/d20 0
2026-03-10T06:23:01.995 INFO:tasks.workunit.client.0.vm04.stdout:0/587: chown d0/d1a/d20/l29 25390 1
2026-03-10T06:23:01.999 INFO:tasks.workunit.client.0.vm04.stdout:4/567: sync
2026-03-10T06:23:02.007 INFO:tasks.workunit.client.0.vm04.stdout:9/593: dread d2/d3/d18/d39/d11/f35 [0,4194304] 0
2026-03-10T06:23:02.007 INFO:tasks.workunit.client.0.vm04.stdout:6/557: dread d2/d8/d78/f79 [0,4194304] 0
2026-03-10T06:23:02.011 INFO:tasks.workunit.client.0.vm04.stdout:6/558: truncate d2/d43/d2d/d30/f93 3448878 0
2026-03-10T06:23:02.011 INFO:tasks.workunit.client.0.vm04.stdout:6/559: write d2/d43/d2d/f42 [786474,107826] 0
2026-03-10T06:23:02.011 INFO:tasks.workunit.client.0.vm04.stdout:6/560: chown d2/d37/d83 267551029 1
2026-03-10T06:23:02.028 INFO:tasks.workunit.client.0.vm04.stdout:1/556: rename d0/c7d to d0/d3/cd0 0
2026-03-10T06:23:02.028 INFO:tasks.workunit.client.0.vm04.stdout:1/557: readlink d0/d3/l55 0
2026-03-10T06:23:02.029 INFO:tasks.workunit.client.0.vm04.stdout:1/558: chown d0/d8/d46/d7a/d95/cb9 1942879 1
2026-03-10T06:23:02.029 INFO:tasks.workunit.client.0.vm04.stdout:1/559: dread - d0/d3/f98 zero size
2026-03-10T06:23:02.029 INFO:tasks.workunit.client.0.vm04.stdout:3/531: write d4/d6/d99/d7b/d21/d32/d4e/f73 [705156,120724] 0
2026-03-10T06:23:02.030 INFO:tasks.workunit.client.0.vm04.stdout:1/560: fsync d0/d8/f43 0
2026-03-10T06:23:02.031 INFO:tasks.workunit.client.0.vm04.stdout:3/532: readlink d4/d6/d99/d7b/d21/d32/d39/l5d 0
2026-03-10T06:23:02.033 INFO:tasks.workunit.client.0.vm04.stdout:5/521: creat d4/d11/d7d/d38/d91/d55/fba x:0 0 0
2026-03-10T06:23:02.033 INFO:tasks.workunit.client.0.vm04.stdout:7/551: creat d4/df/d12/d13/d25/dcb/fd6 x:0 0 0
2026-03-10T06:23:02.059 INFO:tasks.workunit.client.0.vm04.stdout:6/561: symlink d2/d8/d78/lbb 0
2026-03-10T06:23:02.061 INFO:tasks.workunit.client.0.vm04.stdout:8/575: rename df/d15/d29/l7b to df/d15/d2b/d8a/lb3 0
2026-03-10T06:23:02.061 INFO:tasks.workunit.client.0.vm04.stdout:8/576: chown df/c14 1073279965 1
2026-03-10T06:23:02.065 INFO:tasks.workunit.client.0.vm04.stdout:7/552: creat d4/df/d12/d13/d25/d30/d40/d79/fd7 x:0 0 0
2026-03-10T06:23:02.066 INFO:tasks.workunit.client.0.vm04.stdout:2/526: link d1/db/d69/d74/l99 d1/db/d9b/l9e 0
2026-03-10T06:23:02.069 INFO:tasks.workunit.client.0.vm04.stdout:2/527: dwrite d1/db/d20/f49 [0,4194304] 0
2026-03-10T06:23:02.070 INFO:tasks.workunit.client.0.vm04.stdout:8/577: truncate df/d15/d2b/f4a 3485434 0
2026-03-10T06:23:02.071 INFO:tasks.workunit.client.0.vm04.stdout:6/562: dread - d2/d43/d2d/d30/d34/f52 zero size
2026-03-10T06:23:02.071 INFO:tasks.workunit.client.0.vm04.stdout:6/563: readlink d2/d43/lc 0
2026-03-10T06:23:02.082 INFO:tasks.workunit.client.0.vm04.stdout:4/568: link d2/d46/l77 d2/d16/d2c/d9a/lb2 0
2026-03-10T06:23:02.088 INFO:tasks.workunit.client.0.vm04.stdout:2/528: mkdir d1/df/d11/d14/d9f 0
2026-03-10T06:23:02.091 INFO:tasks.workunit.client.0.vm04.stdout:4/569: fdatasync d2/d46/f3c 0
2026-03-10T06:23:02.091 INFO:tasks.workunit.client.0.vm04.stdout:6/564: link d2/d3a/f90 d2/d43/d2d/d30/d1f/fbc 0
2026-03-10T06:23:02.092 INFO:tasks.workunit.client.0.vm04.stdout:1/561: link d0/d3/d41/d99/lbb d0/d3/d41/ld1 0
2026-03-10T06:23:02.093 INFO:tasks.workunit.client.0.vm04.stdout:5/522: getdents d4/d6/d80/d84/d99 0
2026-03-10T06:23:02.105 INFO:tasks.workunit.client.0.vm04.stdout:2/529: mkdir d1/df/d11/d14/d9f/da0 0
2026-03-10T06:23:02.105 INFO:tasks.workunit.client.0.vm04.stdout:2/530: truncate d1/db/d72/d94/f97 92793 0
2026-03-10T06:23:02.108 INFO:tasks.workunit.client.0.vm04.stdout:2/531: creat d1/db/d20/fa1 x:0 0 0
2026-03-10T06:23:02.110 INFO:tasks.workunit.client.0.vm04.stdout:4/570: rename d2/d46/ce to d2/d32/d5c/cb3 0
2026-03-10T06:23:02.112 INFO:tasks.workunit.client.0.vm04.stdout:6/565: link d2/d43/d9b/cb1 d2/d43/d2d/d30/d34/dae/cbd 0
2026-03-10T06:23:02.112 INFO:tasks.workunit.client.0.vm04.stdout:6/566: stat d2/d37/d6e 0
2026-03-10T06:23:02.116 INFO:tasks.workunit.client.0.vm04.stdout:4/571: creat d2/d16/d31/d3f/d93/fb4 x:0 0 0
2026-03-10T06:23:02.117 INFO:tasks.workunit.client.0.vm04.stdout:5/523: link d4/d11/d7d/d38/d91/d4c/c7f d4/d11/d7d/d38/d91/d55/db1/cbb 0
2026-03-10T06:23:02.121 INFO:tasks.workunit.client.0.vm04.stdout:6/567: link d2/d43/d2d/d30/d1f/d3c/f27 d2/d3a/d5e/db5/fbe 0
2026-03-10T06:23:02.121 INFO:tasks.workunit.client.0.vm04.stdout:6/568: chown d2/d43/d2d/d30/d1f/c25 54624885 1
2026-03-10T06:23:02.130 INFO:tasks.workunit.client.0.vm04.stdout:5/524: rename d4/d11/d7d/d38/d91/l9f to d4/d6/d80/lbc 0
2026-03-10T06:23:02.133 INFO:tasks.workunit.client.0.vm04.stdout:6/569: dread d2/d43/d2d/d30/f93 [0,4194304] 0
2026-03-10T06:23:02.143 INFO:tasks.workunit.client.0.vm04.stdout:4/572: creat d2/fb5 x:0 0 0
2026-03-10T06:23:02.186 INFO:tasks.workunit.client.0.vm04.stdout:6/570: dread d2/d37/d6e/f95 [0,4194304] 0
2026-03-10T06:23:02.191 INFO:tasks.workunit.client.0.vm04.stdout:6/571: dread d2/d3a/d5e/db5/fbe [0,4194304] 0
2026-03-10T06:23:02.196 INFO:tasks.workunit.client.0.vm04.stdout:6/572: dwrite d2/d43/d2d/d30/d1f/d3c/fb7 [0,4194304] 0
2026-03-10T06:23:02.200 INFO:tasks.workunit.client.0.vm04.stdout:4/573: creat d2/d16/d2c/d9a/fb6 x:0 0 0
2026-03-10T06:23:02.200 INFO:tasks.workunit.client.0.vm04.stdout:4/574: write d2/d16/f20 [1890120,88964] 0
2026-03-10T06:23:02.203 INFO:tasks.workunit.client.0.vm04.stdout:6/573: dread d2/d43/d2d/d30/d1f/d3c/f27 [0,4194304] 0
2026-03-10T06:23:02.211 INFO:tasks.workunit.client.0.vm04.stdout:4/575: mkdir d2/d16/d2c/d9a/db7 0
2026-03-10T06:23:02.212 INFO:tasks.workunit.client.0.vm04.stdout:6/574: stat d2/d43/d2d/d30/f91 0
2026-03-10T06:23:02.225 INFO:tasks.workunit.client.0.vm04.stdout:6/575: write d2/d3a/d5e/db5/fbe [1626084,73379] 0
2026-03-10T06:23:02.234 INFO:tasks.workunit.client.0.vm04.stdout:4/576: rmdir d2/d16/d2c/d9a/db7 0
2026-03-10T06:23:02.237 INFO:tasks.workunit.client.0.vm04.stdout:4/577: dwrite d2/d16/f73 [0,4194304] 0
2026-03-10T06:23:02.257 INFO:tasks.workunit.client.0.vm04.stdout:6/576: mkdir d2/d43/d2d/d30/d1f/d3c/d85/dbf 0
2026-03-10T06:23:02.257 INFO:tasks.workunit.client.0.vm04.stdout:6/577: write d2/d43/d2d/d30/f39 [1890904,6146] 0
2026-03-10T06:23:02.322 INFO:tasks.workunit.client.0.vm04.stdout:7/553: fsync d4/df/d12/d13/d25/dcb/fd6 0
2026-03-10T06:23:02.336 INFO:tasks.workunit.client.0.vm04.stdout:0/588: write d0/d5/d25/dd/d5c/d73/f61 [398007,88775] 0
2026-03-10T06:23:02.349 INFO:tasks.workunit.client.0.vm04.stdout:9/594: write d2/d3/d18/d39/d46/fc4 [131340,111034] 0
2026-03-10T06:23:02.417 INFO:tasks.workunit.client.0.vm04.stdout:3/533: dwrite d4/d6/d99/d7b/d21/f9a [0,4194304] 0
2026-03-10T06:23:02.480 INFO:tasks.workunit.client.0.vm04.stdout:1/562: rmdir d0/d3 39
2026-03-10T06:23:02.558 INFO:tasks.workunit.client.0.vm04.stdout:2/532: truncate d1/df/d11/f7e 1344396 0
2026-03-10T06:23:02.559 INFO:tasks.workunit.client.0.vm04.stdout:7/554: write d4/df/d12/d13/d25/d28/d36/f4d [3086173,114963] 0
2026-03-10T06:23:02.560 INFO:tasks.workunit.client.0.vm04.stdout:2/533: dread d1/db/d20/d8f/d35/d54/d5d/f93 [0,4194304] 0
2026-03-10T06:23:02.573 INFO:tasks.workunit.client.0.vm04.stdout:9/595: rename d2/de0/l67 to d2/d23/d24/da9/le4 0
2026-03-10T06:23:02.576 INFO:tasks.workunit.client.0.vm04.stdout:3/534: mknod d4/da/df/d11/cb1 0
2026-03-10T06:23:02.576 INFO:tasks.workunit.client.0.vm04.stdout:5/525: dwrite d4/f35 [0,4194304] 0
2026-03-10T06:23:02.580 INFO:tasks.workunit.client.0.vm04.stdout:5/526: chown d4/d11/d7d/d38/d91/d4c/d98/faf 0 1
2026-03-10T06:23:02.584 INFO:tasks.workunit.client.0.vm04.stdout:1/563: dwrite d0/d3/d41/fb8 [0,4194304] 0
2026-03-10T06:23:02.608 INFO:tasks.workunit.client.0.vm04.stdout:4/578: link d2/d16/d2c/l8b d2/d16/d2c/d6b/lb8 0
2026-03-10T06:23:02.619 INFO:tasks.workunit.client.0.vm04.stdout:8/578: creat df/d20/d25/d30/d65/d8f/fb4 x:0 0 0
2026-03-10T06:23:02.620 INFO:tasks.workunit.client.0.vm04.stdout:0/589: dread d0/d1a/d20/d38/d31/d47/f54 [0,4194304] 0
2026-03-10T06:23:02.637 INFO:tasks.workunit.client.0.vm04.stdout:2/534: symlink d1/db/d69/la2 0
2026-03-10T06:23:02.638 INFO:tasks.workunit.client.0.vm04.stdout:2/535: write d1/df/d11/d14/d4e/f5c [3265343,105113] 0
2026-03-10T06:23:02.649 INFO:tasks.workunit.client.0.vm04.stdout:3/535: creat d4/d6/d99/d7b/d21/d2c/fb2 x:0 0 0
2026-03-10T06:23:02.650 INFO:tasks.workunit.client.0.vm04.stdout:5/527: unlink d4/d11/d7d/d38/d91/d4c/fa1 0
2026-03-10T06:23:02.651 INFO:tasks.workunit.client.0.vm04.stdout:1/564: mkdir d0/d8/d46/db3/dd2 0
2026-03-10T06:23:02.656 INFO:tasks.workunit.client.0.vm04.stdout:8/579: unlink df/d20/d25/d30/d65/d8f/fa2 0
2026-03-10T06:23:02.657 INFO:tasks.workunit.client.0.vm04.stdout:5/528: dwrite d4/d6/f33 [0,4194304] 0
2026-03-10T06:23:02.675 INFO:tasks.workunit.client.0.vm04.stdout:9/596: mkdir d2/de5 0
2026-03-10T06:23:02.675 INFO:tasks.workunit.client.0.vm04.stdout:2/536: rmdir d1/db/d9b 39
2026-03-10T06:23:02.678 INFO:tasks.workunit.client.0.vm04.stdout:6/578: getdents d2/d3a/d9c 0
2026-03-10T06:23:02.682 INFO:tasks.workunit.client.0.vm04.stdout:7/555: rename d4/df/d12/d13/d25/d28/d36 to d4/df/dd8 0
2026-03-10T06:23:02.683 INFO:tasks.workunit.client.0.vm04.stdout:8/580: fsync df/d15/f1b 0
2026-03-10T06:23:02.683 INFO:tasks.workunit.client.0.vm04.stdout:1/565: truncate d0/d8/d46/f57 100520 0
2026-03-10T06:23:02.683 INFO:tasks.workunit.client.0.vm04.stdout:9/597: creat d2/d8/d3a/dcb/fe6 x:0 0 0
2026-03-10T06:23:02.689 INFO:tasks.workunit.client.0.vm04.stdout:4/579: rename d2/d32/d5c/d4f/d51 to d2/d16/d31/d42/db9 0
2026-03-10T06:23:02.692 INFO:tasks.workunit.client.0.vm04.stdout:4/580: chown d2/d32/d5c/c5f 331728627 1
2026-03-10T06:23:02.692 INFO:tasks.workunit.client.0.vm04.stdout:6/579: mkdir d2/d43/d2d/d30/dc0 0
2026-03-10T06:23:02.692 INFO:tasks.workunit.client.0.vm04.stdout:9/598: unlink d2/d8/d53/d6e/d89/ca4 0
2026-03-10T06:23:02.692 INFO:tasks.workunit.client.0.vm04.stdout:7/556: dread d4/df/d12/d13/d25/d28/d3a/f73 [0,4194304] 0
2026-03-10T06:23:02.696 INFO:tasks.workunit.client.0.vm04.stdout:1/566: dwrite d0/d8/f76 [4194304,4194304] 0
2026-03-10T06:23:02.696 INFO:tasks.workunit.client.0.vm04.stdout:1/567: write d0/d3/f98 [127087,46597] 0
2026-03-10T06:23:02.706 INFO:tasks.workunit.client.0.vm04.stdout:5/529: sync
2026-03-10T06:23:02.706 INFO:tasks.workunit.client.0.vm04.stdout:3/536: sync
2026-03-10T06:23:02.714 INFO:tasks.workunit.client.0.vm04.stdout:1/568: dread d0/d3/d41/d4b/d5b/f6f [0,4194304] 0
2026-03-10T06:23:02.747 INFO:tasks.workunit.client.0.vm04.stdout:2/537: rename d1/db/d20/d8f/f19 to d1/db/d9b/fa3 0
2026-03-10T06:23:02.750 INFO:tasks.workunit.client.0.vm04.stdout:9/599: rmdir d2 39
2026-03-10T06:23:02.750 INFO:tasks.workunit.client.0.vm04.stdout:6/580: mkdir d2/d37/d83/dc1 0
2026-03-10T06:23:02.760 INFO:tasks.workunit.client.0.vm04.stdout:4/581: symlink d2/d16/d2c/d9a/lba 0
2026-03-10T06:23:02.780 INFO:tasks.workunit.client.0.vm04.stdout:5/530: rename d4/d11/d7d/d38/f92 to d4/d11/d7d/d38/d91/d55/fbd 0
2026-03-10T06:23:02.803 INFO:tasks.workunit.client.0.vm04.stdout:0/590: write d0/d1a/d20/f85 [197315,92053] 0
2026-03-10T06:23:02.804 INFO:tasks.workunit.client.0.vm04.stdout:0/591: dread - d0/d5/d25/dd/d5c/fb2 zero size
2026-03-10T06:23:02.806 INFO:tasks.workunit.client.0.vm04.stdout:7/557: fdatasync d4/df/d12/d34/f46 0
2026-03-10T06:23:02.816 INFO:tasks.workunit.client.0.vm04.stdout:4/582: unlink d2/l8a 0
2026-03-10T06:23:02.823 INFO:tasks.workunit.client.0.vm04.stdout:5/531: mkdir d4/d11/d7d/d38/d51/dbe 0
2026-03-10T06:23:02.823 INFO:tasks.workunit.client.0.vm04.stdout:5/532: chown d4/d11/d7d/f30 0 1
2026-03-10T06:23:02.825 INFO:tasks.workunit.client.0.vm04.stdout:5/533: dread d4/f35 [0,4194304] 0
2026-03-10T06:23:02.834 INFO:tasks.workunit.client.0.vm04.stdout:9/600: mkdir d2/d23/d24/de7 0
2026-03-10T06:23:02.834 INFO:tasks.workunit.client.0.vm04.stdout:9/601: chown d2/de0/da3/cb0 5458 1
2026-03-10T06:23:02.846 INFO:tasks.workunit.client.0.vm04.stdout:4/583: mknod d2/d8/cbb 0
2026-03-10T06:23:02.846 INFO:tasks.workunit.client.0.vm04.stdout:4/584: readlink d2/d32/l36 0
2026-03-10T06:23:02.856 INFO:tasks.workunit.client.0.vm04.stdout:3/537: mkdir d4/da/df/d11/d5a/db3 0
2026-03-10T06:23:02.856 INFO:tasks.workunit.client.0.vm04.stdout:3/538: write d4/d6/d99/d7b/f47 [4153897,111764] 0
2026-03-10T06:23:02.874 INFO:tasks.workunit.client.0.vm04.stdout:9/602: mknod d2/d8/d53/d6e/d8d/ce8 0
2026-03-10T06:23:02.876 INFO:tasks.workunit.client.0.vm04.stdout:0/592: mkdir d0/d5/d97/dc0 0
2026-03-10T06:23:02.883 INFO:tasks.workunit.client.0.vm04.stdout:8/581: dwrite df/d20/d25/d30/d65/f80 [0,4194304] 0
2026-03-10T06:23:02.883 INFO:tasks.workunit.client.0.vm04.stdout:4/585: creat d2/d16/d31/d42/fbc x:0 0 0
2026-03-10T06:23:02.898 INFO:tasks.workunit.client.0.vm04.stdout:5/534: getdents d4/d3b/da8 0
2026-03-10T06:23:02.902 INFO:tasks.workunit.client.0.vm04.stdout:0/593: mknod d0/d1a/cc1 0
2026-03-10T06:23:02.903 INFO:tasks.workunit.client.0.vm04.stdout:0/594: dread - d0/d5/d25/dd/d3a/d56/fa7 zero size
2026-03-10T06:23:02.903 INFO:tasks.workunit.client.0.vm04.stdout:0/595: read - d0/d1a/d20/d38/fb4 zero size
2026-03-10T06:23:02.904 INFO:tasks.workunit.client.0.vm04.stdout:0/596: write d0/d5/d25/f23 [1291992,13163] 0
2026-03-10T06:23:02.907 INFO:tasks.workunit.client.0.vm04.stdout:0/597: dread d0/d5/d25/dd/d92/fa4 [0,4194304] 0
2026-03-10T06:23:02.907 INFO:tasks.workunit.client.0.vm04.stdout:0/598: chown d0/d5/d25/dd/d3a/d81/cbd 38381 1
2026-03-10T06:23:02.908 INFO:tasks.workunit.client.0.vm04.stdout:0/599: dread - d0/d1a/d20/fb9 zero size
2026-03-10T06:23:02.918 INFO:tasks.workunit.client.0.vm04.stdout:1/569: dwrite d0/d3/f44 [0,4194304] 0
2026-03-10T06:23:02.919 INFO:tasks.workunit.client.0.vm04.stdout:1/570: chown d0/d8/d46/db3/fc6 5913258 1
2026-03-10T06:23:02.953 INFO:tasks.workunit.client.0.vm04.stdout:6/581: write d2/d43/d2d/d30/d34/d76/d7e/f81 [9027134,104666] 0
2026-03-10T06:23:02.955 INFO:tasks.workunit.client.0.vm04.stdout:6/582: dwrite d2/d37/d6e/f70 [0,4194304] 0
2026-03-10T06:23:02.957 INFO:tasks.workunit.client.0.vm04.stdout:3/539: mkdir d4/d6/d99/d7b/d89/db4 0
2026-03-10T06:23:02.962 INFO:tasks.workunit.client.0.vm04.stdout:3/540: dwrite d4/d6/dc/f22 [0,4194304] 0
2026-03-10T06:23:02.979 INFO:tasks.workunit.client.0.vm04.stdout:7/558: dwrite d4/f7a [0,4194304] 0
2026-03-10T06:23:03.006 INFO:tasks.workunit.client.0.vm04.stdout:0/600: mkdir d0/d1a/d20/dc2 0
2026-03-10T06:23:03.012 INFO:tasks.workunit.client.0.vm04.stdout:1/571: creat d0/d3/d41/d4b/fd3 x:0 0 0
2026-03-10T06:23:03.017 INFO:tasks.workunit.client.0.vm04.stdout:2/538: truncate d1/df/d11/f7e 1866237 0
2026-03-10T06:23:03.018 INFO:tasks.workunit.client.0.vm04.stdout:2/539: stat d1/l17 0
2026-03-10T06:23:03.018 INFO:tasks.workunit.client.0.vm04.stdout:2/540: chown d1/df/d11/d14/d9f/da0 9 1
2026-03-10T06:23:03.021 INFO:tasks.workunit.client.0.vm04.stdout:1/572: dread d0/d3/d41/d4b/d5b/fb6 [0,4194304] 0
2026-03-10T06:23:03.025 INFO:tasks.workunit.client.0.vm04.stdout:3/541: rmdir d4/d6/d99/d7b/d21/d32/d39 39
2026-03-10T06:23:03.034 INFO:tasks.workunit.client.0.vm04.stdout:5/535: symlink d4/d11/d7d/dae/lbf 0
2026-03-10T06:23:03.059 INFO:tasks.workunit.client.0.vm04.stdout:8/582: write df/d15/d2b/f33 [3147024,56881] 0
2026-03-10T06:23:03.063 INFO:tasks.workunit.client.0.vm04.stdout:9/603: write d2/d3/d18/ddd/f5b [269275,40214] 0
2026-03-10T06:23:03.071 INFO:tasks.workunit.client.0.vm04.stdout:9/604: dread d2/d3/d18/d39/f20 [0,4194304] 0
2026-03-10T06:23:03.080 INFO:tasks.workunit.client.0.vm04.stdout:4/586: rename d2/d16/d31/c7d to d2/cbd 0
2026-03-10T06:23:03.081 INFO:tasks.workunit.client.0.vm04.stdout:4/587: chown d2/d32/d5c/d76/f95 49166 1
2026-03-10T06:23:03.084 INFO:tasks.workunit.client.0.vm04.stdout:4/588: dwrite d2/d46/f5d [4194304,4194304] 0
2026-03-10T06:23:03.085 INFO:tasks.workunit.client.0.vm04.stdout:4/589: fsync d2/d16/da3/fa4 0
2026-03-10T06:23:03.086 INFO:tasks.workunit.client.0.vm04.stdout:4/590: chown d2/d32/d94/d99 0 1
2026-03-10T06:23:03.093 INFO:tasks.workunit.client.0.vm04.stdout:2/541: write d1/db/d72/f83 [5564127,13097] 0
2026-03-10T06:23:03.104 INFO:tasks.workunit.client.0.vm04.stdout:5/536: rmdir d4/d11/d7d/d38 39
2026-03-10T06:23:03.107 INFO:tasks.workunit.client.0.vm04.stdout:5/537: dread d4/d11/f1f [0,4194304] 0
2026-03-10T06:23:03.112 INFO:tasks.workunit.client.0.vm04.stdout:8/583: mknod df/d20/d25/d30/d65/cb5 0
2026-03-10T06:23:03.114 INFO:tasks.workunit.client.0.vm04.stdout:7/559: rename d4/df/d12/f14 to d4/df/d12/d21/fd9 0
2026-03-10T06:23:03.124 INFO:tasks.workunit.client.0.vm04.stdout:6/583: link d2/d37/cb0 d2/d43/d2d/d30/d34/cc2 0
2026-03-10T06:23:03.126 INFO:tasks.workunit.client.0.vm04.stdout:5/538: unlink d4/d11/d7d/d52/l82 0
2026-03-10T06:23:03.129 INFO:tasks.workunit.client.0.vm04.stdout:6/584: fsync d2/d43/f24 0
2026-03-10T06:23:03.129 INFO:tasks.workunit.client.0.vm04.stdout:1/573: creat d0/d3/fd4 x:0 0 0
2026-03-10T06:23:03.131 INFO:tasks.workunit.client.0.vm04.stdout:1/574: chown d0/d8/d46/d7a/fa8 1900 1
2026-03-10T06:23:03.131 INFO:tasks.workunit.client.0.vm04.stdout:6/585: dread - d2/d43/d2d/d30/d34/d76/d8a/fb9 zero size
2026-03-10T06:23:03.131 INFO:tasks.workunit.client.0.vm04.stdout:6/586: chown d2/d43/f69 168 1
2026-03-10T06:23:03.132 INFO:tasks.workunit.client.0.vm04.stdout:6/587: fdatasync d2/d37/d6e/fa9 0
2026-03-10T06:23:03.133 INFO:tasks.workunit.client.0.vm04.stdout:6/588: write d2/d43/d2d/d30/f60 [672919,1942] 0
2026-03-10T06:23:03.141 INFO:tasks.workunit.client.0.vm04.stdout:1/575: rmdir d0/d3/d41/d4b/d5b 39
2026-03-10T06:23:03.150 INFO:tasks.workunit.client.0.vm04.stdout:3/542: rename d4/da/df/d11/d5a/d5b/l87 to d4/da/df/lb5 0
2026-03-10T06:23:03.151 INFO:tasks.workunit.client.0.vm04.stdout:3/543: write d4/d6/d99/d7b/d21/f9a [5156808,88429] 0
2026-03-10T06:23:03.151 INFO:tasks.workunit.client.0.vm04.stdout:6/589: link d2/d43/d2d/d30/d34/da8/fb4 d2/d8/fc3 0
2026-03-10T06:23:03.151 INFO:tasks.workunit.client.0.vm04.stdout:8/584: getdents df/d15/d2b/d81 0
2026-03-10T06:23:03.151 INFO:tasks.workunit.client.0.vm04.stdout:1/576: unlink d0/d3/d80/laa 0
2026-03-10T06:23:03.151 INFO:tasks.workunit.client.0.vm04.stdout:3/544: fdatasync d4/d6/dc/f41 0
2026-03-10T06:23:03.152 INFO:tasks.workunit.client.0.vm04.stdout:8/585: mknod df/d15/d29/d89/cb6 0
2026-03-10T06:23:03.152 INFO:tasks.workunit.client.0.vm04.stdout:7/560: sync
2026-03-10T06:23:03.156 INFO:tasks.workunit.client.0.vm04.stdout:6/590: creat d2/d43/d86/fc4 x:0 0 0
2026-03-10T06:23:03.158 INFO:tasks.workunit.client.0.vm04.stdout:9/605: rename d2/d23/d24 to d2/d3/d18/de9 0
2026-03-10T06:23:03.159 INFO:tasks.workunit.client.0.vm04.stdout:3/545: unlink d4/da/df/c6f 0
2026-03-10T06:23:03.161 INFO:tasks.workunit.client.0.vm04.stdout:8/586: stat df/d15/d2b/d81/f9d 0
2026-03-10T06:23:03.163 INFO:tasks.workunit.client.0.vm04.stdout:6/591: sync
2026-03-10T06:23:03.163 INFO:tasks.workunit.client.0.vm04.stdout:9/606: sync
2026-03-10T06:23:03.167 INFO:tasks.workunit.client.0.vm04.stdout:8/587: symlink df/d15/d29/d89/lb7 0
2026-03-10T06:23:03.167 INFO:tasks.workunit.client.0.vm04.stdout:6/592: write d2/d37/d6e/fa9 [599722,734] 0
2026-03-10T06:23:03.169 INFO:tasks.workunit.client.0.vm04.stdout:7/561: mknod d4/df/d12/d13/d25/cda 0
2026-03-10T06:23:03.171 INFO:tasks.workunit.client.0.vm04.stdout:7/562: stat d4/c11 0
2026-03-10T06:23:03.172 INFO:tasks.workunit.client.0.vm04.stdout:7/563: stat d4/df/d12/dd4 0
2026-03-10T06:23:03.184 INFO:tasks.workunit.client.0.vm04.stdout:1/577: truncate d0/d3/f19 1963478 0
2026-03-10T06:23:03.189 INFO:tasks.workunit.client.0.vm04.stdout:9/607: symlink d2/d3/d18/d34/lea 0
2026-03-10T06:23:03.195 INFO:tasks.workunit.client.0.vm04.stdout:0/601: dwrite d0/d5/d25/dd/d5c/f7a [0,4194304] 0
2026-03-10T06:23:03.202 INFO:tasks.workunit.client.0.vm04.stdout:8/588: dwrite df/d15/d29/f6f [0,4194304] 0
2026-03-10T06:23:03.204 INFO:tasks.workunit.client.0.vm04.stdout:8/589: read - df/d15/d2b/d81/f9d zero size
2026-03-10T06:23:03.206 INFO:tasks.workunit.client.0.vm04.stdout:8/590: write df/d15/d2b/f2f [5716488,90927] 0
2026-03-10T06:23:03.217 INFO:tasks.workunit.client.0.vm04.stdout:3/546: creat d4/da/df/fb6 x:0 0 0
2026-03-10T06:23:03.218 INFO:tasks.workunit.client.0.vm04.stdout:3/547: readlink d4/da/df/l90 0
2026-03-10T06:23:03.218 INFO:tasks.workunit.client.0.vm04.stdout:6/593: unlink d2/d43/d2d/d7c/l7d 0
2026-03-10T06:23:03.221 INFO:tasks.workunit.client.0.vm04.stdout:0/602: mknod d0/d1a/d4d/cc3 0
2026-03-10T06:23:03.222 INFO:tasks.workunit.client.0.vm04.stdout:8/591: mkdir df/d15/d29/da3/db8 0
2026-03-10T06:23:03.223 INFO:tasks.workunit.client.0.vm04.stdout:1/578: symlink d0/d8/d46/dcf/ld5 0
2026-03-10T06:23:03.225 INFO:tasks.workunit.client.0.vm04.stdout:3/548: fsync d4/d6/d99/d7b/d21/f3a 0
2026-03-10T06:23:03.227 INFO:tasks.workunit.client.0.vm04.stdout:3/549: read d4/d6/d99/d7b/d21/d32/d4e/f73 [591862,16037] 0
2026-03-10T06:23:03.230 INFO:tasks.workunit.client.0.vm04.stdout:8/592: symlink df/d20/d25/d30/d70/lb9 0
2026-03-10T06:23:03.231 INFO:tasks.workunit.client.0.vm04.stdout:9/608: link d2/d3/l45 d2/d8/d3a/leb 0
2026-03-10T06:23:03.233 INFO:tasks.workunit.client.0.vm04.stdout:0/603: creat d0/d5/d97/dc0/fc4 x:0 0 0
2026-03-10T06:23:03.234 INFO:tasks.workunit.client.0.vm04.stdout:3/550: fsync d4/d6/dc/f5c 0
2026-03-10T06:23:03.244 INFO:tasks.workunit.client.0.vm04.stdout:9/609: dread d2/d3/d18/fa0 [0,4194304] 0
2026-03-10T06:23:03.257 INFO:tasks.workunit.client.0.vm04.stdout:4/591: write d2/d32/d5c/f41 [2441837,48639] 0
2026-03-10T06:23:03.257 INFO:tasks.workunit.client.0.vm04.stdout:2/542: write d1/db/d72/f7a [456912,124693] 0
2026-03-10T06:23:03.257 INFO:tasks.workunit.client.0.vm04.stdout:9/610: dwrite d2/de0/f40 [0,4194304] 0
2026-03-10T06:23:03.260 INFO:tasks.workunit.client.0.vm04.stdout:3/551: creat d4/d6/dc/fb7 x:0 0 0
2026-03-10T06:23:03.261 INFO:tasks.workunit.client.0.vm04.stdout:4/592: chown d2/d16/d56/fa7 9144 1
2026-03-10T06:23:03.262 INFO:tasks.workunit.client.0.vm04.stdout:2/543: write d1/f10 [751373,66226] 0
2026-03-10T06:23:03.265 INFO:tasks.workunit.client.0.vm04.stdout:9/611: truncate d2/d3/d18/de9/fbe 578447 0
2026-03-10T06:23:03.273 INFO:tasks.workunit.client.0.vm04.stdout:5/539: rmdir d4/d11 39
2026-03-10T06:23:03.350 INFO:tasks.workunit.client.0.vm04.stdout:7/564: write d4/df/d12/d34/f80 [106053,79722] 0
2026-03-10T06:23:03.383 INFO:tasks.workunit.client.0.vm04.stdout:1/579: getdents d0/d8/d46/dcf 0
2026-03-10T06:23:03.390 INFO:tasks.workunit.client.0.vm04.stdout:6/594: truncate d2/d43/d2d/d30/f60 2900855 0
2026-03-10T06:23:03.391 INFO:tasks.workunit.client.0.vm04.stdout:6/595: readlink d2/d43/d2d/d30/d1f/l53 0
2026-03-10T06:23:03.409 INFO:tasks.workunit.client.0.vm04.stdout:8/593: truncate df/f17 4400701 0
2026-03-10T06:23:03.416 INFO:tasks.workunit.client.0.vm04.stdout:8/594: dread df/f3f [0,4194304] 0
2026-03-10T06:23:03.419 INFO:tasks.workunit.client.0.vm04.stdout:0/604: creat d0/d5/fc5 x:0 0 0
2026-03-10T06:23:03.420 INFO:tasks.workunit.client.0.vm04.stdout:0/605: fsync d0/d5/d25/dd/d3a/d56/f84 0
2026-03-10T06:23:03.425 INFO:tasks.workunit.client.0.vm04.stdout:8/595: dread df/d15/d2b/f4c [0,4194304] 0
2026-03-10T06:23:03.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:03 vm04.local ceph-mon[51058]: pgmap v30: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 88 MiB/s wr, 189 op/s
2026-03-10T06:23:03.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:03 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:03.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:03 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:03.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:03 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:23:03.428 INFO:tasks.workunit.client.0.vm04.stdout:0/606: dread d0/d5/d25/dd/d3a/d56/f84 [0,4194304] 0
2026-03-10T06:23:03.429 INFO:tasks.workunit.client.0.vm04.stdout:0/607: readlink d0/d5/d25/dd/d5c/d73/l68 0
2026-03-10T06:23:03.440 INFO:tasks.workunit.client.0.vm04.stdout:3/552: creat d4/d6/d38/fb8 x:0 0 0
2026-03-10T06:23:03.453 INFO:tasks.workunit.client.0.vm04.stdout:7/565: dread - d4/df/d12/d13/d25/f95 zero size
2026-03-10T06:23:03.453 INFO:tasks.workunit.client.0.vm04.stdout:1/580: truncate d0/d3/d41/f8a 33266 0
2026-03-10T06:23:03.461 INFO:tasks.workunit.client.0.vm04.stdout:6/596: unlink d2/d43/d2d/d30/d1f/d3c/c5c 0
2026-03-10T06:23:03.464 INFO:tasks.workunit.client.0.vm04.stdout:6/597: dwrite d2/d3a/d5e/db5/fbe [0,4194304] 0
2026-03-10T06:23:03.499 INFO:tasks.workunit.client.0.vm04.stdout:4/593: mknod d2/cbe 0
2026-03-10T06:23:03.499 INFO:tasks.workunit.client.0.vm04.stdout:3/553: dread d4/d6/d99/d7b/f2b [0,4194304] 0
2026-03-10T06:23:03.499 INFO:tasks.workunit.client.0.vm04.stdout:3/554: stat d4/d6/dc/f22 0
2026-03-10T06:23:03.501 INFO:tasks.workunit.client.0.vm04.stdout:5/540: truncate d4/d6/d37/f62 1866237 0
2026-03-10T06:23:03.503 INFO:tasks.workunit.client.0.vm04.stdout:7/566: mknod d4/df/dd8/d9c/cdb 0
2026-03-10T06:23:03.504 INFO:tasks.workunit.client.0.vm04.stdout:1/581: creat d0/d8/d46/d7a/d95/fd6 x:0 0 0
2026-03-10T06:23:03.511 INFO:tasks.workunit.client.0.vm04.stdout:8/596: unlink df/d20/d25/d30/d55/c62 0
2026-03-10T06:23:03.514 INFO:tasks.workunit.client.0.vm04.stdout:4/594: unlink d2/d16/f73 0
2026-03-10T06:23:03.516 INFO:tasks.workunit.client.0.vm04.stdout:0/608: write d0/d1a/d20/d38/fb4 [798165,83289] 0
2026-03-10T06:23:03.516 INFO:tasks.workunit.client.0.vm04.stdout:4/595: write d2/d16/d31/d3f/d93/f9c [28390,3530] 0
2026-03-10T06:23:03.516 INFO:tasks.workunit.client.0.vm04.stdout:5/541: read d4/d11/f34 [2979831,83202] 0
2026-03-10T06:23:03.516 INFO:tasks.workunit.client.0.vm04.stdout:3/555: stat d4/da/df/d11/f57 0
2026-03-10T06:23:03.521 INFO:tasks.workunit.client.0.vm04.stdout:3/556: readlink d4/d6/d99/l52 0
2026-03-10T06:23:03.523 INFO:tasks.workunit.client.0.vm04.stdout:7/567: creat d4/df/dd8/d9c/db1/fdc x:0 0 0
2026-03-10T06:23:03.529 INFO:tasks.workunit.client.0.vm04.stdout:6/598: creat d2/d43/d2d/d30/d1f/db6/fc5 x:0 0 0
2026-03-10T06:23:03.532 INFO:tasks.workunit.client.0.vm04.stdout:1/582: dread d0/f5 [0,4194304] 0
2026-03-10T06:23:03.536 INFO:tasks.workunit.client.0.vm04.stdout:2/544: getdents d1/db/d20/d8f/d48 0
2026-03-10T06:23:03.540 INFO:tasks.workunit.client.0.vm04.stdout:7/568: creat d4/df/d12/d13/d8b/fdd x:0 0 0
2026-03-10T06:23:03.544 INFO:tasks.workunit.client.0.vm04.stdout:5/542: sync
2026-03-10T06:23:03.557 INFO:tasks.workunit.client.0.vm04.stdout:9/612: write d2/d3/d18/de9/da9/fc2 [1081226,103789] 0
2026-03-10T06:23:03.582 INFO:tasks.workunit.client.0.vm04.stdout:0/609: dwrite d0/d5/d25/dd/f13 [0,4194304] 0
2026-03-10T06:23:03.583 INFO:tasks.workunit.client.0.vm04.stdout:8/597: dwrite df/d20/d25/f54 [0,4194304] 0
2026-03-10T06:23:03.584 INFO:tasks.workunit.client.0.vm04.stdout:8/598: readlink df/d20/d25/d73/lb0 0
2026-03-10T06:23:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:03 vm06.local ceph-mon[58974]: pgmap v30: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 88 MiB/s wr, 189 op/s
2026-03-10T06:23:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:03 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:03 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:03 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:23:03.619 INFO:tasks.workunit.client.0.vm04.stdout:7/569: mkdir d4/df/dd8/d9c/db1/dde 0
2026-03-10T06:23:03.631 INFO:tasks.workunit.client.0.vm04.stdout:9/613: unlink d2/d3/d18/de9/f37 0
2026-03-10T06:23:03.637 INFO:tasks.workunit.client.0.vm04.stdout:4/596: truncate d2/d32/d5c/f6d 4372215 0
2026-03-10T06:23:03.640 INFO:tasks.workunit.client.0.vm04.stdout:4/597: write d2/d16/d31/d3f/f64 [981603,62986] 0
2026-03-10T06:23:03.665 INFO:tasks.workunit.client.0.vm04.stdout:8/599: dwrite df/d20/d25/d30/d65/f94 [0,4194304] 0
2026-03-10T06:23:03.679 INFO:tasks.workunit.client.0.vm04.stdout:0/610: dwrite d0/d5/d25/dd/d1d/f30 [0,4194304] 0
2026-03-10T06:23:03.688 INFO:tasks.workunit.client.0.vm04.stdout:6/599: mknod d2/d3a/d5e/db5/cc6 0
2026-03-10T06:23:03.698 INFO:tasks.workunit.client.0.vm04.stdout:1/583: mknod d0/d8/d46/db3/dd2/cd7 0
2026-03-10T06:23:03.698 INFO:tasks.workunit.client.0.vm04.stdout:0/611: creat d0/d5/d25/dd/fc6 x:0 0 0
2026-03-10T06:23:03.701 INFO:tasks.workunit.client.0.vm04.stdout:2/545: rename l0 to d1/la4 0
2026-03-10T06:23:03.702 INFO:tasks.workunit.client.0.vm04.stdout:3/557: rename d4/d6/d99 to d4/d6/d99/d7b/d21/d32/d39/d64/db9 22
2026-03-10T06:23:03.703 INFO:tasks.workunit.client.0.vm04.stdout:9/614: creat d2/d3/d18/de9/de7/fec x:0 0 0
2026-03-10T06:23:03.710 INFO:tasks.workunit.client.0.vm04.stdout:0/612: dwrite d0/d5/d25/dd/f43 [0,4194304] 0
2026-03-10T06:23:03.717 INFO:tasks.workunit.client.0.vm04.stdout:9/615: read d2/d3/d18/d39/d46/fc4 [126394,26626] 0
2026-03-10T06:23:03.723 INFO:tasks.workunit.client.0.vm04.stdout:2/546: dread d1/db/d20/d8f/f53 [0,4194304] 0
2026-03-10T06:23:03.724 INFO:tasks.workunit.client.0.vm04.stdout:2/547: stat d1/df/d2c 0
2026-03-10T06:23:03.736 INFO:tasks.workunit.client.0.vm04.stdout:5/543: rename d4/d11/d7d/d38/d51 to d4/d11/d7d/d38/d91/d4c/d98/dc0 0
2026-03-10T06:23:03.736 INFO:tasks.workunit.client.0.vm04.stdout:5/544: write d4/f21 [863059,85442] 0
2026-03-10T06:23:03.754 INFO:tasks.workunit.client.0.vm04.stdout:7/570: write d4/df/dd8/f41 [18426,50662] 0
2026-03-10T06:23:03.764 INFO:tasks.workunit.client.0.vm04.stdout:8/600: dwrite df/d20/d25/f2a [0,4194304] 0
2026-03-10T06:23:03.768 INFO:tasks.workunit.client.0.vm04.stdout:4/598: creat d2/d16/d31/fbf x:0 0 0
2026-03-10T06:23:03.769 INFO:tasks.workunit.client.0.vm04.stdout:8/601: readlink df/d20/d25/d30/d70/lb9 0
2026-03-10T06:23:03.774 INFO:tasks.workunit.client.0.vm04.stdout:8/602: chown df/d15/d29/f3e 16471 1
2026-03-10T06:23:03.775 INFO:tasks.workunit.client.0.vm04.stdout:8/603: fdatasync df/d15/d29/f3e 0
2026-03-10T06:23:03.781 INFO:tasks.workunit.client.0.vm04.stdout:0/613: creat d0/d1a/d20/d38/d31/d79/fc7 x:0 0 0
2026-03-10T06:23:03.781 INFO:tasks.workunit.client.0.vm04.stdout:8/604: read df/d15/d2b/f33 [286712,58150] 0
2026-03-10T06:23:03.813 INFO:tasks.workunit.client.0.vm04.stdout:3/558: mkdir d4/dba 0
2026-03-10T06:23:03.845 INFO:tasks.workunit.client.0.vm04.stdout:6/600: link d2/d43/c84 d2/d3a/d5e/db5/cc7 0
2026-03-10T06:23:03.845 INFO:tasks.workunit.client.0.vm04.stdout:6/601: readlink d2/d43/d2d/d30/l33 0
2026-03-10T06:23:03.848 INFO:tasks.workunit.client.0.vm04.stdout:1/584: dwrite d0/d3/f19 [0,4194304] 0
2026-03-10T06:23:03.856 INFO:tasks.workunit.client.0.vm04.stdout:1/585: fsync d0/f59 0
2026-03-10T06:23:03.856 INFO:tasks.workunit.client.0.vm04.stdout:1/586: write d0/d3/f20 [4509527,100631] 0
2026-03-10T06:23:03.869 INFO:tasks.workunit.client.0.vm04.stdout:8/605: chown df/f12 5 1
2026-03-10T06:23:03.872 INFO:tasks.workunit.client.0.vm04.stdout:0/614: creat d0/d5/d97/dc0/fc8 x:0 0 0
2026-03-10T06:23:03.876 INFO:tasks.workunit.client.0.vm04.stdout:7/571: mkdir d4/df/dd8/d9c/db1/dde/ddf 0
2026-03-10T06:23:03.877 INFO:tasks.workunit.client.0.vm04.stdout:3/559: rmdir d4/d6/d99/d7b/d21/d32/d4e/d8f 39
2026-03-10T06:23:03.877 INFO:tasks.workunit.client.0.vm04.stdout:4/599: mknod d2/d16/d31/d3f/da1/cc0 0
2026-03-10T06:23:03.882 INFO:tasks.workunit.client.0.vm04.stdout:6/602: read - d2/d43/d2d/d30/f7f zero size
2026-03-10T06:23:03.889 INFO:tasks.workunit.client.0.vm04.stdout:0/615: creat d0/d5/d25/dd/d92/fc9 x:0 0 0
2026-03-10T06:23:03.889 INFO:tasks.workunit.client.0.vm04.stdout:9/616: creat d2/d3/fed x:0 0 0
2026-03-10T06:23:03.889 INFO:tasks.workunit.client.0.vm04.stdout:7/572: unlink d4/df/d12/d21/f94 0
2026-03-10T06:23:03.891 INFO:tasks.workunit.client.0.vm04.stdout:3/560: dwrite f0 [0,4194304] 0
2026-03-10T06:23:03.892 INFO:tasks.workunit.client.0.vm04.stdout:4/600: symlink d2/d16/d31/d3f/da1/lc1 0
2026-03-10T06:23:03.899 INFO:tasks.workunit.client.0.vm04.stdout:6/603: mknod d2/d43/d2d/d30/d1f/cc8 0
2026-03-10T06:23:03.899 INFO:tasks.workunit.client.0.vm04.stdout:5/545: rename d4/d6/d80/c8d to d4/d11/cc1 0
2026-03-10T06:23:03.899 INFO:tasks.workunit.client.0.vm04.stdout:2/548: truncate d1/df/d11/d14/f1d 5483840 0
2026-03-10T06:23:03.900 INFO:tasks.workunit.client.0.vm04.stdout:6/604: chown d2/d3a/d5e/fa4 16 1
2026-03-10T06:23:03.908 INFO:tasks.workunit.client.0.vm04.stdout:0/616: creat d0/d5/d25/dd/d5c/d73/d82/fca x:0 0 0
2026-03-10T06:23:03.909 INFO:tasks.workunit.client.0.vm04.stdout:4/601: symlink d2/d16/d31/d42/lc2 0
2026-03-10T06:23:03.913 INFO:tasks.workunit.client.0.vm04.stdout:1/587: dwrite d0/f8f [0,4194304] 0
2026-03-10T06:23:03.915 INFO:tasks.workunit.client.0.vm04.stdout:8/606: rename df/d20/d25/d30/l47 to df/d20/d25/d30/d55/lba 0
2026-03-10T06:23:03.915 INFO:tasks.workunit.client.0.vm04.stdout:1/588: readlink d0/d3/d80/l85 0
2026-03-10T06:23:03.919 INFO:tasks.workunit.client.0.vm04.stdout:5/546: read d4/d6/d50/f59 [450357,54789] 0
2026-03-10T06:23:03.927 INFO:tasks.workunit.client.0.vm04.stdout:1/589: write d0/d8/d46/d7a/d95/fa7 [740312,37223] 0
2026-03-10T06:23:03.927 INFO:tasks.workunit.client.0.vm04.stdout:5/547: chown d4/d6/f93 481868 1
2026-03-10T06:23:03.927 INFO:tasks.workunit.client.0.vm04.stdout:2/549: dread d1/db/d20/d8f/f25 [0,4194304] 0
2026-03-10T06:23:03.927 INFO:tasks.workunit.client.0.vm04.stdout:3/561: creat d4/da/df/d11/d5a/db3/fbb x:0 0 0
2026-03-10T06:23:03.927 INFO:tasks.workunit.client.0.vm04.stdout:3/562: chown d4/d6/d99/d7b/d21/d32/d4e/l84 0 1
2026-03-10T06:23:03.930 INFO:tasks.workunit.client.0.vm04.stdout:6/605: symlink d2/d8/d78/lc9 0
2026-03-10T06:23:03.931 INFO:tasks.workunit.client.0.vm04.stdout:0/617: fsync d0/d1a/d20/f8c 0
2026-03-10T06:23:03.933 INFO:tasks.workunit.client.0.vm04.stdout:4/602: rmdir d2/d16/d31/d3f 39
2026-03-10T06:23:03.933 INFO:tasks.workunit.client.0.vm04.stdout:6/606: write d2/d43/d2d/d30/f39 [2826006,89114] 0
2026-03-10T06:23:03.934 INFO:tasks.workunit.client.0.vm04.stdout:4/603: fsync d2/fb5 0
2026-03-10T06:23:03.936 INFO:tasks.workunit.client.0.vm04.stdout:6/607: stat d2/d3a/l72 0
2026-03-10T06:23:03.938 INFO:tasks.workunit.client.0.vm04.stdout:8/607: symlink df/d15/d29/da3/db8/lbb 0
2026-03-10T06:23:03.952 INFO:tasks.workunit.client.0.vm04.stdout:3/563: dread d4/d6/d99/d7b/d21/d32/d39/d64/f75 [0,4194304] 0
2026-03-10T06:23:03.956 INFO:tasks.workunit.client.0.vm04.stdout:3/564: fsync d4/d6/d91/fad 0
2026-03-10T06:23:03.962 INFO:tasks.workunit.client.0.vm04.stdout:9/617: write d2/d8/d53/d6e/d89/f95 [94055,115994] 0
2026-03-10T06:23:03.963 INFO:tasks.workunit.client.0.vm04.stdout:1/590: write d0/d3/d41/d4b/d5b/f6f [5175386,1074] 0
2026-03-10T06:23:03.966 INFO:tasks.workunit.client.0.vm04.stdout:2/550: dwrite d1/db/d20/d8f/f25 [4194304,4194304] 0
2026-03-10T06:23:03.973 INFO:tasks.workunit.client.0.vm04.stdout:1/591: dwrite d0/d8/d46/d7a/d95/fa7 [0,4194304] 0
2026-03-10T06:23:03.988 INFO:tasks.workunit.client.0.vm04.stdout:5/548: mkdir d4/d6/dc2 0
2026-03-10T06:23:03.991 INFO:tasks.workunit.client.0.vm04.stdout:4/604: symlink d2/d32/d5c/d98/lc3 0
2026-03-10T06:23:03.992 INFO:tasks.workunit.client.0.vm04.stdout:4/605: fsync d2/d8/f89 0
2026-03-10T06:23:03.997 INFO:tasks.workunit.client.0.vm04.stdout:7/573: rename d4/df/d12/d13/db3/lc9 to d4/df/d12/d13/d25/le0 0
2026-03-10T06:23:03.997 INFO:tasks.workunit.client.0.vm04.stdout:6/608: mknod d2/d43/d2d/d30/d1f/cca 0
2026-03-10T06:23:03.998 INFO:tasks.workunit.client.0.vm04.stdout:7/574: fdatasync d4/df/d12/d13/d25/d28/fb7 0
2026-03-10T06:23:04.002 INFO:tasks.workunit.client.0.vm04.stdout:9/618: fsync d2/d3/d18/d39/d11/f35 0
2026-03-10T06:23:04.003 INFO:tasks.workunit.client.0.vm04.stdout:8/608: mknod df/d20/d25/d30/d70/dac/cbc 0
2026-03-10T06:23:04.003 INFO:tasks.workunit.client.0.vm04.stdout:8/609: chown df/c13 0 1
2026-03-10T06:23:04.005 INFO:tasks.workunit.client.0.vm04.stdout:5/549: creat d4/d6/d81/fc3 x:0 0 0
2026-03-10T06:23:04.006 INFO:tasks.workunit.client.0.vm04.stdout:5/550: readlink d4/d11/d7d/d52/l58 0
2026-03-10T06:23:04.006 INFO:tasks.workunit.client.0.vm04.stdout:4/606: write d2/d16/d56/fa7 [662761,43133] 0 2026-03-10T06:23:04.020 INFO:tasks.workunit.client.0.vm04.stdout:9/619: creat d2/d3/d18/de9/d5a/fee x:0 0 0 2026-03-10T06:23:04.024 INFO:tasks.workunit.client.0.vm04.stdout:8/610: fdatasync df/d15/d29/f3a 0 2026-03-10T06:23:04.029 INFO:tasks.workunit.client.0.vm04.stdout:2/551: creat d1/fa5 x:0 0 0 2026-03-10T06:23:04.030 INFO:tasks.workunit.client.0.vm04.stdout:5/551: mknod d4/d11/cc4 0 2026-03-10T06:23:04.031 INFO:tasks.workunit.client.0.vm04.stdout:3/565: creat d4/da/df/d11/d50/fbc x:0 0 0 2026-03-10T06:23:04.031 INFO:tasks.workunit.client.0.vm04.stdout:8/611: creat df/d20/d25/d73/fbd x:0 0 0 2026-03-10T06:23:04.036 INFO:tasks.workunit.client.0.vm04.stdout:8/612: dwrite df/d15/d29/f6f [0,4194304] 0 2026-03-10T06:23:04.037 INFO:tasks.workunit.client.0.vm04.stdout:0/618: rename d0/d1a/f27 to d0/d1a/d20/fcb 0 2026-03-10T06:23:04.044 INFO:tasks.workunit.client.0.vm04.stdout:9/620: mkdir d2/d8/d22/def 0 2026-03-10T06:23:04.044 INFO:tasks.workunit.client.0.vm04.stdout:4/607: sync 2026-03-10T06:23:04.047 INFO:tasks.workunit.client.0.vm04.stdout:5/552: read d4/d11/d7d/d38/d91/d55/f7a [269963,72925] 0 2026-03-10T06:23:04.048 INFO:tasks.workunit.client.0.vm04.stdout:4/608: write d2/d32/d5c/f6a [585283,45479] 0 2026-03-10T06:23:04.051 INFO:tasks.workunit.client.0.vm04.stdout:4/609: dread - d2/d32/d5c/f97 zero size 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:8/613: rmdir df/d15/d29/da3 39 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:0/619: truncate d0/d5/f4e 1852382 0 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:9/621: unlink d2/d8/fb8 0 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:3/566: mkdir d4/d6/d99/d7b/dbd 0 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:3/567: write d4/d6/d54/fa8 [961726,38887] 0 2026-03-10T06:23:04.062 
INFO:tasks.workunit.client.0.vm04.stdout:1/592: rename d0/d3/d41/f8a to d0/d8/d46/fd8 0 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:1/593: write d0/d3/d41/fa3 [4710281,90384] 0 2026-03-10T06:23:04.062 INFO:tasks.workunit.client.0.vm04.stdout:2/552: link d1/db/f27 d1/df/d11/d14/d4e/fa6 0 2026-03-10T06:23:04.066 INFO:tasks.workunit.client.0.vm04.stdout:0/620: truncate d0/d1a/d20/fb9 681567 0 2026-03-10T06:23:04.066 INFO:tasks.workunit.client.0.vm04.stdout:0/621: write d0/d5/d25/dd/d5c/fb5 [1679883,117523] 0 2026-03-10T06:23:04.066 INFO:tasks.workunit.client.0.vm04.stdout:2/553: readlink d1/db/d20/d8f/d48/l50 0 2026-03-10T06:23:04.070 INFO:tasks.workunit.client.0.vm04.stdout:2/554: stat d1/df/d11/d14/d6a 0 2026-03-10T06:23:04.074 INFO:tasks.workunit.client.0.vm04.stdout:5/553: mknod d4/d11/d7d/d38/d91/d55/d72/cc5 0 2026-03-10T06:23:04.079 INFO:tasks.workunit.client.0.vm04.stdout:6/609: write d2/d43/d2d/d30/f60 [2886882,41924] 0 2026-03-10T06:23:04.079 INFO:tasks.workunit.client.0.vm04.stdout:8/614: mkdir df/d15/d2b/d81/d9a/dbe 0 2026-03-10T06:23:04.079 INFO:tasks.workunit.client.0.vm04.stdout:7/575: dwrite d4/df/d12/d34/d63/f78 [0,4194304] 0 2026-03-10T06:23:04.079 INFO:tasks.workunit.client.0.vm04.stdout:6/610: truncate d2/d8/fc3 186914 0 2026-03-10T06:23:04.088 INFO:tasks.workunit.client.0.vm04.stdout:7/576: dread d4/df/d12/d21/fa4 [0,4194304] 0 2026-03-10T06:23:04.096 INFO:tasks.workunit.client.0.vm04.stdout:1/594: rename d0/d3/f19 to d0/d3/d41/d4b/d5b/fd9 0 2026-03-10T06:23:04.097 INFO:tasks.workunit.client.0.vm04.stdout:4/610: mknod d2/d32/d5c/d4f/cc4 0 2026-03-10T06:23:04.104 INFO:tasks.workunit.client.0.vm04.stdout:9/622: rmdir d2/d8/d22/daa 39 2026-03-10T06:23:04.111 INFO:tasks.workunit.client.0.vm04.stdout:4/611: dwrite d2/d46/f61 [0,4194304] 0 2026-03-10T06:23:04.113 INFO:tasks.workunit.client.0.vm04.stdout:4/612: readlink d2/d16/d31/d42/l88 0 2026-03-10T06:23:04.139 INFO:tasks.workunit.client.0.vm04.stdout:0/622: write 
d0/d5/d25/dd/d3a/f50 [1315615,128664] 0 2026-03-10T06:23:04.140 INFO:tasks.workunit.client.0.vm04.stdout:2/555: mknod d1/db/d72/d94/ca7 0 2026-03-10T06:23:04.159 INFO:tasks.workunit.client.0.vm04.stdout:0/623: dread d0/d5/d25/dd/f13 [0,4194304] 0 2026-03-10T06:23:04.161 INFO:tasks.workunit.client.0.vm04.stdout:6/611: symlink d2/d3a/d5e/lcb 0 2026-03-10T06:23:04.163 INFO:tasks.workunit.client.0.vm04.stdout:7/577: creat d4/df/d12/dd4/fe1 x:0 0 0 2026-03-10T06:23:04.164 INFO:tasks.workunit.client.0.vm04.stdout:0/624: write d0/d1a/d20/d38/d31/d79/fc7 [69903,94148] 0 2026-03-10T06:23:04.171 INFO:tasks.workunit.client.0.vm04.stdout:0/625: truncate d0/d5/d25/dd/d5c/fb2 402917 0 2026-03-10T06:23:04.175 INFO:tasks.workunit.client.0.vm04.stdout:1/595: creat d0/d3/d41/d99/fda x:0 0 0 2026-03-10T06:23:04.175 INFO:tasks.workunit.client.0.vm04.stdout:4/613: creat d2/d46/fc5 x:0 0 0 2026-03-10T06:23:04.175 INFO:tasks.workunit.client.0.vm04.stdout:3/568: creat d4/fbe x:0 0 0 2026-03-10T06:23:04.181 INFO:tasks.workunit.client.0.vm04.stdout:5/554: creat d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/fc6 x:0 0 0 2026-03-10T06:23:04.183 INFO:tasks.workunit.client.0.vm04.stdout:0/626: dwrite d0/d5/f70 [0,4194304] 0 2026-03-10T06:23:04.189 INFO:tasks.workunit.client.0.vm04.stdout:0/627: chown d0/d5/d25/dd/d1d/d59/f48 3 1 2026-03-10T06:23:04.199 INFO:tasks.workunit.client.0.vm04.stdout:3/569: sync 2026-03-10T06:23:04.200 INFO:tasks.workunit.client.0.vm04.stdout:3/570: stat d4/d6/dc/f41 0 2026-03-10T06:23:04.202 INFO:tasks.workunit.client.0.vm04.stdout:3/571: write d4/d6/dc/fb7 [966915,34330] 0 2026-03-10T06:23:04.214 INFO:tasks.workunit.client.0.vm04.stdout:4/614: symlink d2/d32/dad/lc6 0 2026-03-10T06:23:04.218 INFO:tasks.workunit.client.0.vm04.stdout:9/623: dwrite d2/d3/d18/d39/d46/fc4 [0,4194304] 0 2026-03-10T06:23:04.218 INFO:tasks.workunit.client.0.vm04.stdout:4/615: sync 2026-03-10T06:23:04.242 INFO:tasks.workunit.client.0.vm04.stdout:7/578: dwrite d4/fa7 [0,4194304] 0 2026-03-10T06:23:04.242 
INFO:tasks.workunit.client.0.vm04.stdout:5/555: rename d4/d11/d7d/c73 to d4/d11/d7d/d38/d91/d4c/d98/dc0/cc7 0 2026-03-10T06:23:04.254 INFO:tasks.workunit.client.0.vm04.stdout:6/612: getdents d2/d43/d2d/d30/d1f/d3c/d85/dbf 0 2026-03-10T06:23:04.257 INFO:tasks.workunit.client.0.vm04.stdout:2/556: write d1/f5 [1536170,13135] 0 2026-03-10T06:23:04.258 INFO:tasks.workunit.client.0.vm04.stdout:4/616: write d2/d16/d31/d3f/d93/f9c [625261,55910] 0 2026-03-10T06:23:04.262 INFO:tasks.workunit.client.0.vm04.stdout:5/556: truncate d4/d11/d7d/f90 1915705 0 2026-03-10T06:23:04.267 INFO:tasks.workunit.client.0.vm04.stdout:6/613: fdatasync d2/d43/d2d/d30/f39 0 2026-03-10T06:23:04.267 INFO:tasks.workunit.client.0.vm04.stdout:8/615: link df/d20/d25/f35 df/d15/d29/da3/fbf 0 2026-03-10T06:23:04.267 INFO:tasks.workunit.client.0.vm04.stdout:9/624: rename d2/d8/d53/d6e/f7d to d2/d8/d22/d4f/ff0 0 2026-03-10T06:23:04.267 INFO:tasks.workunit.client.0.vm04.stdout:2/557: dread - d1/db/d20/d8f/d35/d54/f9a zero size 2026-03-10T06:23:04.268 INFO:tasks.workunit.client.0.vm04.stdout:0/628: unlink d0/d5/d25/dd/d5c/caa 0 2026-03-10T06:23:04.269 INFO:tasks.workunit.client.0.vm04.stdout:0/629: chown d0/d5/d25/dd/d1d/c4a 15386973 1 2026-03-10T06:23:04.274 INFO:tasks.workunit.client.0.vm04.stdout:5/557: dread d4/d11/d7d/d38/d91/d55/f7a [0,4194304] 0 2026-03-10T06:23:04.276 INFO:tasks.workunit.client.0.vm04.stdout:5/558: chown d4/f35 219969 1 2026-03-10T06:23:04.284 INFO:tasks.workunit.client.0.vm04.stdout:9/625: sync 2026-03-10T06:23:04.287 INFO:tasks.workunit.client.0.vm04.stdout:4/617: creat d2/d32/dad/fc7 x:0 0 0 2026-03-10T06:23:04.288 INFO:tasks.workunit.client.0.vm04.stdout:8/616: truncate df/d20/f5e 1453546 0 2026-03-10T06:23:04.291 INFO:tasks.workunit.client.0.vm04.stdout:2/558: mknod d1/db/d20/d8f/d35/ca8 0 2026-03-10T06:23:04.297 INFO:tasks.workunit.client.0.vm04.stdout:0/630: rename d0/d5/d25/dd/d3a/d56/l7f to d0/d5/d25/dd/d1d/d9c/dbf/lcc 0 2026-03-10T06:23:04.298 
INFO:tasks.workunit.client.0.vm04.stdout:1/596: dwrite d0/d8/f21 [0,4194304] 0 2026-03-10T06:23:04.299 INFO:tasks.workunit.client.0.vm04.stdout:1/597: write d0/d3/d41/d4b/f6b [2956850,34144] 0 2026-03-10T06:23:04.305 INFO:tasks.workunit.client.0.vm04.stdout:3/572: truncate d4/da/df/d11/d5a/f8b 1797235 0 2026-03-10T06:23:04.305 INFO:tasks.workunit.client.0.vm04.stdout:0/631: dread d0/d5/d25/dd/d1d/f30 [0,4194304] 0 2026-03-10T06:23:04.318 INFO:tasks.workunit.client.0.vm04.stdout:2/559: symlink d1/df/d2c/d37/d59/la9 0 2026-03-10T06:23:04.324 INFO:tasks.workunit.client.0.vm04.stdout:7/579: rename d4/df/dd8/d9c/fc1 to d4/df/d12/d13/d25/d8f/fe2 0 2026-03-10T06:23:04.331 INFO:tasks.workunit.client.0.vm04.stdout:3/573: fsync d4/d6/d99/f76 0 2026-03-10T06:23:04.337 INFO:tasks.workunit.client.0.vm04.stdout:3/574: chown d4/d6/d99/d7b/d21/d2c/cab 278751263 1 2026-03-10T06:23:04.338 INFO:tasks.workunit.client.0.vm04.stdout:3/575: chown d4/d6/d99/d7b/f1d 30691 1 2026-03-10T06:23:04.344 INFO:tasks.workunit.client.0.vm04.stdout:6/614: dwrite d2/d43/d2d/d30/f2b [0,4194304] 0 2026-03-10T06:23:04.344 INFO:tasks.workunit.client.0.vm04.stdout:4/618: dread d2/d16/d31/d3f/d93/fa2 [0,4194304] 0 2026-03-10T06:23:04.348 INFO:tasks.workunit.client.0.vm04.stdout:8/617: dwrite df/d20/d25/d30/d70/f90 [0,4194304] 0 2026-03-10T06:23:04.350 INFO:tasks.workunit.client.0.vm04.stdout:4/619: chown d2/l2d 37 1 2026-03-10T06:23:04.351 INFO:tasks.workunit.client.0.vm04.stdout:6/615: chown d2/d43/d86/fc4 285116080 1 2026-03-10T06:23:04.356 INFO:tasks.workunit.client.0.vm04.stdout:9/626: unlink d2/d3/l45 0 2026-03-10T06:23:04.357 INFO:tasks.workunit.client.0.vm04.stdout:4/620: truncate d2/d46/fa8 504995 0 2026-03-10T06:23:04.358 INFO:tasks.workunit.client.0.vm04.stdout:6/616: fdatasync d2/d3a/d5e/db5/fbe 0 2026-03-10T06:23:04.359 INFO:tasks.workunit.client.0.vm04.stdout:1/598: dwrite d0/f23 [0,4194304] 0 2026-03-10T06:23:04.360 INFO:tasks.workunit.client.0.vm04.stdout:4/621: readlink d2/d8/l63 0 
2026-03-10T06:23:04.364 INFO:tasks.workunit.client.0.vm04.stdout:0/632: dwrite d0/f1b [4194304,4194304] 0 2026-03-10T06:23:04.365 INFO:tasks.workunit.client.0.vm04.stdout:8/618: dread df/d15/d2b/f4c [0,4194304] 0 2026-03-10T06:23:04.368 INFO:tasks.workunit.client.0.vm04.stdout:0/633: dread - d0/d5/d97/dc0/fc8 zero size 2026-03-10T06:23:04.374 INFO:tasks.workunit.client.0.vm04.stdout:5/559: link d4/d11/l14 d4/d11/d7d/d38/d91/d4c/lc8 0 2026-03-10T06:23:04.375 INFO:tasks.workunit.client.0.vm04.stdout:2/560: creat d1/db/d20/d8f/d35/d54/d5d/faa x:0 0 0 2026-03-10T06:23:04.376 INFO:tasks.workunit.client.0.vm04.stdout:5/560: fdatasync d4/d11/f34 0 2026-03-10T06:23:04.378 INFO:tasks.workunit.client.0.vm04.stdout:5/561: dread - d4/d11/d7d/d38/d91/d4c/d98/faf zero size 2026-03-10T06:23:04.393 INFO:tasks.workunit.client.0.vm04.stdout:2/561: dwrite d1/db/d72/d94/f97 [0,4194304] 0 2026-03-10T06:23:04.399 INFO:tasks.workunit.client.0.vm04.stdout:3/576: rename d4/d6/d99/d7b/d21/f83 to d4/dba/fbf 0 2026-03-10T06:23:04.400 INFO:tasks.workunit.client.0.vm04.stdout:3/577: dread - d4/d6/d99/d7b/d21/d2c/fac zero size 2026-03-10T06:23:04.408 INFO:tasks.workunit.client.0.vm04.stdout:9/627: creat d2/d8/d22/d4f/ff1 x:0 0 0 2026-03-10T06:23:04.411 INFO:tasks.workunit.client.0.vm04.stdout:4/622: dread - d2/d16/d56/f7f zero size 2026-03-10T06:23:04.414 INFO:tasks.workunit.client.0.vm04.stdout:5/562: mknod d4/d11/d7d/d38/d91/d55/cc9 0 2026-03-10T06:23:04.416 INFO:tasks.workunit.client.0.vm04.stdout:8/619: sync 2026-03-10T06:23:04.416 INFO:tasks.workunit.client.0.vm04.stdout:5/563: write d4/d11/d7d/f90 [1833114,84936] 0 2026-03-10T06:23:04.419 INFO:tasks.workunit.client.0.vm04.stdout:2/562: creat d1/df/d11/d14/d9f/fab x:0 0 0 2026-03-10T06:23:04.423 INFO:tasks.workunit.client.0.vm04.stdout:8/620: dwrite fe [0,4194304] 0 2026-03-10T06:23:04.424 INFO:tasks.workunit.client.0.vm04.stdout:6/617: rename d2/d43/d2d/d30/d1f/d3c/f27 to d2/d3a/d5e/d9a/fcc 0 2026-03-10T06:23:04.430 
INFO:tasks.workunit.client.0.vm04.stdout:6/618: readlink d2/l2a 0 2026-03-10T06:23:04.432 INFO:tasks.workunit.client.0.vm04.stdout:5/564: dwrite d4/d11/d7d/f36 [4194304,4194304] 0 2026-03-10T06:23:04.437 INFO:tasks.workunit.client.0.vm04.stdout:4/623: dwrite d2/d46/f26 [4194304,4194304] 0 2026-03-10T06:23:04.444 INFO:tasks.workunit.client.0.vm04.stdout:9/628: mknod d2/d3/d18/d34/cf2 0 2026-03-10T06:23:04.445 INFO:tasks.workunit.client.0.vm04.stdout:9/629: chown d2/d3/d18/d39/d46/fc4 12231 1 2026-03-10T06:23:04.446 INFO:tasks.workunit.client.0.vm04.stdout:4/624: dwrite d2/d16/d56/fa7 [0,4194304] 0 2026-03-10T06:23:04.455 INFO:tasks.workunit.client.0.vm04.stdout:2/563: creat d1/db/d20/d8f/d48/d67/fac x:0 0 0 2026-03-10T06:23:04.455 INFO:tasks.workunit.client.0.vm04.stdout:2/564: readlink d1/db/d72/l82 0 2026-03-10T06:23:04.455 INFO:tasks.workunit.client.0.vm04.stdout:8/621: creat df/d20/d25/d73/fc0 x:0 0 0 2026-03-10T06:23:04.460 INFO:tasks.workunit.client.0.vm04.stdout:6/619: rmdir d2/d3a/d5e/db5 39 2026-03-10T06:23:04.464 INFO:tasks.workunit.client.0.vm04.stdout:9/630: creat d2/d8/d53/d6e/d89/ff3 x:0 0 0 2026-03-10T06:23:04.464 INFO:tasks.workunit.client.0.vm04.stdout:9/631: truncate d2/d8/d3a/dcb/fe6 401324 0 2026-03-10T06:23:04.465 INFO:tasks.workunit.client.0.vm04.stdout:9/632: chown d2/d8/d22/fd1 42 1 2026-03-10T06:23:04.466 INFO:tasks.workunit.client.0.vm04.stdout:9/633: write d2/d3/d18/de9/da2/fc5 [1804772,100631] 0 2026-03-10T06:23:04.470 INFO:tasks.workunit.client.0.vm04.stdout:4/625: mkdir d2/d16/d31/d3f/dc8 0 2026-03-10T06:23:04.477 INFO:tasks.workunit.client.0.vm04.stdout:0/634: rename d0/d5/d25/dd/d1d/d9c/lac to d0/d5/d97/dc0/lcd 0 2026-03-10T06:23:04.478 INFO:tasks.workunit.client.0.vm04.stdout:0/635: fdatasync d0/d5/d25/f6f 0 2026-03-10T06:23:04.479 INFO:tasks.workunit.client.0.vm04.stdout:5/565: symlink d4/d6/lca 0 2026-03-10T06:23:04.479 INFO:tasks.workunit.client.0.vm04.stdout:0/636: stat d0/d1a/cc1 0 2026-03-10T06:23:04.481 
INFO:tasks.workunit.client.0.vm04.stdout:7/580: write d4/df/f8a [728935,20250] 0 2026-03-10T06:23:04.481 INFO:tasks.workunit.client.0.vm04.stdout:6/620: truncate d2/d43/d2d/d30/f7f 1019582 0 2026-03-10T06:23:04.482 INFO:tasks.workunit.client.0.vm04.stdout:6/621: fdatasync d2/d37/d6e/f82 0 2026-03-10T06:23:04.486 INFO:tasks.workunit.client.0.vm04.stdout:4/626: mknod d2/d32/d5c/d98/cc9 0 2026-03-10T06:23:04.487 INFO:tasks.workunit.client.0.vm04.stdout:3/578: rename d4/d6/d99/d7b/d21/c6e to d4/d6/d99/d7b/d21/d32/d8e/cc0 0 2026-03-10T06:23:04.488 INFO:tasks.workunit.client.0.vm04.stdout:5/566: rmdir d4/d11/d7d/d38/d91/d4c/d98 39 2026-03-10T06:23:04.489 INFO:tasks.workunit.client.0.vm04.stdout:0/637: creat d0/d5/d25/dd/d3a/fce x:0 0 0 2026-03-10T06:23:04.503 INFO:tasks.workunit.client.0.vm04.stdout:5/567: dread - d4/d11/d7d/d38/d91/f5e zero size 2026-03-10T06:23:04.516 INFO:tasks.workunit.client.0.vm04.stdout:5/568: symlink d4/d11/d7d/d38/d91/d55/lcb 0 2026-03-10T06:23:04.516 INFO:tasks.workunit.client.0.vm04.stdout:0/638: link d0/d1a/d20/d38/d31/d47/d8a/d8d/fad d0/d5/d25/dd/d1d/fcf 0 2026-03-10T06:23:04.516 INFO:tasks.workunit.client.0.vm04.stdout:3/579: creat d4/da/df/fc1 x:0 0 0 2026-03-10T06:23:04.517 INFO:tasks.workunit.client.0.vm04.stdout:9/634: getdents d2/d3/d18/de9/d5a 0 2026-03-10T06:23:04.518 INFO:tasks.workunit.client.0.vm04.stdout:3/580: write d4/f42 [3556590,38530] 0 2026-03-10T06:23:04.521 INFO:tasks.workunit.client.0.vm04.stdout:9/635: write d2/fb4 [1885621,115294] 0 2026-03-10T06:23:04.524 INFO:tasks.workunit.client.0.vm04.stdout:5/569: creat d4/d6/d37/fcc x:0 0 0 2026-03-10T06:23:04.525 INFO:tasks.workunit.client.0.vm04.stdout:9/636: fsync d2/d3/d18/de9/de7/fec 0 2026-03-10T06:23:04.532 INFO:tasks.workunit.client.0.vm04.stdout:7/581: dread d4/df/d12/d13/d25/d28/d3a/d58/f77 [0,4194304] 0 2026-03-10T06:23:04.536 INFO:tasks.workunit.client.0.vm04.stdout:3/581: creat d4/d6/d99/d7b/d21/d32/d4e/d8f/fc2 x:0 0 0 2026-03-10T06:23:04.536 
INFO:tasks.workunit.client.0.vm04.stdout:9/637: truncate d2/d8/d22/daa/f7c 4804078 0 2026-03-10T06:23:04.537 INFO:tasks.workunit.client.0.vm04.stdout:0/639: dread d0/f16 [0,4194304] 0 2026-03-10T06:23:04.537 INFO:tasks.workunit.client.0.vm04.stdout:0/640: read - d0/d5/d25/dd/fc6 zero size 2026-03-10T06:23:04.538 INFO:tasks.workunit.client.0.vm04.stdout:0/641: fdatasync d0/d5/d25/dd/d5c/d73/fa5 0 2026-03-10T06:23:04.540 INFO:tasks.workunit.client.0.vm04.stdout:3/582: dread d4/d6/d99/d7b/f4b [0,4194304] 0 2026-03-10T06:23:04.551 INFO:tasks.workunit.client.0.vm04.stdout:0/642: truncate d0/d1a/f66 426988 0 2026-03-10T06:23:04.551 INFO:tasks.workunit.client.0.vm04.stdout:5/570: getdents d4/d11/d7d/dab 0 2026-03-10T06:23:04.551 INFO:tasks.workunit.client.0.vm04.stdout:3/583: mknod d4/d6/d99/d7b/d89/cc3 0 2026-03-10T06:23:04.557 INFO:tasks.workunit.client.0.vm04.stdout:1/599: truncate d0/f29 2886515 0 2026-03-10T06:23:04.559 INFO:tasks.workunit.client.0.vm04.stdout:9/638: mkdir d2/d3/df4 0 2026-03-10T06:23:04.568 INFO:tasks.workunit.client.0.vm04.stdout:0/643: rmdir d0/d5/d25/dd/d1d/d59 39 2026-03-10T06:23:04.568 INFO:tasks.workunit.client.0.vm04.stdout:0/644: stat d0/f75 0 2026-03-10T06:23:04.571 INFO:tasks.workunit.client.0.vm04.stdout:9/639: mkdir d2/d3/d18/d39/d11/da5/df5 0 2026-03-10T06:23:04.573 INFO:tasks.workunit.client.0.vm04.stdout:0/645: creat d0/d1a/d20/d38/d31/d47/d8a/d8d/fd0 x:0 0 0 2026-03-10T06:23:04.573 INFO:tasks.workunit.client.0.vm04.stdout:3/584: creat d4/d6/d91/da1/fc4 x:0 0 0 2026-03-10T06:23:04.574 INFO:tasks.workunit.client.0.vm04.stdout:0/646: readlink d0/d5/d25/dd/d1d/l37 0 2026-03-10T06:23:04.574 INFO:tasks.workunit.client.0.vm04.stdout:3/585: readlink d4/da/df/d11/d50/l65 0 2026-03-10T06:23:04.575 INFO:tasks.workunit.client.0.vm04.stdout:1/600: sync 2026-03-10T06:23:04.576 INFO:tasks.workunit.client.0.vm04.stdout:0/647: readlink d0/d1a/d20/d38/d31/d47/l64 0 2026-03-10T06:23:04.579 INFO:tasks.workunit.client.0.vm04.stdout:4/627: rmdir d2/d16 39 
2026-03-10T06:23:04.579 INFO:tasks.workunit.client.0.vm04.stdout:3/586: write d4/d6/d99/d7b/d21/d2c/fac [458156,62154] 0 2026-03-10T06:23:04.579 INFO:tasks.workunit.client.0.vm04.stdout:4/628: dread - d2/d32/d5c/f97 zero size 2026-03-10T06:23:04.580 INFO:tasks.workunit.client.0.vm04.stdout:9/640: dwrite d2/d8/d53/d6e/d89/f9f [0,4194304] 0 2026-03-10T06:23:04.584 INFO:tasks.workunit.client.0.vm04.stdout:2/565: write d1/df/d2c/d37/d59/f8b [914374,127076] 0 2026-03-10T06:23:04.586 INFO:tasks.workunit.client.0.vm04.stdout:0/648: chown d0/l9b 0 1 2026-03-10T06:23:04.596 INFO:tasks.workunit.client.0.vm04.stdout:8/622: dwrite df/d20/d25/d30/f79 [0,4194304] 0 2026-03-10T06:23:04.600 INFO:tasks.workunit.client.0.vm04.stdout:3/587: readlink d4/d6/d99/d7b/l60 0 2026-03-10T06:23:04.600 INFO:tasks.workunit.client.0.vm04.stdout:2/566: dwrite d1/db/d20/d8f/d35/d54/d5d/f65 [0,4194304] 0 2026-03-10T06:23:04.600 INFO:tasks.workunit.client.0.vm04.stdout:0/649: mkdir d0/dd1 0 2026-03-10T06:23:04.607 INFO:tasks.workunit.client.0.vm04.stdout:8/623: unlink df/c14 0 2026-03-10T06:23:04.608 INFO:tasks.workunit.client.0.vm04.stdout:8/624: chown df/d20/d25/f2a 21 1 2026-03-10T06:23:04.620 INFO:tasks.workunit.client.0.vm04.stdout:9/641: mknod d2/d8/d22/cf6 0 2026-03-10T06:23:04.621 INFO:tasks.workunit.client.0.vm04.stdout:1/601: link d0/d8/l9 d0/d8/d46/d7a/d95/dc5/ldb 0 2026-03-10T06:23:04.621 INFO:tasks.workunit.client.0.vm04.stdout:0/650: unlink d0/d5/cbb 0 2026-03-10T06:23:04.624 INFO:tasks.workunit.client.0.vm04.stdout:2/567: dread d1/df/d11/d14/d4e/f60 [0,4194304] 0 2026-03-10T06:23:04.628 INFO:tasks.workunit.client.0.vm04.stdout:8/625: readlink df/d15/d2b/d8a/lb3 0 2026-03-10T06:23:04.631 INFO:tasks.workunit.client.0.vm04.stdout:4/629: chown d2/d16/d31/d3f/f43 16876637 1 2026-03-10T06:23:04.631 INFO:tasks.workunit.client.0.vm04.stdout:8/626: read df/d20/d25/f44 [3465583,13494] 0 2026-03-10T06:23:04.631 INFO:tasks.workunit.client.0.vm04.stdout:0/651: dread d0/d5/d25/dd/f43 [0,4194304] 0 
2026-03-10T06:23:04.631 INFO:tasks.workunit.client.0.vm04.stdout:4/630: readlink d2/l2d 0 2026-03-10T06:23:04.635 INFO:tasks.workunit.client.0.vm04.stdout:9/642: unlink c1 0 2026-03-10T06:23:04.639 INFO:tasks.workunit.client.0.vm04.stdout:1/602: dwrite d0/d3/d41/d4b/d5b/fd9 [0,4194304] 0 2026-03-10T06:23:04.640 INFO:tasks.workunit.client.0.vm04.stdout:1/603: write d0/f23 [3660157,6739] 0 2026-03-10T06:23:04.659 INFO:tasks.workunit.client.0.vm04.stdout:8/627: rmdir df/d15/d2b/d81/d9a 39 2026-03-10T06:23:04.660 INFO:tasks.workunit.client.0.vm04.stdout:0/652: symlink d0/d5/d25/dd/d3a/ld2 0 2026-03-10T06:23:04.661 INFO:tasks.workunit.client.0.vm04.stdout:9/643: creat d2/d8/d53/d6e/d8d/ff7 x:0 0 0 2026-03-10T06:23:04.662 INFO:tasks.workunit.client.0.vm04.stdout:2/568: symlink d1/db/lad 0 2026-03-10T06:23:04.668 INFO:tasks.workunit.client.0.vm04.stdout:0/653: creat d0/d1a/d20/dc2/fd3 x:0 0 0 2026-03-10T06:23:04.680 INFO:tasks.workunit.client.0.vm04.stdout:6/622: fsync d2/d43/d2d/d30/d34/da8/fb4 0 2026-03-10T06:23:04.683 INFO:tasks.workunit.client.0.vm04.stdout:7/582: write d4/df/d12/f4c [767132,60640] 0 2026-03-10T06:23:04.702 INFO:tasks.workunit.client.0.vm04.stdout:5/571: write d4/f79 [1348338,79327] 0 2026-03-10T06:23:04.707 INFO:tasks.workunit.client.0.vm04.stdout:0/654: fsync d0/d1a/d20/fcb 0 2026-03-10T06:23:04.708 INFO:tasks.workunit.client.0.vm04.stdout:0/655: truncate d0/d5/d25/dd/d1d/fa2 970544 0 2026-03-10T06:23:04.716 INFO:tasks.workunit.client.0.vm04.stdout:7/583: dread - d4/df/d12/d13/d25/d30/d40/f75 zero size 2026-03-10T06:23:04.720 INFO:tasks.workunit.client.0.vm04.stdout:1/604: getdents d0/d8/d46/d7a 0 2026-03-10T06:23:04.725 INFO:tasks.workunit.client.0.vm04.stdout:2/569: rename d1/df to d1/dae 0 2026-03-10T06:23:04.730 INFO:tasks.workunit.client.0.vm04.stdout:3/588: dwrite d4/d6/d99/f76 [0,4194304] 0 2026-03-10T06:23:04.731 INFO:tasks.workunit.client.0.vm04.stdout:9/644: link d2/de0/l3d d2/de5/lf8 0 2026-03-10T06:23:04.747 
INFO:tasks.workunit.client.0.vm04.stdout:6/623: creat d2/d43/d2d/d30/dc0/fcd x:0 0 0 2026-03-10T06:23:04.751 INFO:tasks.workunit.client.0.vm04.stdout:3/589: dread d4/d6/d99/d7b/d21/f9a [0,4194304] 0 2026-03-10T06:23:04.751 INFO:tasks.workunit.client.0.vm04.stdout:8/628: dwrite df/d20/d25/d30/f51 [0,4194304] 0 2026-03-10T06:23:04.762 INFO:tasks.workunit.client.0.vm04.stdout:4/631: write d2/d16/d31/d3f/f8f [1128680,119621] 0 2026-03-10T06:23:04.763 INFO:tasks.workunit.client.0.vm04.stdout:5/572: mknod d4/d11/ccd 0 2026-03-10T06:23:04.767 INFO:tasks.workunit.client.0.vm04.stdout:1/605: chown d0/d3/cd0 65483 1 2026-03-10T06:23:04.769 INFO:tasks.workunit.client.0.vm04.stdout:2/570: truncate d1/db/d20/d8f/d35/d54/d5d/f93 699683 0 2026-03-10T06:23:04.774 INFO:tasks.workunit.client.0.vm04.stdout:0/656: creat d0/d5/d25/dd/d1d/d59/fd4 x:0 0 0 2026-03-10T06:23:04.784 INFO:tasks.workunit.client.0.vm04.stdout:7/584: rename d4/df/d12/d13/d25/l3d to d4/df/dd8/d9c/db1/dc4/le3 0 2026-03-10T06:23:04.809 INFO:tasks.workunit.client.0.vm04.stdout:9/645: write d2/de0/d1d/d64/f91 [74765,76718] 0 2026-03-10T06:23:04.810 INFO:tasks.workunit.client.0.vm04.stdout:9/646: read d2/d3/d18/d34/fc7 [174924,90227] 0 2026-03-10T06:23:04.813 INFO:tasks.workunit.client.0.vm04.stdout:4/632: dread - d2/d8/f54 zero size 2026-03-10T06:23:04.818 INFO:tasks.workunit.client.0.vm04.stdout:1/606: symlink d0/d8/d46/dcf/ldc 0 2026-03-10T06:23:04.823 INFO:tasks.workunit.client.0.vm04.stdout:2/571: mkdir d1/db/d72/daf 0 2026-03-10T06:23:04.823 INFO:tasks.workunit.client.0.vm04.stdout:2/572: readlink d1/dae/d2c/d37/l71 0 2026-03-10T06:23:04.827 INFO:tasks.workunit.client.0.vm04.stdout:0/657: dwrite d0/d5/d25/dd/d3a/f57 [0,4194304] 0 2026-03-10T06:23:04.832 INFO:tasks.workunit.client.0.vm04.stdout:0/658: dwrite d0/d1a/d20/f8c [0,4194304] 0 2026-03-10T06:23:04.841 INFO:tasks.workunit.client.0.vm04.stdout:6/624: fdatasync d2/d3a/d5e/db5/fbe 0 2026-03-10T06:23:04.841 INFO:tasks.workunit.client.0.vm04.stdout:6/625: 
chown d2/d43/d2d/d7c 1588919 1 2026-03-10T06:23:04.849 INFO:tasks.workunit.client.0.vm04.stdout:8/629: rename df/d20/d25/d30/d70 to df/d15/d29/da3/db8/dc1 0 2026-03-10T06:23:04.854 INFO:tasks.workunit.client.0.vm04.stdout:7/585: mknod d4/df/d12/d13/d25/d28/dae/ce4 0 2026-03-10T06:23:04.855 INFO:tasks.workunit.client.0.vm04.stdout:7/586: chown d4/df/d12/d13/d25/f95 2515 1 2026-03-10T06:23:04.855 INFO:tasks.workunit.client.0.vm04.stdout:7/587: chown d4/df/d12/d13/d25/d30/cb4 859 1 2026-03-10T06:23:04.857 INFO:tasks.workunit.client.0.vm04.stdout:3/590: unlink d4/d6/d99/c61 0 2026-03-10T06:23:04.860 INFO:tasks.workunit.client.0.vm04.stdout:3/591: dwrite d4/d6/d38/fb8 [0,4194304] 0 2026-03-10T06:23:04.865 INFO:tasks.workunit.client.0.vm04.stdout:9/647: creat d2/d8/d22/daa/ff9 x:0 0 0 2026-03-10T06:23:04.874 INFO:tasks.workunit.client.0.vm04.stdout:5/573: getdents d4/d11/d7d/d38/d91/d4c/d9d 0 2026-03-10T06:23:04.880 INFO:tasks.workunit.client.0.vm04.stdout:0/659: symlink d0/d5/d25/dd/d3a/d56/ld5 0 2026-03-10T06:23:04.880 INFO:tasks.workunit.client.0.vm04.stdout:8/630: fsync df/d15/f5d 0 2026-03-10T06:23:04.880 INFO:tasks.workunit.client.0.vm04.stdout:0/660: chown d0/d5/d25/c10 225 1 2026-03-10T06:23:04.880 INFO:tasks.workunit.client.0.vm04.stdout:0/661: write d0/d5/d25/dd/d1d/fa2 [1485544,118081] 0 2026-03-10T06:23:04.881 INFO:tasks.workunit.client.0.vm04.stdout:2/573: dread d1/db/d20/d8f/d35/d54/d5d/f93 [0,4194304] 0 2026-03-10T06:23:04.883 INFO:tasks.workunit.client.0.vm04.stdout:3/592: rename d4/d6/d92/l9b to d4/d6/d91/lc5 0 2026-03-10T06:23:04.884 INFO:tasks.workunit.client.0.vm04.stdout:7/588: fdatasync d4/df/d12/f5f 0 2026-03-10T06:23:04.884 INFO:tasks.workunit.client.0.vm04.stdout:6/626: dwrite d2/d8/d78/f79 [4194304,4194304] 0 2026-03-10T06:23:04.889 INFO:tasks.workunit.client.0.vm04.stdout:7/589: write d4/df/dd8/d9c/db1/fbe [629388,62614] 0 2026-03-10T06:23:04.893 INFO:tasks.workunit.client.0.vm04.stdout:9/648: rmdir d2/d23/d94 39 2026-03-10T06:23:04.895 
INFO:tasks.workunit.client.0.vm04.stdout:8/631: truncate df/d15/d2b/f56 91502 0 2026-03-10T06:23:04.896 INFO:tasks.workunit.client.0.vm04.stdout:0/662: mknod d0/d5/d97/dc0/cd6 0 2026-03-10T06:23:04.898 INFO:tasks.workunit.client.0.vm04.stdout:3/593: symlink d4/d6/d91/da1/lc6 0 2026-03-10T06:23:04.902 INFO:tasks.workunit.client.0.vm04.stdout:7/590: truncate d4/df/d12/d13/d25/f95 80039 0 2026-03-10T06:23:04.902 INFO:tasks.workunit.client.0.vm04.stdout:9/649: fsync d2/d3/d18/d39/d11/f71 0 2026-03-10T06:23:04.902 INFO:tasks.workunit.client.0.vm04.stdout:9/650: chown d2/c41 13715 1 2026-03-10T06:23:04.904 INFO:tasks.workunit.client.0.vm04.stdout:8/632: dwrite df/d20/d25/d73/f98 [0,4194304] 0 2026-03-10T06:23:04.906 INFO:tasks.workunit.client.0.vm04.stdout:2/574: mkdir d1/db/d72/daf/db0 0 2026-03-10T06:23:04.906 INFO:tasks.workunit.client.0.vm04.stdout:2/575: readlink d1/l46 0 2026-03-10T06:23:04.913 INFO:tasks.workunit.client.0.vm04.stdout:2/576: dwrite d1/db/d20/d8f/d48/d67/f92 [0,4194304] 0 2026-03-10T06:23:04.923 INFO:tasks.workunit.client.0.vm04.stdout:3/594: rename d4/d6/dc/f1f to d4/da/df/d11/d5a/d5b/fc7 0 2026-03-10T06:23:04.940 INFO:tasks.workunit.client.0.vm04.stdout:9/651: rmdir d2/d8/d53 39 2026-03-10T06:23:04.940 INFO:tasks.workunit.client.0.vm04.stdout:6/627: truncate d2/d37/d6e/f82 194176 0 2026-03-10T06:23:04.940 INFO:tasks.workunit.client.0.vm04.stdout:8/633: dread - df/d20/f84 zero size 2026-03-10T06:23:04.944 INFO:tasks.workunit.client.0.vm04.stdout:6/628: dread - d2/d43/d2d/d30/d34/f52 zero size 2026-03-10T06:23:04.951 INFO:tasks.workunit.client.0.vm04.stdout:5/574: write d4/d6/f20 [4250931,63159] 0 2026-03-10T06:23:04.955 INFO:tasks.workunit.client.0.vm04.stdout:6/629: truncate d2/d37/d6e/f77 1480670 0 2026-03-10T06:23:04.962 INFO:tasks.workunit.client.0.vm04.stdout:9/652: creat d2/d3/d18/d34/ffa x:0 0 0 2026-03-10T06:23:04.963 INFO:tasks.workunit.client.0.vm04.stdout:7/591: rmdir d4/df/d12 39 2026-03-10T06:23:04.963 
INFO:tasks.workunit.client.0.vm04.stdout:2/577: rename d1/db/d20/d8f/d35/c6f to d1/dae/d11/cb1 0 2026-03-10T06:23:04.964 INFO:tasks.workunit.client.0.vm04.stdout:1/607: dwrite d0/d8/f69 [0,4194304] 0 2026-03-10T06:23:04.966 INFO:tasks.workunit.client.0.vm04.stdout:2/578: dread - d1/dae/d11/d14/f9c zero size 2026-03-10T06:23:04.966 INFO:tasks.workunit.client.0.vm04.stdout:6/630: symlink d2/d43/d86/lce 0 2026-03-10T06:23:04.968 INFO:tasks.workunit.client.0.vm04.stdout:4/633: dwrite d2/d46/fa8 [0,4194304] 0 2026-03-10T06:23:04.974 INFO:tasks.workunit.client.0.vm04.stdout:6/631: fsync d2/d3a/d9c/fba 0 2026-03-10T06:23:04.975 INFO:tasks.workunit.client.0.vm04.stdout:3/595: getdents d4/d6/d99/d7b/d21/d32 0 2026-03-10T06:23:04.980 INFO:tasks.workunit.client.0.vm04.stdout:2/579: mkdir d1/dae/d2c/d37/d59/db2 0 2026-03-10T06:23:04.988 INFO:tasks.workunit.client.0.vm04.stdout:5/575: dwrite d4/d6/d80/d84/fa7 [0,4194304] 0 2026-03-10T06:23:04.989 INFO:tasks.workunit.client.0.vm04.stdout:7/592: symlink d4/df/d12/d13/d25/d28/d3a/le5 0 2026-03-10T06:23:04.990 INFO:tasks.workunit.client.0.vm04.stdout:0/663: dwrite d0/d1a/d20/d38/d31/d47/d8a/d8d/fad [0,4194304] 0 2026-03-10T06:23:04.990 INFO:tasks.workunit.client.0.vm04.stdout:0/664: stat d0/ca6 0 2026-03-10T06:23:04.991 INFO:tasks.workunit.client.0.vm04.stdout:9/653: dwrite d2/d3/f57 [0,4194304] 0 2026-03-10T06:23:04.992 INFO:tasks.workunit.client.0.vm04.stdout:0/665: stat d0/d5/d25/dd/d3a/d56/ld5 0 2026-03-10T06:23:04.992 INFO:tasks.workunit.client.0.vm04.stdout:7/593: rename d4/df/dd8/f41 to d4/df/d12/d13/d25/d28/d3a/d58/fe6 0 2026-03-10T06:23:04.992 INFO:tasks.workunit.client.0.vm04.stdout:4/634: dread d2/d46/f87 [0,4194304] 0 2026-03-10T06:23:04.998 INFO:tasks.workunit.client.0.vm04.stdout:2/580: mkdir d1/db/d20/d8f/d48/d67/db3 0 2026-03-10T06:23:05.000 INFO:tasks.workunit.client.0.vm04.stdout:4/635: dwrite d2/d8/f89 [0,4194304] 0 2026-03-10T06:23:05.009 INFO:tasks.workunit.client.0.vm04.stdout:8/634: sync 
2026-03-10T06:23:05.018 INFO:tasks.workunit.client.0.vm04.stdout:3/596: mkdir d4/da/df/d11/d50/dc8 0
2026-03-10T06:23:05.022 INFO:tasks.workunit.client.0.vm04.stdout:0/666: rmdir d0/d5/d25/dd/d1d/d59 39
2026-03-10T06:23:05.024 INFO:tasks.workunit.client.0.vm04.stdout:7/594: readlink d4/df/d12/d13/db3/lc2 0
2026-03-10T06:23:05.029 INFO:tasks.workunit.client.0.vm04.stdout:4/636: rename d2/d32/d5c/f97 to d2/d16/d56/fca 0
2026-03-10T06:23:05.031 INFO:tasks.workunit.client.0.vm04.stdout:8/635: symlink df/d20/d25/d30/d65/lc2 0
2026-03-10T06:23:05.033 INFO:tasks.workunit.client.0.vm04.stdout:3/597: creat d4/da/df/d11/d5a/db3/fc9 x:0 0 0
2026-03-10T06:23:05.035 INFO:tasks.workunit.client.0.vm04.stdout:6/632: link d2/d43/d2d/d30/d34/da8/fb4 d2/d43/d2d/fcf 0
2026-03-10T06:23:05.039 INFO:tasks.workunit.client.0.vm04.stdout:0/667: symlink d0/d5/d25/ld7 0
2026-03-10T06:23:05.039 INFO:tasks.workunit.client.0.vm04.stdout:7/595: mknod d4/df/dd8/d9c/ce7 0
2026-03-10T06:23:05.039 INFO:tasks.workunit.client.0.vm04.stdout:7/596: readlink d4/df/d12/d13/l5c 0
2026-03-10T06:23:05.051 INFO:tasks.workunit.client.0.vm04.stdout:5/576: getdents d4/d11/d7d/d38 0
2026-03-10T06:23:05.053 INFO:tasks.workunit.client.0.vm04.stdout:0/668: mkdir d0/d5/d97/dc0/dd8 0
2026-03-10T06:23:05.053 INFO:tasks.workunit.client.0.vm04.stdout:0/669: chown d0/d5/d25/dd/d3a/d81 1 1
2026-03-10T06:23:05.063 INFO:tasks.workunit.client.0.vm04.stdout:9/654: truncate d2/d3/d18/ddd/f5e 834950 0
2026-03-10T06:23:05.068 INFO:tasks.workunit.client.0.vm04.stdout:3/598: symlink d4/d6/d99/d7b/d21/d32/d4e/lca 0
2026-03-10T06:23:05.069 INFO:tasks.workunit.client.0.vm04.stdout:9/655: write f0 [6903720,27004] 0
2026-03-10T06:23:05.069 INFO:tasks.workunit.client.0.vm04.stdout:3/599: read - d4/d6/d99/d7b/d21/d32/d4e/d8f/fc2 zero size
2026-03-10T06:23:05.070 INFO:tasks.workunit.client.0.vm04.stdout:9/656: write d2/d8/d22/d4f/fb2 [1068958,82546] 0
2026-03-10T06:23:05.078 INFO:tasks.workunit.client.0.vm04.stdout:0/670: chown d0/d5/d25/dd/d1d/d59/fd4 742 1
2026-03-10T06:23:05.087 INFO:tasks.workunit.client.0.vm04.stdout:5/577: sync
2026-03-10T06:23:05.087 INFO:tasks.workunit.client.0.vm04.stdout:3/600: dread d4/d6/d54/fa8 [0,4194304] 0
2026-03-10T06:23:05.088 INFO:tasks.workunit.client.0.vm04.stdout:8/636: dread df/d20/f42 [0,4194304] 0
2026-03-10T06:23:05.089 INFO:tasks.workunit.client.0.vm04.stdout:9/657: dwrite d2/d8/f4a [4194304,4194304] 0
2026-03-10T06:23:05.105 INFO:tasks.workunit.client.0.vm04.stdout:1/608: truncate d0/d8/f21 3249358 0
2026-03-10T06:23:05.110 INFO:tasks.workunit.client.0.vm04.stdout:4/637: getdents d2/d16/d2c/d6b 0
2026-03-10T06:23:05.110 INFO:tasks.workunit.client.0.vm04.stdout:2/581: dwrite d1/d76/f8e [0,4194304] 0
2026-03-10T06:23:05.120 INFO:tasks.workunit.client.0.vm04.stdout:5/578: creat d4/d11/d7d/d38/d91/d55/fce x:0 0 0
2026-03-10T06:23:05.129 INFO:tasks.workunit.client.0.vm04.stdout:6/633: dwrite d2/d43/f24 [0,4194304] 0
2026-03-10T06:23:05.134 INFO:tasks.workunit.client.0.vm04.stdout:7/597: truncate d4/df/d12/d13/d25/d30/d40/f52 2763017 0
2026-03-10T06:23:05.134 INFO:tasks.workunit.client.0.vm04.stdout:1/609: mknod d0/d8/d46/db3/cdd 0
2026-03-10T06:23:05.135 INFO:tasks.workunit.client.0.vm04.stdout:4/638: creat d2/d46/fcb x:0 0 0
2026-03-10T06:23:05.142 INFO:tasks.workunit.client.0.vm04.stdout:4/639: dwrite d2/d32/d5c/f6a [0,4194304] 0
2026-03-10T06:23:05.155 INFO:tasks.workunit.client.0.vm04.stdout:5/579: fdatasync d4/d11/d7d/d38/d91/d55/f68 0
2026-03-10T06:23:05.158 INFO:tasks.workunit.client.0.vm04.stdout:5/580: dwrite d4/d6/d80/d84/fa7 [0,4194304] 0
2026-03-10T06:23:05.164 INFO:tasks.workunit.client.0.vm04.stdout:6/634: dread d2/d37/d6e/f82 [0,4194304] 0
2026-03-10T06:23:05.166 INFO:tasks.workunit.client.0.vm04.stdout:1/610: fsync d0/d3/f3b 0
2026-03-10T06:23:05.173 INFO:tasks.workunit.client.0.vm04.stdout:0/671: rename d0/d5/f70 to d0/d5/d25/dd/d1d/fd9 0
2026-03-10T06:23:05.177 INFO:tasks.workunit.client.0.vm04.stdout:0/672: chown d0/d5/d25/dd/f13 554825 1
2026-03-10T06:23:05.184 INFO:tasks.workunit.client.0.vm04.stdout:6/635: dread d2/d8/d78/f79 [0,4194304] 0
2026-03-10T06:23:05.206 INFO:tasks.workunit.client.0.vm04.stdout:8/637: write df/d15/d29/f3a [4089203,106327] 0
2026-03-10T06:23:05.242 INFO:tasks.workunit.client.0.vm04.stdout:7/598: link d4/df/d12/d13/d25/d28/d3a/d58/fcc d4/df/dd8/d9c/fe8 0
2026-03-10T06:23:05.245 INFO:tasks.workunit.client.0.vm04.stdout:7/599: dread d4/df/d12/d13/d25/d28/d3a/d58/f77 [0,4194304] 0
2026-03-10T06:23:05.252 INFO:tasks.workunit.client.0.vm04.stdout:6/636: creat d2/d43/d2d/d30/d1f/d3c/d85/fd0 x:0 0 0
2026-03-10T06:23:05.263 INFO:tasks.workunit.client.0.vm04.stdout:3/601: getdents d4/da/df/d11/d5a/d5b 0
2026-03-10T06:23:05.278 INFO:tasks.workunit.client.0.vm04.stdout:0/673: symlink d0/dd1/lda 0
2026-03-10T06:23:05.279 INFO:tasks.workunit.client.0.vm04.stdout:6/637: creat d2/d43/d9b/fd1 x:0 0 0
2026-03-10T06:23:05.280 INFO:tasks.workunit.client.0.vm04.stdout:0/674: write d0/d5/d25/dd/d5c/f9a [1338670,82789] 0
2026-03-10T06:23:05.284 INFO:tasks.workunit.client.0.vm04.stdout:3/602: sync
2026-03-10T06:23:05.287 INFO:tasks.workunit.client.0.vm04.stdout:6/638: dwrite d2/d43/d2d/d30/dc0/fcd [0,4194304] 0
2026-03-10T06:23:05.354 INFO:tasks.workunit.client.0.vm04.stdout:1/611: dwrite d0/d8/d46/f57 [0,4194304] 0
2026-03-10T06:23:05.355 INFO:tasks.workunit.client.0.vm04.stdout:1/612: stat d0/d3/f20 0
2026-03-10T06:23:05.400 INFO:tasks.workunit.client.0.vm04.stdout:5/581: creat d4/d6/fcf x:0 0 0
2026-03-10T06:23:05.403 INFO:tasks.workunit.client.0.vm04.stdout:7/600: write d4/df/dd8/f64 [2158032,119768] 0
2026-03-10T06:23:05.407 INFO:tasks.workunit.client.0.vm04.stdout:9/658: rename d2/d3/d18/d39/c8e to d2/d8/cfb 0
2026-03-10T06:23:05.425 INFO:tasks.workunit.client.0.vm04.stdout:1/613: write d0/d8/f27 [1967099,126237] 0
2026-03-10T06:23:05.426 INFO:tasks.workunit.client.0.vm04.stdout:1/614: chown d0/d8/d46/d7a/d95/cb9 933677 1
2026-03-10T06:23:05.434 INFO:tasks.workunit.client.0.vm04.stdout:0/675: link d0/d5/d25/dd/d5c/d73/fa5 d0/d5/d97/dc0/fdb 0
2026-03-10T06:23:05.434 INFO:tasks.workunit.client.0.vm04.stdout:7/601: dwrite d4/df/d12/d13/d25/d28/d3a/d58/f97 [0,4194304] 0
2026-03-10T06:23:05.458 INFO:tasks.workunit.client.0.vm04.stdout:5/582: mkdir d4/d3b/da8/dd0 0
2026-03-10T06:23:05.463 INFO:tasks.workunit.client.0.vm04.stdout:9/659: dwrite d2/d3/d18/d39/d11/f71 [0,4194304] 0
2026-03-10T06:23:05.474 INFO:tasks.workunit.client.0.vm04.stdout:6/639: truncate d2/d43/d2d/d30/f60 1755529 0
2026-03-10T06:23:05.474 INFO:tasks.workunit.client.0.vm04.stdout:2/582: rename d1/db/d20/c84 to d1/db/d72/d94/cb4 0
2026-03-10T06:23:05.475 INFO:tasks.workunit.client.0.vm04.stdout:1/615: write d0/d8/d46/d7a/fa8 [820885,48632] 0
2026-03-10T06:23:05.489 INFO:tasks.workunit.client.0.vm04.stdout:6/640: dwrite d2/d43/d2d/d30/d1f/d3c/d85/fd0 [0,4194304] 0
2026-03-10T06:23:05.504 INFO:tasks.workunit.client.0.vm04.stdout:9/660: unlink d2/de0/d1d/d64/d73/f9a 0
2026-03-10T06:23:05.506 INFO:tasks.workunit.client.0.vm04.stdout:9/661: readlink d2/d3/d18/de9/da9/lb6 0
2026-03-10T06:23:05.507 INFO:tasks.workunit.client.0.vm04.stdout:6/641: dwrite d2/d3a/d5e/d9a/fcc [0,4194304] 0
2026-03-10T06:23:05.508 INFO:tasks.workunit.client.0.vm04.stdout:9/662: write d2/d8/f66 [3108845,59685] 0
2026-03-10T06:23:05.509 INFO:tasks.workunit.client.0.vm04.stdout:0/676: dread d0/d5/d25/dd/d3a/d56/f88 [0,4194304] 0
2026-03-10T06:23:05.512 INFO:tasks.workunit.client.0.vm04.stdout:4/640: rename d2/d32/fa9 to d2/d16/da3/fcc 0
2026-03-10T06:23:05.529 INFO:tasks.workunit.client.0.vm04.stdout:2/583: mknod d1/d76/cb5 0
2026-03-10T06:23:05.530 INFO:tasks.workunit.client.0.vm04.stdout:7/602: dwrite d4/df/d12/d13/d25/d28/d3a/d58/fcc [0,4194304] 0
2026-03-10T06:23:05.545 INFO:tasks.workunit.client.0.vm04.stdout:2/584: dwrite d1/db/d72/f7a [0,4194304] 0
2026-03-10T06:23:05.563 INFO:tasks.workunit.client.0.vm04.stdout:9/663: truncate d2/d3/d18/de9/f29 4893057 0
2026-03-10T06:23:05.565 INFO:tasks.workunit.client.0.vm04.stdout:0/677: unlink d0/d1a/d20/d38/d31/d79/fc7 0
2026-03-10T06:23:05.567 INFO:tasks.workunit.client.0.vm04.stdout:0/678: dread - d0/d5/d97/dc0/fc4 zero size
2026-03-10T06:23:05.572 INFO:tasks.workunit.client.0.vm04.stdout:0/679: dread - d0/f75 zero size
2026-03-10T06:23:05.572 INFO:tasks.workunit.client.0.vm04.stdout:8/638: rename df/d15/d29/f3c to df/d15/d29/da3/db8/dc1/dac/fc3 0
2026-03-10T06:23:05.574 INFO:tasks.workunit.client.0.vm04.stdout:0/680: read d0/d1a/d20/d38/d31/d47/f54 [3497617,112316] 0
2026-03-10T06:23:05.579 INFO:tasks.workunit.client.0.vm04.stdout:7/603: fdatasync d4/df/d12/d13/d25/d28/f9e 0
2026-03-10T06:23:05.582 INFO:tasks.workunit.client.0.vm04.stdout:2/585: unlink d1/dae/d2c/d37/c90 0
2026-03-10T06:23:05.582 INFO:tasks.workunit.client.0.vm04.stdout:2/586: stat d1/d76/f8e 0
2026-03-10T06:23:05.583 INFO:tasks.workunit.client.0.vm04.stdout:2/587: fsync d1/f10 0
2026-03-10T06:23:05.588 INFO:tasks.workunit.client.0.vm04.stdout:4/641: write d2/d32/f82 [233780,71708] 0
2026-03-10T06:23:05.589 INFO:tasks.workunit.client.0.vm04.stdout:4/642: chown d2/d8/f54 928 1
2026-03-10T06:23:05.592 INFO:tasks.workunit.client.0.vm04.stdout:4/643: write d2/d46/fa5 [971490,67104] 0
2026-03-10T06:23:05.593 INFO:tasks.workunit.client.0.vm04.stdout:3/603: rename d4/d6/d99/d7b/d21/d32/d39/d64/f9c to d4/da/df/d11/d50/dc8/fcb 0
2026-03-10T06:23:05.600 INFO:tasks.workunit.client.0.vm04.stdout:0/681: readlink d0/d5/d25/l86 0
2026-03-10T06:23:05.614 INFO:tasks.workunit.client.0.vm04.stdout:9/664: write d2/d8/d53/fd3 [875419,3811] 0
2026-03-10T06:23:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:05 vm06.local ceph-mon[58974]: pgmap v31: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 88 MiB/s wr, 189 op/s
2026-03-10T06:23:05.617 INFO:tasks.workunit.client.0.vm04.stdout:5/583: rename d4/d6/d50/f63 to d4/d6/d80/fd1 0
2026-03-10T06:23:05.623 INFO:tasks.workunit.client.0.vm04.stdout:3/604: mkdir d4/d6/d38/dcc 0
2026-03-10T06:23:05.625 INFO:tasks.workunit.client.0.vm04.stdout:5/584: dwrite d4/d6/d80/d84/d99/fb3 [0,4194304] 0
2026-03-10T06:23:05.627 INFO:tasks.workunit.client.0.vm04.stdout:5/585: truncate d4/d6/fcf 75105 0
2026-03-10T06:23:05.653 INFO:tasks.workunit.client.0.vm04.stdout:7/604: creat d4/df/dd8/d9c/db1/dde/ddf/fe9 x:0 0 0
2026-03-10T06:23:05.655 INFO:tasks.workunit.client.0.vm04.stdout:0/682: rmdir d0/d5/d25/dd/d3a/d81 39
2026-03-10T06:23:05.677 INFO:tasks.workunit.client.0.vm04.stdout:6/642: link d2/d3a/d5e/db5/cc7 d2/d43/d2d/d30/d34/da8/cd2 0
2026-03-10T06:23:05.677 INFO:tasks.workunit.client.0.vm04.stdout:8/639: creat df/d15/d2b/d81/fc4 x:0 0 0
2026-03-10T06:23:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:05 vm04.local ceph-mon[51058]: pgmap v31: 65 pgs: 65 active+clean; 2.6 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 32 MiB/s rd, 88 MiB/s wr, 189 op/s
2026-03-10T06:23:05.677 INFO:tasks.workunit.client.0.vm04.stdout:4/644: mknod d2/d16/d31/d3f/ccd 0
2026-03-10T06:23:05.679 INFO:tasks.workunit.client.0.vm04.stdout:6/643: fdatasync d2/d37/d6e/f77 0
2026-03-10T06:23:05.679 INFO:tasks.workunit.client.0.vm04.stdout:0/683: creat d0/d1a/d20/d38/fdc x:0 0 0
2026-03-10T06:23:05.682 INFO:tasks.workunit.client.0.vm04.stdout:9/665: creat d2/d3/d18/d39/d11/da5/df5/ffc x:0 0 0
2026-03-10T06:23:05.684 INFO:tasks.workunit.client.0.vm04.stdout:3/605: sync
2026-03-10T06:23:05.684 INFO:tasks.workunit.client.0.vm04.stdout:4/645: sync
2026-03-10T06:23:05.686 INFO:tasks.workunit.client.0.vm04.stdout:0/684: mkdir d0/d1a/d20/d38/d31/d47/ddd 0
2026-03-10T06:23:05.689 INFO:tasks.workunit.client.0.vm04.stdout:9/666: truncate d2/de0/d1d/f78 201898 0
2026-03-10T06:23:05.691 INFO:tasks.workunit.client.0.vm04.stdout:2/588: truncate d1/db/d20/f86 425245 0
2026-03-10T06:23:05.692 INFO:tasks.workunit.client.0.vm04.stdout:2/589: stat d1/db/d20/l39 0
2026-03-10T06:23:05.693 INFO:tasks.workunit.client.0.vm04.stdout:8/640: mkdir df/d20/d25/d30/dc5 0
2026-03-10T06:23:05.696 INFO:tasks.workunit.client.0.vm04.stdout:7/605: write d4/df/d12/d21/fa4 [1340995,111811] 0
2026-03-10T06:23:05.696 INFO:tasks.workunit.client.0.vm04.stdout:3/606: creat d4/d6/d92/fcd x:0 0 0
2026-03-10T06:23:05.698 INFO:tasks.workunit.client.0.vm04.stdout:5/586: link d4/d11/cc4 d4/d6/d80/cd2 0
2026-03-10T06:23:05.699 INFO:tasks.workunit.client.0.vm04.stdout:6/644: mkdir d2/d43/d2d/d30/d1f/d3c/d85/dbf/dd3 0
2026-03-10T06:23:05.700 INFO:tasks.workunit.client.0.vm04.stdout:2/590: dwrite d1/dae/d11/d14/f9c [0,4194304] 0
2026-03-10T06:23:05.707 INFO:tasks.workunit.client.0.vm04.stdout:9/667: mknod d2/d3/d18/d39/d11/da5/df5/cfd 0
2026-03-10T06:23:05.708 INFO:tasks.workunit.client.0.vm04.stdout:8/641: creat df/d20/d25/d87/fc6 x:0 0 0
2026-03-10T06:23:05.712 INFO:tasks.workunit.client.0.vm04.stdout:7/606: symlink d4/df/dd8/d9c/db1/lea 0
2026-03-10T06:23:05.713 INFO:tasks.workunit.client.0.vm04.stdout:1/616: rename d0/d8/l9 to d0/d3/d41/d4b/lde 0
2026-03-10T06:23:05.720 INFO:tasks.workunit.client.0.vm04.stdout:6/645: mkdir d2/d3a/d5e/db5/dd4 0
2026-03-10T06:23:05.726 INFO:tasks.workunit.client.0.vm04.stdout:5/587: rmdir d4/d11/d7d/d38/d91/d55 39
2026-03-10T06:23:05.726 INFO:tasks.workunit.client.0.vm04.stdout:6/646: dwrite d2/d43/d2d/d30/d1f/d3c/fb7 [0,4194304] 0
2026-03-10T06:23:05.730 INFO:tasks.workunit.client.0.vm04.stdout:2/591: truncate d1/dae/f24 5052615 0
2026-03-10T06:23:05.730 INFO:tasks.workunit.client.0.vm04.stdout:2/592: stat d1/f91 0
2026-03-10T06:23:05.733 INFO:tasks.workunit.client.0.vm04.stdout:4/646: creat d2/fce x:0 0 0
2026-03-10T06:23:05.735 INFO:tasks.workunit.client.0.vm04.stdout:8/642: truncate df/d15/d2b/f7e 3602908 0
2026-03-10T06:23:05.737 INFO:tasks.workunit.client.0.vm04.stdout:9/668: truncate d2/d8/f99 912364 0
2026-03-10T06:23:05.739 INFO:tasks.workunit.client.0.vm04.stdout:9/669: write d2/d8/d22/fd1 [1628123,107608] 0
2026-03-10T06:23:05.750 INFO:tasks.workunit.client.0.vm04.stdout:1/617: symlink d0/d8/d46/d7a/d95/dc5/ldf 0
2026-03-10T06:23:05.750 INFO:tasks.workunit.client.0.vm04.stdout:1/618: chown d0/d3/d80/lbd 87089 1
2026-03-10T06:23:05.758 INFO:tasks.workunit.client.0.vm04.stdout:6/647: dread d2/d43/d2d/d30/f2b [0,4194304] 0
2026-03-10T06:23:05.759 INFO:tasks.workunit.client.0.vm04.stdout:6/648: readlink d2/d43/d2d/d30/d1f/l53 0
2026-03-10T06:23:05.778 INFO:tasks.workunit.client.0.vm04.stdout:2/593: mknod d1/dae/d11/d14/d9f/cb6 0
2026-03-10T06:23:05.782 INFO:tasks.workunit.client.0.vm04.stdout:4/647: unlink d2/d8/f54 0
2026-03-10T06:23:05.782 INFO:tasks.workunit.client.0.vm04.stdout:2/594: chown d1/db/d72/l82 99908714 1
2026-03-10T06:23:05.782 INFO:tasks.workunit.client.0.vm04.stdout:4/648: stat d2/d46 0
2026-03-10T06:23:05.787 INFO:tasks.workunit.client.0.vm04.stdout:8/643: symlink df/d15/d29/da3/db8/lc7 0
2026-03-10T06:23:05.821 INFO:tasks.workunit.client.0.vm04.stdout:3/607: rename d4/d6/d99/d7b/l28 to d4/da/df/d11/d5a/db3/lce 0
2026-03-10T06:23:05.825 INFO:tasks.workunit.client.0.vm04.stdout:1/619: rename d0/d3/d41/d4b/d5b/l7e to d0/d3/d41/d4b/d5b/le0 0
2026-03-10T06:23:05.827 INFO:tasks.workunit.client.0.vm04.stdout:3/608: dread d4/d6/d99/d7b/f4b [0,4194304] 0
2026-03-10T06:23:05.828 INFO:tasks.workunit.client.0.vm04.stdout:2/595: creat d1/dae/d2c/d37/fb7 x:0 0 0
2026-03-10T06:23:05.829 INFO:tasks.workunit.client.0.vm04.stdout:7/607: dwrite d4/df/d12/d13/d25/d28/d3a/d58/fb6 [0,4194304] 0
2026-03-10T06:23:05.835 INFO:tasks.workunit.client.0.vm04.stdout:2/596: truncate d1/db/d20/f49 4769982 0
2026-03-10T06:23:05.852 INFO:tasks.workunit.client.0.vm04.stdout:0/685: link d0/d5/d25/dd/d1d/d59/l69 d0/d5/d25/dd/d1d/lde 0
2026-03-10T06:23:05.853 INFO:tasks.workunit.client.0.vm04.stdout:2/597: dread d1/db/d20/f49 [0,4194304] 0
2026-03-10T06:23:05.855 INFO:tasks.workunit.client.0.vm04.stdout:3/609: mknod d4/d6/d91/ccf 0
2026-03-10T06:23:05.856 INFO:tasks.workunit.client.0.vm04.stdout:3/610: write d4/da/df/d11/d5a/d5b/fa3 [1470380,67031] 0
2026-03-10T06:23:05.858 INFO:tasks.workunit.client.0.vm04.stdout:6/649: symlink d2/d43/d2d/d30/ld5 0
2026-03-10T06:23:05.884 INFO:tasks.workunit.client.0.vm04.stdout:8/644: mknod df/d20/cc8 0
2026-03-10T06:23:05.886 INFO:tasks.workunit.client.0.vm04.stdout:4/649: symlink d2/d16/d31/d3f/d93/lcf 0
2026-03-10T06:23:05.889 INFO:tasks.workunit.client.0.vm04.stdout:7/608: rmdir d4/df/dd8/d9c/db1/dde 39
2026-03-10T06:23:05.895 INFO:tasks.workunit.client.0.vm04.stdout:9/670: link d2/d8/d53/d6e/d89/ff3 d2/d8/d22/ffe 0
2026-03-10T06:23:05.896 INFO:tasks.workunit.client.0.vm04.stdout:0/686: creat d0/d5/d25/dd/fdf x:0 0 0
2026-03-10T06:23:05.897 INFO:tasks.workunit.client.0.vm04.stdout:0/687: write d0/d5/d25/dd/d5c/f9a [566181,5333] 0
2026-03-10T06:23:05.900 INFO:tasks.workunit.client.0.vm04.stdout:3/611: stat d4/d6/d38/l8a 0
2026-03-10T06:23:05.902 INFO:tasks.workunit.client.0.vm04.stdout:0/688: write d0/d1a/d20/d38/d31/d47/d8a/fbe [72387,130606] 0
2026-03-10T06:23:05.902 INFO:tasks.workunit.client.0.vm04.stdout:6/650: unlink d2/d3a/d5e/fb3 0
2026-03-10T06:23:05.911 INFO:tasks.workunit.client.0.vm04.stdout:4/650: mknod d2/d32/d5c/d98/cd0 0
2026-03-10T06:23:05.912 INFO:tasks.workunit.client.0.vm04.stdout:1/620: creat d0/d8/fe1 x:0 0 0
2026-03-10T06:23:05.912 INFO:tasks.workunit.client.0.vm04.stdout:7/609: chown d4/df/d12/d21/fd9 353532835 1
2026-03-10T06:23:05.914 INFO:tasks.workunit.client.0.vm04.stdout:2/598: symlink d1/dae/lb8 0
2026-03-10T06:23:05.918 INFO:tasks.workunit.client.0.vm04.stdout:5/588: getdents d4/d11/d7d/d38/d91/d4c/d98/dc0 0
2026-03-10T06:23:05.918 INFO:tasks.workunit.client.0.vm04.stdout:6/651: dwrite d2/d43/d2d/f42 [4194304,4194304] 0
2026-03-10T06:23:05.918 INFO:tasks.workunit.client.0.vm04.stdout:3/612: dwrite d4/da/df/d11/d5a/d5b/fa3 [0,4194304] 0
2026-03-10T06:23:05.918 INFO:tasks.workunit.client.0.vm04.stdout:9/671: sync
2026-03-10T06:23:05.920 INFO:tasks.workunit.client.0.vm04.stdout:0/689: rename d0/d5/d25/dd/fdf to d0/d1a/fe0 0
2026-03-10T06:23:05.922 INFO:tasks.workunit.client.0.vm04.stdout:2/599: chown d1/dae/d11/d14/d9f 8 1
2026-03-10T06:23:05.923 INFO:tasks.workunit.client.0.vm04.stdout:7/610: write d4/fa7 [1964103,32832] 0
2026-03-10T06:23:05.923 INFO:tasks.workunit.client.0.vm04.stdout:9/672: sync
2026-03-10T06:23:05.923 INFO:tasks.workunit.client.0.vm04.stdout:8/645: creat df/d20/d25/d30/d65/d8f/fc9 x:0 0 0
2026-03-10T06:23:05.923 INFO:tasks.workunit.client.0.vm04.stdout:1/621: chown d0/d3/d41/d4b/f6b 1288381653 1
2026-03-10T06:23:05.934 INFO:tasks.workunit.client.0.vm04.stdout:2/600: fdatasync d1/dae/d11/d14/f9c 0
2026-03-10T06:23:05.941 INFO:tasks.workunit.client.0.vm04.stdout:6/652: creat d2/d3a/d5e/d9a/fd6 x:0 0 0
2026-03-10T06:23:05.945 INFO:tasks.workunit.client.0.vm04.stdout:3/613: rename d4/d6/d99/d7b/d89/db4 to d4/d6/d99/d7b/dbd/dd0 0
2026-03-10T06:23:05.947 INFO:tasks.workunit.client.0.vm04.stdout:4/651: dwrite d2/d16/d2c/d6b/f96 [0,4194304] 0
2026-03-10T06:23:05.952 INFO:tasks.workunit.client.0.vm04.stdout:1/622: creat d0/d3/d41/d99/fe2 x:0 0 0
2026-03-10T06:23:05.953 INFO:tasks.workunit.client.0.vm04.stdout:0/690: dread d0/d1a/d20/d38/d31/d47/f7b [0,4194304] 0
2026-03-10T06:23:05.954 INFO:tasks.workunit.client.0.vm04.stdout:0/691: write d0/d5/d25/dd/d3a/d56/f84 [887559,49113] 0
2026-03-10T06:23:05.954 INFO:tasks.workunit.client.0.vm04.stdout:0/692: stat d0/d1a 0
2026-03-10T06:23:05.955 INFO:tasks.workunit.client.0.vm04.stdout:5/589: dread d4/d11/d7d/f36 [0,4194304] 0
2026-03-10T06:23:05.955 INFO:tasks.workunit.client.0.vm04.stdout:0/693: dread - d0/d5/d25/dd/d1d/d59/fd4 zero size
2026-03-10T06:23:05.956 INFO:tasks.workunit.client.0.vm04.stdout:5/590: chown d4/d11/d7d/d52/f8f 78 1
2026-03-10T06:23:05.957 INFO:tasks.workunit.client.0.vm04.stdout:0/694: write d0/d5/fc5 [409581,125546] 0
2026-03-10T06:23:05.967 INFO:tasks.workunit.client.0.vm04.stdout:2/601: creat d1/dae/d2c/d37/d40/fb9 x:0 0 0
2026-03-10T06:23:05.968 INFO:tasks.workunit.client.0.vm04.stdout:9/673: fsync d2/d8/d22/daa/f7c 0
2026-03-10T06:23:05.974 INFO:tasks.workunit.client.0.vm04.stdout:7/611: getdents d4/df/d12/d13/d25/dcb/dd2 0
2026-03-10T06:23:05.982 INFO:tasks.workunit.client.0.vm04.stdout:3/614: unlink d4/d6/d99/d7b/d21/d32/f58 0
2026-03-10T06:23:05.982 INFO:tasks.workunit.client.0.vm04.stdout:0/695: dread - d0/d5/d25/dd/d1d/d59/d63/fa3 zero size
2026-03-10T06:23:05.984 INFO:tasks.workunit.client.0.vm04.stdout:3/615: dwrite d4/d6/d99/d7b/d21/d2c/fb2 [0,4194304] 0
2026-03-10T06:23:05.991 INFO:tasks.workunit.client.0.vm04.stdout:4/652: sync
2026-03-10T06:23:06.000 INFO:tasks.workunit.client.0.vm04.stdout:9/674: mknod d2/de0/d1d/cff 0
2026-03-10T06:23:06.004 INFO:tasks.workunit.client.0.vm04.stdout:6/653: write d2/d43/f3b [1398800,117367] 0
2026-03-10T06:23:06.008 INFO:tasks.workunit.client.0.vm04.stdout:5/591: mkdir d4/d6/dc2/dd3 0
2026-03-10T06:23:06.020 INFO:tasks.workunit.client.0.vm04.stdout:0/696: dread d0/d5/d25/dd/d5c/d73/f53 [0,4194304] 0
2026-03-10T06:23:06.026 INFO:tasks.workunit.client.0.vm04.stdout:7/612: dwrite d4/df/d12/d13/d25/d30/d40/d79/fab [0,4194304] 0
2026-03-10T06:23:06.028 INFO:tasks.workunit.client.0.vm04.stdout:7/613: chown d4/df/f29 32725463 1
2026-03-10T06:23:06.029 INFO:tasks.workunit.client.0.vm04.stdout:4/653: write d2/d32/d5c/d76/f95 [561443,33929] 0
2026-03-10T06:23:06.029 INFO:tasks.workunit.client.0.vm04.stdout:2/602: truncate d1/db/d20/d8f/d35/d54/d5d/f65 1136956 0
2026-03-10T06:23:06.036 INFO:tasks.workunit.client.0.vm04.stdout:7/614: dwrite d4/df/dd8/f4d [0,4194304] 0
2026-03-10T06:23:06.050 INFO:tasks.workunit.client.0.vm04.stdout:8/646: link df/d15/d2b/f4d df/d15/d29/fca 0
2026-03-10T06:23:06.056 INFO:tasks.workunit.client.0.vm04.stdout:9/675: dread d2/d8/d3a/dcb/fe6 [0,4194304] 0
2026-03-10T06:23:06.057 INFO:tasks.workunit.client.0.vm04.stdout:9/676: chown d2/d3/d18/de9/da9/cde 1194819 1
2026-03-10T06:23:06.059 INFO:tasks.workunit.client.0.vm04.stdout:1/623: link d0/d3/cc0 d0/d8/d46/db3/dd2/ce3 0
2026-03-10T06:23:06.061 INFO:tasks.workunit.client.0.vm04.stdout:4/654: mkdir d2/d16/d2c/d6b/dd1 0
2026-03-10T06:23:06.065 INFO:tasks.workunit.client.0.vm04.stdout:5/592: mknod d4/d11/cd4 0
2026-03-10T06:23:06.074 INFO:tasks.workunit.client.0.vm04.stdout:3/616: dwrite d4/da/df/d11/f48 [0,4194304] 0
2026-03-10T06:23:06.076 INFO:tasks.workunit.client.0.vm04.stdout:6/654: dwrite d2/d3a/d5e/fa4 [0,4194304] 0
2026-03-10T06:23:06.076 INFO:tasks.workunit.client.0.vm04.stdout:6/655: readlink d2/d3a/l3d 0
2026-03-10T06:23:06.099 INFO:tasks.workunit.client.0.vm04.stdout:0/697: unlink d0/d5/f4e 0
2026-03-10T06:23:06.099 INFO:tasks.workunit.client.0.vm04.stdout:9/677: readlink d2/de5/lf8 0
2026-03-10T06:23:06.100 INFO:tasks.workunit.client.0.vm04.stdout:0/698: stat d0/d5/d25/dd/d3a/ld2 0
2026-03-10T06:23:06.112 INFO:tasks.workunit.client.0.vm04.stdout:3/617: sync
2026-03-10T06:23:06.121 INFO:tasks.workunit.client.0.vm04.stdout:7/615: rename d4/df/c4f to d4/df/d12/d13/d25/d30/d40/ceb 0
2026-03-10T06:23:06.122 INFO:tasks.workunit.client.0.vm04.stdout:7/616: truncate d4/df/d12/d13/d8b/fa5 4397271 0
2026-03-10T06:23:06.124 INFO:tasks.workunit.client.0.vm04.stdout:3/618: dread d4/d6/d99/d7b/f45 [0,4194304] 0
2026-03-10T06:23:06.128 INFO:tasks.workunit.client.0.vm04.stdout:3/619: fsync d4/f42 0
2026-03-10T06:23:06.149 INFO:tasks.workunit.client.0.vm04.stdout:6/656: creat d2/d37/fd7 x:0 0 0
2026-03-10T06:23:06.154 INFO:tasks.workunit.client.0.vm04.stdout:1/624: dwrite d0/f9a [0,4194304] 0
2026-03-10T06:23:06.158 INFO:tasks.workunit.client.0.vm04.stdout:4/655: write d2/d32/f7c [591449,59293] 0
2026-03-10T06:23:06.162 INFO:tasks.workunit.client.0.vm04.stdout:9/678: creat d2/d8/d53/d6e/d8d/f100 x:0 0 0
2026-03-10T06:23:06.164 INFO:tasks.workunit.client.0.vm04.stdout:9/679: fsync d2/d3/d18/de9/da2/fc5 0
2026-03-10T06:23:06.167 INFO:tasks.workunit.client.0.vm04.stdout:0/699: mkdir d0/d1a/d20/d38/d31/d47/d8a/d8d/de1 0
2026-03-10T06:23:06.170 INFO:tasks.workunit.client.0.vm04.stdout:2/603: read d1/f57 [1055720,110129] 0
2026-03-10T06:23:06.171 INFO:tasks.workunit.client.0.vm04.stdout:7/617: unlink d4/df/d12/d13/d25/d28/d3a/l87 0
2026-03-10T06:23:06.176 INFO:tasks.workunit.client.0.vm04.stdout:3/620: symlink d4/d6/d99/d7b/d89/ld1 0
2026-03-10T06:23:06.180 INFO:tasks.workunit.client.0.vm04.stdout:3/621: write d4/da/df/d11/d5a/d5b/f98 [1041755,53615] 0
2026-03-10T06:23:06.181 INFO:tasks.workunit.client.0.vm04.stdout:8/647: creat df/d15/d2b/d81/d9a/dbe/fcb x:0 0 0
2026-03-10T06:23:06.182 INFO:tasks.workunit.client.0.vm04.stdout:6/657: read d2/d43/d2d/d30/f60 [1606776,84242] 0
2026-03-10T06:23:06.185 INFO:tasks.workunit.client.0.vm04.stdout:1/625: mkdir d0/d8/d46/de4 0
2026-03-10T06:23:06.185 INFO:tasks.workunit.client.0.vm04.stdout:4/656: symlink d2/d32/d5c/d76/ld2 0
2026-03-10T06:23:06.189 INFO:tasks.workunit.client.0.vm04.stdout:1/626: write d0/d8/d46/d7a/fce [273660,102959] 0
2026-03-10T06:23:06.196 INFO:tasks.workunit.client.0.vm04.stdout:5/593: dwrite d4/d11/f17 [0,4194304] 0
2026-03-10T06:23:06.196 INFO:tasks.workunit.client.0.vm04.stdout:2/604: symlink d1/db/d69/d74/lba 0
2026-03-10T06:23:06.196 INFO:tasks.workunit.client.0.vm04.stdout:2/605: readlink d1/db/d72/l96 0
2026-03-10T06:23:06.199 INFO:tasks.workunit.client.0.vm04.stdout:1/627: dwrite d0/d3/d41/d4b/fd3 [0,4194304] 0
2026-03-10T06:23:06.200 INFO:tasks.workunit.client.0.vm04.stdout:7/618: fsync d4/f6 0
2026-03-10T06:23:06.210 INFO:tasks.workunit.client.0.vm04.stdout:2/606: dread d1/dae/d11/d14/f9c [0,4194304] 0
2026-03-10T06:23:06.210 INFO:tasks.workunit.client.0.vm04.stdout:2/607: dread - d1/dae/d2c/d37/fb7 zero size
2026-03-10T06:23:06.214 INFO:tasks.workunit.client.0.vm04.stdout:6/658: fdatasync d2/d43/d2d/d30/d1f/d3c/f6a 0
2026-03-10T06:23:06.220 INFO:tasks.workunit.client.0.vm04.stdout:4/657: readlink d2/d16/d2c/d9a/lb2 0
2026-03-10T06:23:06.220 INFO:tasks.workunit.client.0.vm04.stdout:9/680: mknod d2/d8/d22/def/c101 0
2026-03-10T06:23:06.220 INFO:tasks.workunit.client.0.vm04.stdout:0/700: symlink d0/d5/d25/dd/d5c/le2 0
2026-03-10T06:23:06.226 INFO:tasks.workunit.client.0.vm04.stdout:2/608: mkdir d1/db/d72/d94/dbb 0
2026-03-10T06:23:06.226 INFO:tasks.workunit.client.0.vm04.stdout:6/659: read d2/d43/d2d/d30/d34/d76/d7e/f81 [4299521,89911] 0
2026-03-10T06:23:06.231 INFO:tasks.workunit.client.0.vm04.stdout:3/622: dwrite d4/d6/d99/d7b/d21/d32/d4e/d8f/fc2 [0,4194304] 0
2026-03-10T06:23:06.232 INFO:tasks.workunit.client.0.vm04.stdout:4/658: mknod d2/d16/d31/d42/db9/cd3 0
2026-03-10T06:23:06.234 INFO:tasks.workunit.client.0.vm04.stdout:5/594: mknod d4/d6/dc2/dd3/cd5 0
2026-03-10T06:23:06.241 INFO:tasks.workunit.client.0.vm04.stdout:7/619: mknod d4/df/d12/d13/d25/cec 0
2026-03-10T06:23:06.241 INFO:tasks.workunit.client.0.vm04.stdout:6/660: creat d2/d43/d2d/d30/d1f/fd8 x:0 0 0
2026-03-10T06:23:06.253 INFO:tasks.workunit.client.0.vm04.stdout:2/609: dread d1/dae/d11/d14/f1d [0,4194304] 0
2026-03-10T06:23:06.253 INFO:tasks.workunit.client.0.vm04.stdout:2/610: chown d1/dae/d11/d14/d4e/f60 18579887 1
2026-03-10T06:23:06.254 INFO:tasks.workunit.client.0.vm04.stdout:2/611: chown d1/dae/d2c/d37/d59/db2 2 1
2026-03-10T06:23:06.263 INFO:tasks.workunit.client.0.vm04.stdout:3/623: truncate d4/d6/d99/d7b/d21/d32/d39/d64/f7d 1181700 0
2026-03-10T06:23:06.280 INFO:tasks.workunit.client.0.vm04.stdout:3/624: dread d4/d6/f12 [4194304,4194304] 0
2026-03-10T06:23:06.281 INFO:tasks.workunit.client.0.vm04.stdout:6/661: unlink d2/d3a/d5e/d9a/fd6 0
2026-03-10T06:23:06.281 INFO:tasks.workunit.client.0.vm04.stdout:6/662: readlink d2/l2a 0
2026-03-10T06:23:06.284 INFO:tasks.workunit.client.0.vm04.stdout:8/648: dwrite df/d15/d29/da3/db8/dc1/d97/fb1 [0,4194304] 0
2026-03-10T06:23:06.287 INFO:tasks.workunit.client.0.vm04.stdout:2/612: readlink d1/l47 0
2026-03-10T06:23:06.304 INFO:tasks.workunit.client.0.vm04.stdout:1/628: dwrite d0/f2e [0,4194304] 0
2026-03-10T06:23:06.314 INFO:tasks.workunit.client.0.vm04.stdout:9/681: dwrite d2/d8/d22/d4f/ff0 [0,4194304] 0
2026-03-10T06:23:06.314 INFO:tasks.workunit.client.0.vm04.stdout:0/701: dwrite d0/f14 [4194304,4194304] 0
2026-03-10T06:23:06.322 INFO:tasks.workunit.client.0.vm04.stdout:0/702: read - d0/d5/d25/dd/d1d/d59/fd4 zero size
2026-03-10T06:23:06.327 INFO:tasks.workunit.client.0.vm04.stdout:9/682: truncate d2/d3/d18/de9/de7/fec 769110 0
2026-03-10T06:23:06.327 INFO:tasks.workunit.client.0.vm04.stdout:0/703: write d0/d1a/d20/fb9 [1049461,53775] 0
2026-03-10T06:23:06.327 INFO:tasks.workunit.client.0.vm04.stdout:9/683: write d2/d8/d53/d6e/d8d/ff7 [981525,43418] 0
2026-03-10T06:23:06.349 INFO:tasks.workunit.client.0.vm04.stdout:4/659: symlink d2/d16/ld4 0
2026-03-10T06:23:06.349 INFO:tasks.workunit.client.0.vm04.stdout:4/660: chown d2/d16/d56/fca 1 1
2026-03-10T06:23:06.350 INFO:tasks.workunit.client.0.vm04.stdout:5/595: symlink d4/d11/ld6 0
2026-03-10T06:23:06.356 INFO:tasks.workunit.client.0.vm04.stdout:5/596: dwrite d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/fc6 [0,4194304] 0
2026-03-10T06:23:06.357 INFO:tasks.workunit.client.0.vm04.stdout:7/620: unlink d4/df/d12/d21/c44 0
2026-03-10T06:23:06.369 INFO:tasks.workunit.client.0.vm04.stdout:3/625: read d4/d6/d99/d7b/f1d [760280,29574] 0
2026-03-10T06:23:06.383 INFO:tasks.workunit.client.0.vm04.stdout:0/704: mkdir d0/d1a/d20/d38/d31/d47/d8a/de3 0
2026-03-10T06:23:06.391 INFO:tasks.workunit.client.0.vm04.stdout:1/629: dread d0/d3/f37 [0,4194304] 0
2026-03-10T06:23:06.393 INFO:tasks.workunit.client.0.vm04.stdout:4/661: unlink d2/d16/d31/d3f/f8f 0
2026-03-10T06:23:06.399 INFO:tasks.workunit.client.0.vm04.stdout:7/621: mkdir d4/df/d12/d13/db3/ded 0
2026-03-10T06:23:06.400 INFO:tasks.workunit.client.0.vm04.stdout:4/662: dread d2/d46/f3d [0,4194304] 0
2026-03-10T06:23:06.407 INFO:tasks.workunit.client.0.vm04.stdout:6/663: unlink d2/d43/d2d/d30/f63 0
2026-03-10T06:23:06.417 INFO:tasks.workunit.client.0.vm04.stdout:1/630: creat d0/d8/d46/db3/db4/fe5 x:0 0 0
2026-03-10T06:23:06.418 INFO:tasks.workunit.client.0.vm04.stdout:1/631: truncate d0/d3/d41/f75 110589 0
2026-03-10T06:23:06.425 INFO:tasks.workunit.client.0.vm04.stdout:4/663: truncate d2/d16/d2c/f2e 1040602 0
2026-03-10T06:23:06.426 INFO:tasks.workunit.client.0.vm04.stdout:5/597: truncate d4/d11/d7d/d38/d91/d55/f9e 605959 0
2026-03-10T06:23:06.429 INFO:tasks.workunit.client.0.vm04.stdout:6/664: rmdir d2/d43/d2d/d30/d34/da8 39
2026-03-10T06:23:06.443 INFO:tasks.workunit.client.0.vm04.stdout:2/613: write d1/db/d20/f49 [783430,130050] 0
2026-03-10T06:23:06.446 INFO:tasks.workunit.client.0.vm04.stdout:2/614: dwrite d1/dae/d11/d14/d4e/f5c [0,4194304] 0
2026-03-10T06:23:06.453 INFO:tasks.workunit.client.0.vm04.stdout:0/705: mknod d0/d5/d25/dd/d3a/d81/ce4 0
2026-03-10T06:23:06.455 INFO:tasks.workunit.client.0.vm04.stdout:9/684: dwrite d2/d23/f31 [8388608,4194304] 0
2026-03-10T06:23:06.471 INFO:tasks.workunit.client.0.vm04.stdout:5/598: fdatasync d4/d11/d7d/f31 0
2026-03-10T06:23:06.471 INFO:tasks.workunit.client.0.vm04.stdout:6/665: sync
2026-03-10T06:23:06.475 INFO:tasks.workunit.client.0.vm04.stdout:5/599: chown d4/l15 4 1
2026-03-10T06:23:06.478 INFO:tasks.workunit.client.0.vm04.stdout:8/649: getdents df/d15/d2b 0
2026-03-10T06:23:06.478 INFO:tasks.workunit.client.0.vm04.stdout:0/706: creat d0/d1a/fe5 x:0 0 0
2026-03-10T06:23:06.478 INFO:tasks.workunit.client.0.vm04.stdout:0/707: dread - d0/d5/d25/dd/fc6 zero size
2026-03-10T06:23:06.478 INFO:tasks.workunit.client.0.vm04.stdout:3/626: dwrite d4/d6/d99/d7b/f77 [0,4194304] 0
2026-03-10T06:23:06.478 INFO:tasks.workunit.client.0.vm04.stdout:1/632: creat d0/d3/d41/dc2/fe6 x:0 0 0
2026-03-10T06:23:06.479 INFO:tasks.workunit.client.0.vm04.stdout:1/633: stat d0/d8/d46/db3/lca 0
2026-03-10T06:23:06.479 INFO:tasks.workunit.client.0.vm04.stdout:1/634: chown d0/d8/d46/l77 6712 1
2026-03-10T06:23:06.483 INFO:tasks.workunit.client.0.vm04.stdout:2/615: mknod d1/db/d72/daf/db0/cbc 0
2026-03-10T06:23:06.488 INFO:tasks.workunit.client.0.vm04.stdout:6/666: rmdir d2/d43/d2d/d7c 39
2026-03-10T06:23:06.489 INFO:tasks.workunit.client.0.vm04.stdout:5/600: unlink d4/d6/f67 0
2026-03-10T06:23:06.490 INFO:tasks.workunit.client.0.vm04.stdout:5/601: fdatasync d4/d11/d7d/f90 0
2026-03-10T06:23:06.501 INFO:tasks.workunit.client.0.vm04.stdout:9/685: rename d2/d8/d22/daa/f69 to d2/de0/da3/f102 0
2026-03-10T06:23:06.504 INFO:tasks.workunit.client.0.vm04.stdout:9/686: write d2/d3/d18/ddd/f5b [1301540,13957] 0
2026-03-10T06:23:06.509 INFO:tasks.workunit.client.0.vm04.stdout:7/622: getdents d4/df 0
2026-03-10T06:23:06.513 INFO:tasks.workunit.client.0.vm04.stdout:5/602: dread d4/d11/d7d/f31 [0,4194304] 0
2026-03-10T06:23:06.513 INFO:tasks.workunit.client.0.vm04.stdout:5/603: write d4/d11/d7d/d38/d91/d4c/d98/faf [430925,124414] 0
2026-03-10T06:23:06.513 INFO:tasks.workunit.client.0.vm04.stdout:2/616: write d1/db/f36 [1612408,120193] 0
2026-03-10T06:23:06.518 INFO:tasks.workunit.client.0.vm04.stdout:6/667: dread d2/d43/f69 [0,4194304] 0
2026-03-10T06:23:06.527 INFO:tasks.workunit.client.0.vm04.stdout:9/687: mknod d2/d3/d18/de9/d5a/c103 0
2026-03-10T06:23:06.527 INFO:tasks.workunit.client.0.vm04.stdout:6/668: dread d2/d43/f24 [0,4194304] 0
2026-03-10T06:23:06.528 INFO:tasks.workunit.client.0.vm04.stdout:6/669: write d2/d43/d2d/d30/d1f/d3c/d85/fd0 [3030162,47442] 0
2026-03-10T06:23:06.541 INFO:tasks.workunit.client.0.vm04.stdout:2/617: creat d1/db/d20/d8f/d48/d67/fbd x:0 0 0
2026-03-10T06:23:06.541 INFO:tasks.workunit.client.0.vm04.stdout:9/688: dread d2/d8/d53/fc8 [4194304,4194304] 0
2026-03-10T06:23:06.551 INFO:tasks.workunit.client.0.vm04.stdout:4/664: write d2/d16/f3a [23799,17003] 0
2026-03-10T06:23:06.551 INFO:tasks.workunit.client.0.vm04.stdout:4/665: readlink d2/l4a 0
2026-03-10T06:23:06.557 INFO:tasks.workunit.client.0.vm04.stdout:8/650: truncate df/d15/d2b/f4c 2825019 0
2026-03-10T06:23:06.561 INFO:tasks.workunit.client.0.vm04.stdout:0/708: creat d0/d1a/d20/fe6 x:0 0 0
2026-03-10T06:23:06.563 INFO:tasks.workunit.client.0.vm04.stdout:0/709: stat d0/d5/d25/dd/d92/fa4 0
2026-03-10T06:23:06.563 INFO:tasks.workunit.client.0.vm04.stdout:0/710: chown d0/d5/fb 11 1
2026-03-10T06:23:06.569 INFO:tasks.workunit.client.0.vm04.stdout:2/618: unlink d1/dae/d2c/f3d 0
2026-03-10T06:23:06.570 INFO:tasks.workunit.client.0.vm04.stdout:0/711: dread d0/d1a/d20/d38/fb4 [0,4194304] 0
2026-03-10T06:23:06.572 INFO:tasks.workunit.client.0.vm04.stdout:0/712: read d0/f1b [4929315,114291] 0
2026-03-10T06:23:06.572 INFO:tasks.workunit.client.0.vm04.stdout:0/713: chown d0/d1a/d20/dc2 10012850 1
2026-03-10T06:23:06.580 INFO:tasks.workunit.client.0.vm04.stdout:7/623: write d4/df/dd8/d9c/f9f [590240,114726] 0
2026-03-10T06:23:06.583 INFO:tasks.workunit.client.0.vm04.stdout:3/627: dwrite d4/d6/d99/d7b/f23 [0,4194304] 0
2026-03-10T06:23:06.587 INFO:tasks.workunit.client.0.vm04.stdout:4/666: creat d2/d16/da3/fd5 x:0 0 0
2026-03-10T06:23:06.589 INFO:tasks.workunit.client.0.vm04.stdout:9/689: write d2/d3/d18/d39/d46/fac [800311,73117] 0
2026-03-10T06:23:06.595 INFO:tasks.workunit.client.0.vm04.stdout:6/670: symlink d2/d3a/d5e/ld9 0
2026-03-10T06:23:06.596 INFO:tasks.workunit.client.0.vm04.stdout:1/635: getdents d0/d8/d46/d7a/d95/dc5 0
2026-03-10T06:23:06.597 INFO:tasks.workunit.client.0.vm04.stdout:1/636: readlink d0/d3/d80/l85 0
2026-03-10T06:23:06.598 INFO:tasks.workunit.client.0.vm04.stdout:2/619: dread - d1/dae/d2c/f4a zero size
2026-03-10T06:23:06.602 INFO:tasks.workunit.client.0.vm04.stdout:0/714: dread - d0/d5/d25/dd/d1d/d59/f9f zero size
2026-03-10T06:23:06.602 INFO:tasks.workunit.client.0.vm04.stdout:5/604: link d4/d11/d7d/d52/f8f d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/fd7 0
2026-03-10T06:23:06.611 INFO:tasks.workunit.client.0.vm04.stdout:7/624: rename d4/df/d12/d13/la3 to d4/df/dd8/lee 0
2026-03-10T06:23:06.611 INFO:tasks.workunit.client.0.vm04.stdout:4/667: truncate d2/d46/f18 4104028 0
2026-03-10T06:23:06.617 INFO:tasks.workunit.client.0.vm04.stdout:9/690: creat d2/d3/d18/de9/da2/f104 x:0 0 0
2026-03-10T06:23:06.623 INFO:tasks.workunit.client.0.vm04.stdout:8/651: dwrite df/f17 [0,4194304] 0
2026-03-10T06:23:06.628 INFO:tasks.workunit.client.0.vm04.stdout:7/625: dread d4/df/d12/d13/d8b/fa5 [0,4194304] 0
2026-03-10T06:23:06.629 INFO:tasks.workunit.client.0.vm04.stdout:0/715: mknod d0/d5/d25/dd/d3a/d56/ce7 0
2026-03-10T06:23:06.635 INFO:tasks.workunit.client.0.vm04.stdout:6/671: rename d2/d3a/l3d to d2/d37/d6e/lda 0
2026-03-10T06:23:06.644 INFO:tasks.workunit.client.0.vm04.stdout:6/672: dwrite d2/d43/d86/fc4 [0,4194304] 0
2026-03-10T06:23:06.648 INFO:tasks.workunit.client.0.vm04.stdout:0/716: unlink d0/l19 0
2026-03-10T06:23:06.648 INFO:tasks.workunit.client.0.vm04.stdout:7/626: symlink d4/df/d12/d13/d25/d30/lef 0
2026-03-10T06:23:06.650 INFO:tasks.workunit.client.0.vm04.stdout:2/620: rename d1/dae/c26 to d1/db/d72/daf/cbe 0
2026-03-10T06:23:06.654 INFO:tasks.workunit.client.0.vm04.stdout:3/628: creat d4/da/df/d11/fd2 x:0 0 0
2026-03-10T06:23:06.664 INFO:tasks.workunit.client.0.vm04.stdout:4/668: rename d2/d16/d2c/d6b/f9e to d2/d32/d94/d99/fd6 0
2026-03-10T06:23:06.666 INFO:tasks.workunit.client.0.vm04.stdout:1/637: truncate d0/f23 2874561 0
2026-03-10T06:23:06.671 INFO:tasks.workunit.client.0.vm04.stdout:5/605: truncate d4/d6/f20 4305350 0
2026-03-10T06:23:06.672 INFO:tasks.workunit.client.0.vm04.stdout:0/717: dread d0/d5/d25/dd/d1d/f30 [0,4194304] 0
2026-03-10T06:23:06.675 INFO:tasks.workunit.client.0.vm04.stdout:9/691: dwrite d2/d3/d18/d39/fd [0,4194304] 0
2026-03-10T06:23:06.675 INFO:tasks.workunit.client.0.vm04.stdout:8/652: creat df/d15/fcc x:0 0 0
2026-03-10T06:23:06.679 INFO:tasks.workunit.client.0.vm04.stdout:7/627: dwrite d4/df/f29 [4194304,4194304] 0
2026-03-10T06:23:06.680 INFO:tasks.workunit.client.0.vm04.stdout:7/628: write d4/df/dd8/d9c/fe8 [3724635,115323] 0
2026-03-10T06:23:06.691 INFO:tasks.workunit.client.0.vm04.stdout:4/669: rename d2/d16 to d2/d32/d5c/d76/dd7 0
2026-03-10T06:23:06.695 INFO:tasks.workunit.client.0.vm04.stdout:1/638: mknod d0/d8/d46/d7a/d95/ce7 0
2026-03-10T06:23:06.697 INFO:tasks.workunit.client.0.vm04.stdout:0/718: symlink d0/d5/d25/dd/d5c/d73/le8 0
2026-03-10T06:23:06.706 INFO:tasks.workunit.client.0.vm04.stdout:3/629: mknod d4/da/df/d11/d5a/cd3 0
2026-03-10T06:23:06.710 INFO:tasks.workunit.client.0.vm04.stdout:3/630: write d4/d6/d99/d7b/f47 [1168517,82388] 0
2026-03-10T06:23:06.710 INFO:tasks.workunit.client.0.vm04.stdout:1/639: chown d0/d3/d41/d4b/lde 363 1
2026-03-10T06:23:06.713 INFO:tasks.workunit.client.0.vm04.stdout:6/673: link d2/d43/d2d/d30/f32 d2/d43/d86/fdb 0
2026-03-10T06:23:06.713 INFO:tasks.workunit.client.0.vm04.stdout:9/692: unlink d2/d8/d22/f75 0
2026-03-10T06:23:06.713 INFO:tasks.workunit.client.0.vm04.stdout:3/631: unlink d4/d6/d99/d7b/d21/d32/d39/l72 0
2026-03-10T06:23:06.714 INFO:tasks.workunit.client.0.vm04.stdout:3/632: dread - d4/d6/d92/fcd zero size
2026-03-10T06:23:06.714 INFO:tasks.workunit.client.0.vm04.stdout:1/640: rmdir d0/d8/d46 39
2026-03-10T06:23:06.717 INFO:tasks.workunit.client.0.vm04.stdout:6/674: unlink d2/d37/l55 0
2026-03-10T06:23:06.718 INFO:tasks.workunit.client.0.vm04.stdout:8/653: rename df/d15/d29/d89/f8e to df/fcd 0
2026-03-10T06:23:06.718 INFO:tasks.workunit.client.0.vm04.stdout:3/633: mknod d4/d6/d92/cd4 0
2026-03-10T06:23:06.719 INFO:tasks.workunit.client.0.vm04.stdout:5/606: getdents d4/d3b 0
2026-03-10T06:23:06.721 INFO:tasks.workunit.client.0.vm04.stdout:6/675: mkdir d2/d43/d2d/d30/d34/d76/d7e/ddc 0
2026-03-10T06:23:06.726 INFO:tasks.workunit.client.0.vm04.stdout:3/634: unlink d4/d6/dc/f41 0
2026-03-10T06:23:06.729 INFO:tasks.workunit.client.0.vm04.stdout:3/635: stat d4/da/df/d11/d5a/cd3 0
2026-03-10T06:23:06.729 INFO:tasks.workunit.client.0.vm04.stdout:3/636: chown d4/c46 84362608 1
2026-03-10T06:23:06.729 INFO:tasks.workunit.client.0.vm04.stdout:0/719: getdents d0/d1a/d20/d38/d31/d79 0
2026-03-10T06:23:06.731 INFO:tasks.workunit.client.0.vm04.stdout:6/676: creat
d2/d37/d6e/fdd x:0 0 0 2026-03-10T06:23:06.731 INFO:tasks.workunit.client.0.vm04.stdout:1/641: symlink d0/d8/d46/de4/le8 0 2026-03-10T06:23:06.734 INFO:tasks.workunit.client.0.vm04.stdout:1/642: read d0/f2e [3438097,102689] 0 2026-03-10T06:23:06.737 INFO:tasks.workunit.client.0.vm04.stdout:7/629: sync 2026-03-10T06:23:06.738 INFO:tasks.workunit.client.0.vm04.stdout:3/637: mkdir d4/d6/d91/da1/dd5 0 2026-03-10T06:23:06.740 INFO:tasks.workunit.client.0.vm04.stdout:0/720: unlink d0/f1b 0 2026-03-10T06:23:06.740 INFO:tasks.workunit.client.0.vm04.stdout:3/638: write d4/d6/d38/fb8 [3813959,31269] 0 2026-03-10T06:23:06.746 INFO:tasks.workunit.client.0.vm04.stdout:4/670: rename d2/d32/dad/lc6 to d2/d32/d5c/d76/dd7/d31/d3f/ld8 0 2026-03-10T06:23:06.746 INFO:tasks.workunit.client.0.vm04.stdout:6/677: creat d2/d43/d2d/d30/d1f/d3c/d85/dbf/fde x:0 0 0 2026-03-10T06:23:06.746 INFO:tasks.workunit.client.0.vm04.stdout:4/671: readlink d2/d32/l36 0 2026-03-10T06:23:06.754 INFO:tasks.workunit.client.0.vm04.stdout:1/643: rename d0/d8/d46/db3/db4 to d0/d3/d80/de9 0 2026-03-10T06:23:06.756 INFO:tasks.workunit.client.0.vm04.stdout:2/621: dwrite d1/dae/f5a [0,4194304] 0 2026-03-10T06:23:06.757 INFO:tasks.workunit.client.0.vm04.stdout:9/693: write d2/d3/d18/de9/d5a/fa6 [893979,117791] 0 2026-03-10T06:23:06.768 INFO:tasks.workunit.client.0.vm04.stdout:8/654: dwrite df/d20/d25/d30/d65/f9f [0,4194304] 0 2026-03-10T06:23:06.773 INFO:tasks.workunit.client.0.vm04.stdout:5/607: dwrite d4/d11/d7d/d38/d91/d55/f5a [4194304,4194304] 0 2026-03-10T06:23:06.776 INFO:tasks.workunit.client.0.vm04.stdout:7/630: write d4/df/d12/fcd [218336,120719] 0 2026-03-10T06:23:06.780 INFO:tasks.workunit.client.0.vm04.stdout:0/721: mkdir d0/d1a/d20/d38/d31/de9 0 2026-03-10T06:23:06.783 INFO:tasks.workunit.client.0.vm04.stdout:3/639: chown d4/d6/d99/d7b/d21/d2c/c94 2497 1 2026-03-10T06:23:06.788 INFO:tasks.workunit.client.0.vm04.stdout:1/644: rmdir d0 39 2026-03-10T06:23:06.788 
INFO:tasks.workunit.client.0.vm04.stdout:2/622: chown d1/dae/f24 1014 1 2026-03-10T06:23:06.790 INFO:tasks.workunit.client.0.vm04.stdout:9/694: rmdir d2/d3/d18/ddd 39 2026-03-10T06:23:06.801 INFO:tasks.workunit.client.0.vm04.stdout:8/655: symlink df/d20/d25/d30/d55/lce 0 2026-03-10T06:23:06.802 INFO:tasks.workunit.client.0.vm04.stdout:7/631: symlink d4/df/lf0 0 2026-03-10T06:23:06.802 INFO:tasks.workunit.client.0.vm04.stdout:6/678: mkdir d2/d3a/d5e/ddf 0 2026-03-10T06:23:06.802 INFO:tasks.workunit.client.0.vm04.stdout:9/695: dread d2/d3/d18/d34/f97 [0,4194304] 0 2026-03-10T06:23:06.802 INFO:tasks.workunit.client.0.vm04.stdout:0/722: creat d0/d1a/d20/d38/d31/d79/fea x:0 0 0 2026-03-10T06:23:06.807 INFO:tasks.workunit.client.0.vm04.stdout:3/640: truncate f0 9873957 0 2026-03-10T06:23:06.807 INFO:tasks.workunit.client.0.vm04.stdout:1/645: dwrite d0/d3/d41/d4b/f6b [0,4194304] 0 2026-03-10T06:23:06.807 INFO:tasks.workunit.client.0.vm04.stdout:1/646: chown d0/d3/f33 815782185 1 2026-03-10T06:23:06.808 INFO:tasks.workunit.client.0.vm04.stdout:8/656: rename df/d20/d25/d30/d65/d8f/la8 to df/d20/d25/d30/dc5/lcf 0 2026-03-10T06:23:06.822 INFO:tasks.workunit.client.0.vm04.stdout:1/647: read d0/d8/f69 [512498,120049] 0 2026-03-10T06:23:06.822 INFO:tasks.workunit.client.0.vm04.stdout:8/657: dread df/d20/d25/d73/f98 [0,4194304] 0 2026-03-10T06:23:06.826 INFO:tasks.workunit.client.0.vm04.stdout:9/696: write d2/d3/d18/d34/f97 [5208651,29472] 0 2026-03-10T06:23:06.829 INFO:tasks.workunit.client.0.vm04.stdout:9/697: stat d2/d3/df4 0 2026-03-10T06:23:06.833 INFO:tasks.workunit.client.0.vm04.stdout:0/723: symlink d0/d5/d25/dd/d1d/d9c/dbf/leb 0 2026-03-10T06:23:06.844 INFO:tasks.workunit.client.0.vm04.stdout:3/641: rmdir d4 39 2026-03-10T06:23:06.851 INFO:tasks.workunit.client.0.vm04.stdout:5/608: write d4/d11/d7d/d38/f3e [1811951,58867] 0 2026-03-10T06:23:06.911 INFO:tasks.workunit.client.0.vm04.stdout:2/623: truncate d1/db/d20/d8f/f53 621897 0 2026-03-10T06:23:06.932 
INFO:tasks.workunit.client.0.vm04.stdout:1/648: mknod d0/d8/d46/dcf/cea 0 2026-03-10T06:23:06.938 INFO:tasks.workunit.client.0.vm04.stdout:8/658: dread df/d15/d2b/f4a [0,4194304] 0 2026-03-10T06:23:06.945 INFO:tasks.workunit.client.0.vm04.stdout:9/698: unlink d2/d3/d18/d39/fd 0 2026-03-10T06:23:06.948 INFO:tasks.workunit.client.0.vm04.stdout:4/672: getdents d2/d46 0 2026-03-10T06:23:06.949 INFO:tasks.workunit.client.0.vm04.stdout:0/724: rename d0/d5/c40 to d0/d5/d97/cec 0 2026-03-10T06:23:06.973 INFO:tasks.workunit.client.0.vm04.stdout:5/609: dread d4/d6/f23 [0,4194304] 0 2026-03-10T06:23:06.977 INFO:tasks.workunit.client.0.vm04.stdout:6/679: creat d2/fe0 x:0 0 0 2026-03-10T06:23:06.979 INFO:tasks.workunit.client.0.vm04.stdout:1/649: mknod d0/d3/d80/ceb 0 2026-03-10T06:23:06.979 INFO:tasks.workunit.client.0.vm04.stdout:1/650: write d0/d8/f43 [384982,105489] 0 2026-03-10T06:23:06.983 INFO:tasks.workunit.client.0.vm04.stdout:9/699: chown d2/d3/d18/ddd/c76 6965335 1 2026-03-10T06:23:06.983 INFO:tasks.workunit.client.0.vm04.stdout:9/700: chown d2/d3/c33 174356348 1 2026-03-10T06:23:06.983 INFO:tasks.workunit.client.0.vm04.stdout:4/673: mknod d2/d32/d5c/d76/cd9 0 2026-03-10T06:23:06.984 INFO:tasks.workunit.client.0.vm04.stdout:0/725: creat d0/d5/d25/dd/d1d/d59/fed x:0 0 0 2026-03-10T06:23:06.984 INFO:tasks.workunit.client.0.vm04.stdout:3/642: write d4/d6/d99/d7b/f45 [1725194,67903] 0 2026-03-10T06:23:06.987 INFO:tasks.workunit.client.0.vm04.stdout:1/651: dread d0/d3/f98 [0,4194304] 0 2026-03-10T06:23:06.987 INFO:tasks.workunit.client.0.vm04.stdout:7/632: fdatasync d4/df/d12/d13/d25/d28/d3a/f73 0 2026-03-10T06:23:06.991 INFO:tasks.workunit.client.0.vm04.stdout:2/624: mkdir d1/dbf 0 2026-03-10T06:23:06.991 INFO:tasks.workunit.client.0.vm04.stdout:2/625: stat d1/dae/d11/d14/d9f 0 2026-03-10T06:23:06.991 INFO:tasks.workunit.client.0.vm04.stdout:5/610: creat d4/d11/d7d/d38/d91/d55/db1/fd8 x:0 0 0 2026-03-10T06:23:06.994 INFO:tasks.workunit.client.0.vm04.stdout:6/680: rename 
d2/d3a/f50 to d2/d43/d2d/d30/d1f/d3c/d85/dbf/fe1 0 2026-03-10T06:23:07.008 INFO:tasks.workunit.client.0.vm04.stdout:9/701: mknod d2/d3/d18/d39/d46/d84/c105 0 2026-03-10T06:23:07.009 INFO:tasks.workunit.client.0.vm04.stdout:9/702: fdatasync d2/d8/d53/d6e/d8d/fdf 0 2026-03-10T06:23:07.011 INFO:tasks.workunit.client.0.vm04.stdout:0/726: creat d0/d1a/d20/dc2/fee x:0 0 0 2026-03-10T06:23:07.013 INFO:tasks.workunit.client.0.vm04.stdout:3/643: creat d4/d6/d99/d7b/d21/d32/d39/d64/fd6 x:0 0 0 2026-03-10T06:23:07.014 INFO:tasks.workunit.client.0.vm04.stdout:1/652: mkdir d0/d8/d46/de4/dec 0 2026-03-10T06:23:07.015 INFO:tasks.workunit.client.0.vm04.stdout:2/626: symlink d1/db/d20/d8f/d35/lc0 0 2026-03-10T06:23:07.017 INFO:tasks.workunit.client.0.vm04.stdout:2/627: dwrite d1/fa5 [0,4194304] 0 2026-03-10T06:23:07.023 INFO:tasks.workunit.client.0.vm04.stdout:5/611: mkdir d4/d6/d80/dd9 0 2026-03-10T06:23:07.027 INFO:tasks.workunit.client.0.vm04.stdout:5/612: write d4/d6/d80/d84/fa7 [4942917,129570] 0 2026-03-10T06:23:07.029 INFO:tasks.workunit.client.0.vm04.stdout:8/659: link df/d15/d29/da3/db8/dc1/d97/f9c df/d20/d25/d30/d55/fd0 0 2026-03-10T06:23:07.029 INFO:tasks.workunit.client.0.vm04.stdout:5/613: read d4/d11/f1f [6401853,74642] 0 2026-03-10T06:23:07.029 INFO:tasks.workunit.client.0.vm04.stdout:9/703: unlink d2/d3/d18/d39/d11/f2d 0 2026-03-10T06:23:07.031 INFO:tasks.workunit.client.0.vm04.stdout:5/614: dread - d4/d11/d7d/d38/d91/d55/fce zero size 2026-03-10T06:23:07.038 INFO:tasks.workunit.client.0.vm04.stdout:8/660: dwrite df/d20/d25/d30/d65/d8f/fb4 [0,4194304] 0 2026-03-10T06:23:07.043 INFO:tasks.workunit.client.0.vm04.stdout:0/727: symlink d0/d5/d25/dd/d5c/d73/lef 0 2026-03-10T06:23:07.049 INFO:tasks.workunit.client.0.vm04.stdout:8/661: chown df/d15/d29/da3/db8/dc1/dac/cbc 9876347 1 2026-03-10T06:23:07.049 INFO:tasks.workunit.client.0.vm04.stdout:3/644: mknod d4/da/cd7 0 2026-03-10T06:23:07.050 INFO:tasks.workunit.client.0.vm04.stdout:5/615: dwrite d4/d6/d80/d84/d99/fb3 
[0,4194304] 0 2026-03-10T06:23:07.052 INFO:tasks.workunit.client.0.vm04.stdout:1/653: mknod d0/d8/d46/d7a/d95/dc5/ced 0 2026-03-10T06:23:07.052 INFO:tasks.workunit.client.0.vm04.stdout:1/654: write d0/d8/d46/d7a/fa8 [977339,22429] 0 2026-03-10T06:23:07.053 INFO:tasks.workunit.client.0.vm04.stdout:4/674: write d2/d32/d5c/d76/dd7/d31/d42/db9/f65 [721438,90443] 0 2026-03-10T06:23:07.061 INFO:tasks.workunit.client.0.vm04.stdout:6/681: unlink d2/d43/d9b/cb1 0 2026-03-10T06:23:07.062 INFO:tasks.workunit.client.0.vm04.stdout:6/682: stat d2/d43/d2d/d30/d1f/d3c/f65 0 2026-03-10T06:23:07.074 INFO:tasks.workunit.client.0.vm04.stdout:8/662: rename df/d15/d29/c32 to df/d20/d25/d30/dc5/cd1 0 2026-03-10T06:23:07.075 INFO:tasks.workunit.client.0.vm04.stdout:5/616: chown d4/d11/c1d 320093 1 2026-03-10T06:23:07.077 INFO:tasks.workunit.client.0.vm04.stdout:7/633: mkdir d4/df/dd8/d9c/db1/dde/ddf/df1 0 2026-03-10T06:23:07.077 INFO:tasks.workunit.client.0.vm04.stdout:6/683: fdatasync d2/d43/f35 0 2026-03-10T06:23:07.077 INFO:tasks.workunit.client.0.vm04.stdout:9/704: unlink d2/d3/d18/fa0 0 2026-03-10T06:23:07.079 INFO:tasks.workunit.client.0.vm04.stdout:8/663: creat df/d15/d2b/d81/d9a/fd2 x:0 0 0 2026-03-10T06:23:07.081 INFO:tasks.workunit.client.0.vm04.stdout:1/655: mkdir d0/d3/d41/dcb/dee 0 2026-03-10T06:23:07.086 INFO:tasks.workunit.client.0.vm04.stdout:0/728: rename d0/d5/d25/dd/d5c/c6c to d0/d5/d25/dd/d5c/cf0 0 2026-03-10T06:23:07.086 INFO:tasks.workunit.client.0.vm04.stdout:9/705: rename d2/d3/d18/de9/d5a/c103 to d2/d3/d18/d39/d46/c106 0 2026-03-10T06:23:07.088 INFO:tasks.workunit.client.0.vm04.stdout:8/664: symlink df/d20/d25/d30/d55/ld3 0 2026-03-10T06:23:07.090 INFO:tasks.workunit.client.0.vm04.stdout:5/617: mkdir d4/d11/d7d/d38/d91/dda 0 2026-03-10T06:23:07.096 INFO:tasks.workunit.client.0.vm04.stdout:5/618: write d4/d11/f32 [512978,44880] 0 2026-03-10T06:23:07.107 INFO:tasks.workunit.client.0.vm04.stdout:8/665: dwrite df/d15/f1e [0,4194304] 0 2026-03-10T06:23:07.107 
INFO:tasks.workunit.client.0.vm04.stdout:6/684: rename d2/d8/cad to d2/d43/d2d/d7c/ce2 0 2026-03-10T06:23:07.109 INFO:tasks.workunit.client.0.vm04.stdout:6/685: chown d2/d43/d2d/d30/d34/f4d 105018 1 2026-03-10T06:23:07.110 INFO:tasks.workunit.client.0.vm04.stdout:5/619: dread d4/d6/d80/d84/d99/fb3 [0,4194304] 0 2026-03-10T06:23:07.115 INFO:tasks.workunit.client.0.vm04.stdout:4/675: sync 2026-03-10T06:23:07.115 INFO:tasks.workunit.client.0.vm04.stdout:0/729: mkdir d0/d5/d25/df1 0 2026-03-10T06:23:07.120 INFO:tasks.workunit.client.0.vm04.stdout:9/706: stat d2/d8/f99 0 2026-03-10T06:23:07.121 INFO:tasks.workunit.client.0.vm04.stdout:6/686: dwrite d2/d43/f3b [0,4194304] 0 2026-03-10T06:23:07.121 INFO:tasks.workunit.client.0.vm04.stdout:9/707: chown d2/d3/d18/d39/d11/da5/fd6 19273 1 2026-03-10T06:23:07.137 INFO:tasks.workunit.client.0.vm04.stdout:4/676: dwrite d2/d32/d5c/d76/dd7/d31/f66 [0,4194304] 0 2026-03-10T06:23:07.137 INFO:tasks.workunit.client.0.vm04.stdout:2/628: dwrite d1/db/fe [0,4194304] 0 2026-03-10T06:23:07.137 INFO:tasks.workunit.client.0.vm04.stdout:8/666: dread df/d15/f24 [0,4194304] 0 2026-03-10T06:23:07.139 INFO:tasks.workunit.client.0.vm04.stdout:9/708: rmdir d2/d8 39 2026-03-10T06:23:07.144 INFO:tasks.workunit.client.0.vm04.stdout:1/656: dread d0/d8/f6c [0,4194304] 0 2026-03-10T06:23:07.144 INFO:tasks.workunit.client.0.vm04.stdout:8/667: truncate df/d15/d29/f3e 1379507 0 2026-03-10T06:23:07.148 INFO:tasks.workunit.client.0.vm04.stdout:5/620: dread d4/d11/f17 [0,4194304] 0 2026-03-10T06:23:07.162 INFO:tasks.workunit.client.0.vm04.stdout:6/687: symlink d2/d43/d2d/d30/d1f/le3 0 2026-03-10T06:23:07.165 INFO:tasks.workunit.client.0.vm04.stdout:0/730: rename d0/d5/d25/cb1 to d0/d5/d25/dd/d1d/d59/cf2 0 2026-03-10T06:23:07.170 INFO:tasks.workunit.client.0.vm04.stdout:6/688: dwrite d2/d43/f31 [0,4194304] 0 2026-03-10T06:23:07.176 INFO:tasks.workunit.client.0.vm04.stdout:3/645: mkdir d4/da/df/dd8 0 2026-03-10T06:23:07.181 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:06 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:06 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:06 vm04.local ceph-mon[51058]: pgmap v32: 65 pgs: 65 active+clean; 2.8 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 47 MiB/s rd, 127 MiB/s wr, 283 op/s 2026-03-10T06:23:07.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:06 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.182 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:06 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.182 INFO:tasks.workunit.client.0.vm04.stdout:1/657: mkdir d0/d3/d41/d99/def 0 2026-03-10T06:23:07.186 INFO:tasks.workunit.client.0.vm04.stdout:9/709: dread d2/d3/d18/de9/d5a/fa6 [0,4194304] 0 2026-03-10T06:23:07.193 INFO:tasks.workunit.client.0.vm04.stdout:4/677: mknod d2/d32/d5c/d76/dd7/d31/d3f/da1/cda 0 2026-03-10T06:23:07.193 INFO:tasks.workunit.client.0.vm04.stdout:3/646: creat d4/da/df/d11/d5a/d5b/fd9 x:0 0 0 2026-03-10T06:23:07.194 INFO:tasks.workunit.client.0.vm04.stdout:7/634: dwrite d4/df/d12/d13/f4a [0,4194304] 0 2026-03-10T06:23:07.194 INFO:tasks.workunit.client.0.vm04.stdout:3/647: read - d4/da/df/d11/d5a/db3/fbb zero size 2026-03-10T06:23:07.197 INFO:tasks.workunit.client.0.vm04.stdout:8/668: sync 2026-03-10T06:23:07.197 INFO:tasks.workunit.client.0.vm04.stdout:0/731: symlink d0/d1a/db8/lf3 0 2026-03-10T06:23:07.197 INFO:tasks.workunit.client.0.vm04.stdout:9/710: mkdir d2/d3/d18/d39/d107 0 2026-03-10T06:23:07.198 INFO:tasks.workunit.client.0.vm04.stdout:4/678: read d2/d46/f15 [3102482,29354] 0 2026-03-10T06:23:07.198 INFO:tasks.workunit.client.0.vm04.stdout:7/635: fsync d4/df/d12/d13/d25/d30/d40/d79/faa 0 
2026-03-10T06:23:07.201 INFO:tasks.workunit.client.0.vm04.stdout:1/658: read d0/d8/f21 [341559,107799] 0 2026-03-10T06:23:07.202 INFO:tasks.workunit.client.0.vm04.stdout:4/679: fsync d2/d32/d5c/d76/dd7/f20 0 2026-03-10T06:23:07.203 INFO:tasks.workunit.client.0.vm04.stdout:8/669: rename df/c26 to df/d15/d29/da3/db8/dc1/d97/d67/cd4 0 2026-03-10T06:23:07.203 INFO:tasks.workunit.client.0.vm04.stdout:9/711: write d2/d3/d18/de9/da2/fc5 [4683776,110267] 0 2026-03-10T06:23:07.204 INFO:tasks.workunit.client.0.vm04.stdout:7/636: creat d4/df/d12/d13/ff2 x:0 0 0 2026-03-10T06:23:07.209 INFO:tasks.workunit.client.0.vm04.stdout:8/670: stat df/d20/f84 0 2026-03-10T06:23:07.209 INFO:tasks.workunit.client.0.vm04.stdout:4/680: creat d2/d32/d94/d99/fdb x:0 0 0 2026-03-10T06:23:07.209 INFO:tasks.workunit.client.0.vm04.stdout:3/648: getdents d4/d6/d99/d7b/d21/d32 0 2026-03-10T06:23:07.214 INFO:tasks.workunit.client.0.vm04.stdout:8/671: mknod df/d15/d2b/d81/d9a/cd5 0 2026-03-10T06:23:07.215 INFO:tasks.workunit.client.0.vm04.stdout:4/681: mkdir d2/d32/d94/d99/ddc 0 2026-03-10T06:23:07.219 INFO:tasks.workunit.client.0.vm04.stdout:8/672: chown df/c16 23 1 2026-03-10T06:23:07.219 INFO:tasks.workunit.client.0.vm04.stdout:4/682: fsync d2/d32/d5c/d76/dd7/d56/fa7 0 2026-03-10T06:23:07.220 INFO:tasks.workunit.client.0.vm04.stdout:3/649: rename d4/d6/d99/f76 to d4/dba/fda 0 2026-03-10T06:23:07.226 INFO:tasks.workunit.client.0.vm04.stdout:9/712: dread d2/d3/d18/f8f [0,4194304] 0 2026-03-10T06:23:07.235 INFO:tasks.workunit.client.0.vm04.stdout:1/659: sync 2026-03-10T06:23:07.243 INFO:tasks.workunit.client.0.vm04.stdout:8/673: link df/d20/d25/d30/d55/f95 df/d20/d25/fd6 0 2026-03-10T06:23:07.246 INFO:tasks.workunit.client.0.vm04.stdout:1/660: mknod d0/d3/d41/dcb/cf0 0 2026-03-10T06:23:07.247 INFO:tasks.workunit.client.0.vm04.stdout:0/732: chown d0/d5/d25/dd/d1d/d59/cf2 32 1 2026-03-10T06:23:07.250 INFO:tasks.workunit.client.0.vm04.stdout:8/674: write df/d20/d25/d30/f51 [3039710,22757] 0 
2026-03-10T06:23:07.250 INFO:tasks.workunit.client.0.vm04.stdout:2/629: dwrite d1/dae/d11/d14/f1d [4194304,4194304] 0 2026-03-10T06:23:07.270 INFO:tasks.workunit.client.0.vm04.stdout:5/621: dread d4/d11/d7d/d38/d91/d55/f5d [0,4194304] 0 2026-03-10T06:23:07.275 INFO:tasks.workunit.client.0.vm04.stdout:5/622: readlink d4/d6/d50/l7c 0 2026-03-10T06:23:07.277 INFO:tasks.workunit.client.0.vm04.stdout:6/689: write d2/d43/d2d/d30/d1f/d3c/d75/f59 [1889856,130049] 0 2026-03-10T06:23:07.287 INFO:tasks.workunit.client.0.vm04.stdout:3/650: dwrite d4/da/df/d11/d5a/d5b/fc7 [0,4194304] 0 2026-03-10T06:23:07.287 INFO:tasks.workunit.client.0.vm04.stdout:3/651: stat d4/d6/d99/d7b/dbd 0 2026-03-10T06:23:07.288 INFO:tasks.workunit.client.0.vm04.stdout:7/637: dwrite d4/df/d12/d13/d25/d28/f7d [0,4194304] 0 2026-03-10T06:23:07.299 INFO:tasks.workunit.client.0.vm04.stdout:0/733: creat d0/d1a/d20/dc2/ff4 x:0 0 0 2026-03-10T06:23:07.301 INFO:tasks.workunit.client.0.vm04.stdout:0/734: chown d0/d1a/d20/f8c 5664 1 2026-03-10T06:23:07.303 INFO:tasks.workunit.client.0.vm04.stdout:9/713: dwrite d2/d8/d22/ffe [0,4194304] 0 2026-03-10T06:23:07.307 INFO:tasks.workunit.client.0.vm04.stdout:9/714: write d2/fb4 [5047176,97430] 0 2026-03-10T06:23:07.311 INFO:tasks.workunit.client.0.vm04.stdout:5/623: dread d4/d6/d80/d84/d99/fb3 [0,4194304] 0 2026-03-10T06:23:07.321 INFO:tasks.workunit.client.0.vm04.stdout:5/624: readlink d4/d11/d7d/d38/d91/d4c/d98/dc0/l89 0 2026-03-10T06:23:07.325 INFO:tasks.workunit.client.0.vm04.stdout:0/735: dwrite d0/d1a/d20/dc2/fee [0,4194304] 0 2026-03-10T06:23:07.339 INFO:tasks.workunit.client.0.vm04.stdout:4/683: truncate d2/d32/d5c/d76/dd7/f20 3894992 0 2026-03-10T06:23:07.342 INFO:tasks.workunit.client.0.vm04.stdout:1/661: symlink d0/lf1 0 2026-03-10T06:23:07.342 INFO:tasks.workunit.client.0.vm04.stdout:1/662: chown d0/d8/d46/d7a/d95/dc5/cc9 185395 1 2026-03-10T06:23:07.342 INFO:tasks.workunit.client.0.vm04.stdout:1/663: chown d0/d8/d46/d7a 8556180 1 2026-03-10T06:23:07.343 
INFO:tasks.workunit.client.0.vm04.stdout:6/690: read d2/d43/d2d/d30/f7f [576897,1927] 0 2026-03-10T06:23:07.345 INFO:tasks.workunit.client.0.vm04.stdout:6/691: dread d2/d43/d2d/fcf [0,4194304] 0 2026-03-10T06:23:07.346 INFO:tasks.workunit.client.0.vm04.stdout:7/638: write d4/df/d12/d13/fc7 [121253,65511] 0 2026-03-10T06:23:07.346 INFO:tasks.workunit.client.0.vm04.stdout:3/652: fsync d4/da/df/f5e 0 2026-03-10T06:23:07.347 INFO:tasks.workunit.client.0.vm04.stdout:3/653: stat d4/d6/d92/fcd 0 2026-03-10T06:23:07.347 INFO:tasks.workunit.client.0.vm04.stdout:8/675: truncate df/d15/d29/da3/db8/dc1/dac/fc3 1644278 0 2026-03-10T06:23:07.347 INFO:tasks.workunit.client.0.vm04.stdout:8/676: fdatasync df/f17 0 2026-03-10T06:23:07.356 INFO:tasks.workunit.client.0.vm04.stdout:9/715: mkdir d2/d8/d3a/dcb/d108 0 2026-03-10T06:23:07.360 INFO:tasks.workunit.client.0.vm04.stdout:5/625: unlink d4/d11/d7d/l49 0 2026-03-10T06:23:07.362 INFO:tasks.workunit.client.0.vm04.stdout:4/684: mknod d2/d32/d5c/d76/dd7/d31/d42/cdd 0 2026-03-10T06:23:07.362 INFO:tasks.workunit.client.0.vm04.stdout:1/664: mknod d0/d8/d46/d7a/d95/dc5/cf2 0 2026-03-10T06:23:07.362 INFO:tasks.workunit.client.0.vm04.stdout:5/626: dread - d4/d6/d80/d84/f9c zero size 2026-03-10T06:23:07.362 INFO:tasks.workunit.client.0.vm04.stdout:1/665: chown d0/d3/c97 555 1 2026-03-10T06:23:07.362 INFO:tasks.workunit.client.0.vm04.stdout:4/685: write d2/d32/f7c [3510002,22] 0 2026-03-10T06:23:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:06 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:06 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:06 vm06.local ceph-mon[58974]: pgmap v32: 65 pgs: 65 active+clean; 2.8 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 47 MiB/s rd, 127 MiB/s wr, 283 op/s 2026-03-10T06:23:07.367 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:06 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:06 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:07.371 INFO:tasks.workunit.client.0.vm04.stdout:5/627: truncate d4/d11/d7d/d38/d91/d4c/fa3 4278029 0 2026-03-10T06:23:07.371 INFO:tasks.workunit.client.0.vm04.stdout:5/628: chown d4/d11/d7d/c76 370 1 2026-03-10T06:23:07.378 INFO:tasks.workunit.client.0.vm04.stdout:6/692: creat d2/d43/d2d/d30/d1f/d3c/fe4 x:0 0 0 2026-03-10T06:23:07.386 INFO:tasks.workunit.client.0.vm04.stdout:2/630: dwrite d1/dae/d2c/f4a [0,4194304] 0 2026-03-10T06:23:07.387 INFO:tasks.workunit.client.0.vm04.stdout:4/686: dread d2/d32/d94/d99/fd6 [0,4194304] 0 2026-03-10T06:23:07.401 INFO:tasks.workunit.client.0.vm04.stdout:2/631: dwrite d1/dae/d2c/d37/d59/f8b [0,4194304] 0 2026-03-10T06:23:07.402 INFO:tasks.workunit.client.0.vm04.stdout:2/632: chown d1/db/d20/d8f/d35 8283 1 2026-03-10T06:23:07.403 INFO:tasks.workunit.client.0.vm04.stdout:3/654: rmdir d4 39 2026-03-10T06:23:07.403 INFO:tasks.workunit.client.0.vm04.stdout:2/633: chown d1/f91 535781135 1 2026-03-10T06:23:07.403 INFO:tasks.workunit.client.0.vm04.stdout:8/677: creat df/d15/d29/d89/fd7 x:0 0 0 2026-03-10T06:23:07.409 INFO:tasks.workunit.client.0.vm04.stdout:1/666: mkdir d0/d8/d46/d7a/d95/df3 0 2026-03-10T06:23:07.411 INFO:tasks.workunit.client.0.vm04.stdout:1/667: chown d0/d3/c92 2042985 1 2026-03-10T06:23:07.411 INFO:tasks.workunit.client.0.vm04.stdout:5/629: creat d4/d6/d50/fdb x:0 0 0 2026-03-10T06:23:07.413 INFO:tasks.workunit.client.0.vm04.stdout:9/716: dread d2/de0/da3/f102 [0,4194304] 0 2026-03-10T06:23:07.418 INFO:tasks.workunit.client.0.vm04.stdout:4/687: write d2/d32/d94/d99/fd6 [2704315,70196] 0 2026-03-10T06:23:07.419 INFO:tasks.workunit.client.0.vm04.stdout:7/639: dwrite d4/df/d12/d21/fa4 [0,4194304] 0 2026-03-10T06:23:07.422 
INFO:tasks.workunit.client.0.vm04.stdout:8/678: readlink df/d20/d25/d30/dc5/lcf 0 2026-03-10T06:23:07.431 INFO:tasks.workunit.client.0.vm04.stdout:2/634: creat d1/db/d20/fc1 x:0 0 0 2026-03-10T06:23:07.435 INFO:tasks.workunit.client.0.vm04.stdout:0/736: rename d0/d1a/d20/d38/d31 to d0/d1a/d20/df5 0 2026-03-10T06:23:07.439 INFO:tasks.workunit.client.0.vm04.stdout:6/693: dwrite d2/d43/d2d/d30/d34/da8/fb4 [0,4194304] 0 2026-03-10T06:23:07.441 INFO:tasks.workunit.client.0.vm04.stdout:9/717: creat d2/de0/d1d/d64/d73/f109 x:0 0 0 2026-03-10T06:23:07.445 INFO:tasks.workunit.client.0.vm04.stdout:0/737: creat d0/d1a/d20/df5/d47/ff6 x:0 0 0 2026-03-10T06:23:07.445 INFO:tasks.workunit.client.0.vm04.stdout:0/738: chown d0/d1a 19316 1 2026-03-10T06:23:07.447 INFO:tasks.workunit.client.0.vm04.stdout:4/688: mkdir d2/dde 0 2026-03-10T06:23:07.447 INFO:tasks.workunit.client.0.vm04.stdout:8/679: fsync df/d20/f28 0 2026-03-10T06:23:07.449 INFO:tasks.workunit.client.0.vm04.stdout:3/655: dread - d4/da/df/d11/d50/dc8/fcb zero size 2026-03-10T06:23:07.450 INFO:tasks.workunit.client.0.vm04.stdout:3/656: dread - d4/d6/d99/d7b/d21/d32/d39/d64/f67 zero size 2026-03-10T06:23:07.453 INFO:tasks.workunit.client.0.vm04.stdout:5/630: rmdir d4/d3b/da8/dd0 0 2026-03-10T06:23:07.454 INFO:tasks.workunit.client.0.vm04.stdout:5/631: read d4/d11/d7d/d38/f3e [1623208,100905] 0 2026-03-10T06:23:07.455 INFO:tasks.workunit.client.0.vm04.stdout:1/668: dread d0/f7c [0,4194304] 0 2026-03-10T06:23:07.456 INFO:tasks.workunit.client.0.vm04.stdout:0/739: mkdir d0/d1a/db8/df7 0 2026-03-10T06:23:07.457 INFO:tasks.workunit.client.0.vm04.stdout:4/689: chown d2/d32/d5c/d76/dd7/d2c/l8b 24751 1 2026-03-10T06:23:07.458 INFO:tasks.workunit.client.0.vm04.stdout:8/680: fdatasync df/d15/d29/f7a 0 2026-03-10T06:23:07.459 INFO:tasks.workunit.client.0.vm04.stdout:2/635: rename d1/db/d20/f49 to d1/fc2 0 2026-03-10T06:23:07.460 INFO:tasks.workunit.client.0.vm04.stdout:2/636: chown d1/dae/c73 76081 1 2026-03-10T06:23:07.463 
INFO:tasks.workunit.client.0.vm04.stdout:5/632: mkdir d4/d6/dc2/ddc 0 2026-03-10T06:23:07.466 INFO:tasks.workunit.client.0.vm04.stdout:3/657: dwrite d4/da/df/d11/d50/fa9 [4194304,4194304] 0 2026-03-10T06:23:07.466 INFO:tasks.workunit.client.0.vm04.stdout:1/669: unlink d0/d3/d41/la1 0 2026-03-10T06:23:07.469 INFO:tasks.workunit.client.0.vm04.stdout:8/681: mkdir df/d15/d2b/dd8 0 2026-03-10T06:23:07.469 INFO:tasks.workunit.client.0.vm04.stdout:8/682: fsync df/f17 0 2026-03-10T06:23:07.474 INFO:tasks.workunit.client.0.vm04.stdout:9/718: rename d2/c41 to d2/d8/d3a/dcb/c10a 0 2026-03-10T06:23:07.475 INFO:tasks.workunit.client.0.vm04.stdout:5/633: rmdir d4/d6/dc2/dd3 39 2026-03-10T06:23:07.475 INFO:tasks.workunit.client.0.vm04.stdout:5/634: chown d4/d11/f7b 0 1 2026-03-10T06:23:07.477 INFO:tasks.workunit.client.0.vm04.stdout:3/658: mknod d4/da/df/d11/d5a/d5b/cdb 0 2026-03-10T06:23:07.478 INFO:tasks.workunit.client.0.vm04.stdout:7/640: sync 2026-03-10T06:23:07.480 INFO:tasks.workunit.client.0.vm04.stdout:8/683: mknod df/d15/d2b/cd9 0 2026-03-10T06:23:07.482 INFO:tasks.workunit.client.0.vm04.stdout:8/684: write df/d15/d29/d89/fd7 [418939,41038] 0 2026-03-10T06:23:07.482 INFO:tasks.workunit.client.0.vm04.stdout:8/685: dread - df/d20/d25/d87/fc6 zero size 2026-03-10T06:23:07.483 INFO:tasks.workunit.client.0.vm04.stdout:5/635: rename d4/d6/d80/d84/d99/fb3 to d4/d11/d7d/d38/d91/d55/d72/fdd 0 2026-03-10T06:23:07.484 INFO:tasks.workunit.client.0.vm04.stdout:3/659: rmdir d4/d6/d99/d7b/dbd 39 2026-03-10T06:23:07.484 INFO:tasks.workunit.client.0.vm04.stdout:7/641: symlink d4/df/dd8/d9c/lf3 0 2026-03-10T06:23:07.485 INFO:tasks.workunit.client.0.vm04.stdout:7/642: write d4/df/d12/d34/d63/f78 [2736083,125599] 0 2026-03-10T06:23:07.487 INFO:tasks.workunit.client.0.vm04.stdout:2/637: creat d1/dae/d11/fc3 x:0 0 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:5/636: mkdir d4/d11/d7d/d38/d91/d4c/d98/dc0/dde 0 2026-03-10T06:23:07.496 
INFO:tasks.workunit.client.0.vm04.stdout:1/670: creat d0/d8/ff4 x:0 0 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:6/694: sync 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:5/637: readlink d4/d3b/l45 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:6/695: write d2/d43/d2d/d30/d1f/fd8 [638342,41924] 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:7/643: fdatasync d4/df/d12/d13/d25/d30/d40/d79/faa 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:1/671: creat d0/d3/d80/ff5 x:0 0 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:5/638: unlink d4/d11/d7d/d38/l8a 0 2026-03-10T06:23:07.496 INFO:tasks.workunit.client.0.vm04.stdout:2/638: dread d1/db/d20/d8f/f25 [0,4194304] 0 2026-03-10T06:23:07.499 INFO:tasks.workunit.client.0.vm04.stdout:3/660: creat d4/d6/d99/d7b/fdc x:0 0 0 2026-03-10T06:23:07.500 INFO:tasks.workunit.client.0.vm04.stdout:5/639: creat d4/d11/d7d/d38/d91/d55/db1/fdf x:0 0 0 2026-03-10T06:23:07.503 INFO:tasks.workunit.client.0.vm04.stdout:2/639: creat d1/dae/d2c/d37/d40/fc4 x:0 0 0 2026-03-10T06:23:07.503 INFO:tasks.workunit.client.0.vm04.stdout:7/644: dread d4/df/f8a [0,4194304] 0 2026-03-10T06:23:07.516 INFO:tasks.workunit.client.0.vm04.stdout:4/690: write d2/d46/f15 [9022506,105747] 0 2026-03-10T06:23:07.516 INFO:tasks.workunit.client.0.vm04.stdout:4/691: readlink d2/d32/l36 0 2026-03-10T06:23:07.517 INFO:tasks.workunit.client.0.vm04.stdout:0/740: truncate d0/d1a/d20/fb9 713526 0 2026-03-10T06:23:07.521 INFO:tasks.workunit.client.0.vm04.stdout:9/719: write d2/de0/f27 [757386,128749] 0 2026-03-10T06:23:07.522 INFO:tasks.workunit.client.0.vm04.stdout:3/661: mknod d4/d6/d99/d7b/d21/d32/cdd 0 2026-03-10T06:23:07.527 INFO:tasks.workunit.client.0.vm04.stdout:8/686: write df/d15/d2b/f4c [1619884,67704] 0 2026-03-10T06:23:07.529 INFO:tasks.workunit.client.0.vm04.stdout:8/687: chown df/d20/d25/d30/d65/d8f/fc9 221454441 1 2026-03-10T06:23:07.532 
INFO:tasks.workunit.client.0.vm04.stdout:8/688: stat df/d20/d25/d30/d65/lc2 0 2026-03-10T06:23:07.533 INFO:tasks.workunit.client.0.vm04.stdout:6/696: dwrite d2/d3a/f56 [0,4194304] 0 2026-03-10T06:23:07.535 INFO:tasks.workunit.client.0.vm04.stdout:6/697: readlink d2/d43/d86/l96 0 2026-03-10T06:23:07.538 INFO:tasks.workunit.client.0.vm04.stdout:5/640: symlink d4/d11/d7d/d38/d91/d4c/d98/le0 0 2026-03-10T06:23:07.540 INFO:tasks.workunit.client.0.vm04.stdout:7/645: fsync d4/df/d12/d13/d25/d30/d40/d50/f5b 0 2026-03-10T06:23:07.543 INFO:tasks.workunit.client.0.vm04.stdout:2/640: read d1/dae/d11/d14/d4e/fa6 [3184674,14297] 0 2026-03-10T06:23:07.544 INFO:tasks.workunit.client.0.vm04.stdout:0/741: unlink d0/d5/d25/dd/d3a/f6a 0 2026-03-10T06:23:07.546 INFO:tasks.workunit.client.0.vm04.stdout:9/720: mknod d2/d8/d53/d6e/d89/c10b 0 2026-03-10T06:23:07.552 INFO:tasks.workunit.client.0.vm04.stdout:1/672: getdents d0/d8 0 2026-03-10T06:23:07.553 INFO:tasks.workunit.client.0.vm04.stdout:2/641: creat d1/db/d20/fc5 x:0 0 0 2026-03-10T06:23:07.557 INFO:tasks.workunit.client.0.vm04.stdout:4/692: mknod d2/d32/d5c/d76/dd7/d31/d3f/dc8/cdf 0 2026-03-10T06:23:07.559 INFO:tasks.workunit.client.0.vm04.stdout:9/721: mknod d2/de0/c10c 0 2026-03-10T06:23:07.560 INFO:tasks.workunit.client.0.vm04.stdout:6/698: dread d2/d43/d2d/d30/d34/f6d [0,4194304] 0 2026-03-10T06:23:07.560 INFO:tasks.workunit.client.0.vm04.stdout:3/662: dread d4/f49 [0,4194304] 0 2026-03-10T06:23:07.567 INFO:tasks.workunit.client.0.vm04.stdout:6/699: dwrite d2/d37/d6e/fa9 [0,4194304] 0 2026-03-10T06:23:07.568 INFO:tasks.workunit.client.0.vm04.stdout:7/646: dread d4/df/d12/d13/d25/f2f [0,4194304] 0 2026-03-10T06:23:07.569 INFO:tasks.workunit.client.0.vm04.stdout:1/673: mknod d0/d3/d41/dcb/cf6 0 2026-03-10T06:23:07.572 INFO:tasks.workunit.client.0.vm04.stdout:7/647: chown d4/df/d12/dd4 45058 1 2026-03-10T06:23:07.577 INFO:tasks.workunit.client.0.vm04.stdout:7/648: chown d4/df/d12/d34/d63/f78 1 1 2026-03-10T06:23:07.583 
INFO:tasks.workunit.client.0.vm04.stdout:4/693: dread - d2/d32/d5c/d76/dd7/d2c/f8d zero size 2026-03-10T06:23:07.584 INFO:tasks.workunit.client.0.vm04.stdout:5/641: write d4/d6/f93 [852454,81655] 0 2026-03-10T06:23:07.584 INFO:tasks.workunit.client.0.vm04.stdout:9/722: mkdir d2/de0/d1d/d64/d73/d10d 0 2026-03-10T06:23:07.585 INFO:tasks.workunit.client.0.vm04.stdout:5/642: chown d4/d11/f32 599225 1 2026-03-10T06:23:07.586 INFO:tasks.workunit.client.0.vm04.stdout:0/742: dwrite d0/d5/d25/dd/d1d/d59/d63/fa3 [0,4194304] 0 2026-03-10T06:23:07.589 INFO:tasks.workunit.client.0.vm04.stdout:8/689: creat df/fda x:0 0 0 2026-03-10T06:23:07.592 INFO:tasks.workunit.client.0.vm04.stdout:6/700: rename d2/d43/d9b/fd1 to d2/d43/fe5 0 2026-03-10T06:23:07.592 INFO:tasks.workunit.client.0.vm04.stdout:0/743: fdatasync d0/d5/d25/dd/d3a/fce 0 2026-03-10T06:23:07.593 INFO:tasks.workunit.client.0.vm04.stdout:2/642: dwrite d1/dae/d2c/f33 [0,4194304] 0 2026-03-10T06:23:07.597 INFO:tasks.workunit.client.0.vm04.stdout:6/701: read - d2/fe0 zero size 2026-03-10T06:23:07.607 INFO:tasks.workunit.client.0.vm04.stdout:6/702: dwrite d2/d43/d2d/d30/dc0/fcd [0,4194304] 0 2026-03-10T06:23:07.607 INFO:tasks.workunit.client.0.vm04.stdout:6/703: write d2/d43/d2d/d30/d34/da8/fb4 [1266699,8437] 0 2026-03-10T06:23:07.610 INFO:tasks.workunit.client.0.vm04.stdout:7/649: creat d4/df/d12/d13/d25/d30/ff4 x:0 0 0 2026-03-10T06:23:07.613 INFO:tasks.workunit.client.0.vm04.stdout:5/643: creat d4/d6/d80/d84/fe1 x:0 0 0 2026-03-10T06:23:07.618 INFO:tasks.workunit.client.0.vm04.stdout:8/690: rmdir df/d15/d29/da3 39 2026-03-10T06:23:07.619 INFO:tasks.workunit.client.0.vm04.stdout:1/674: rename d0/f2e to d0/d8/d46/db3/ff7 0 2026-03-10T06:23:07.625 INFO:tasks.workunit.client.0.vm04.stdout:0/744: creat d0/ff8 x:0 0 0 2026-03-10T06:23:07.625 INFO:tasks.workunit.client.0.vm04.stdout:2/643: creat d1/db/d69/fc6 x:0 0 0 2026-03-10T06:23:07.631 INFO:tasks.workunit.client.0.vm04.stdout:4/694: symlink d2/d32/d5c/d76/dd7/le0 0 
2026-03-10T06:23:07.651 INFO:tasks.workunit.client.0.vm04.stdout:8/691: rmdir df/d15/d2b 39 2026-03-10T06:23:07.655 INFO:tasks.workunit.client.0.vm04.stdout:0/745: mkdir d0/d5/d25/dd/d3a/d81/df9 0 2026-03-10T06:23:07.655 INFO:tasks.workunit.client.0.vm04.stdout:8/692: dread - df/fda zero size 2026-03-10T06:23:07.660 INFO:tasks.workunit.client.0.vm04.stdout:4/695: creat d2/d32/d5c/d76/dd7/d31/d42/db9/fe1 x:0 0 0 2026-03-10T06:23:07.662 INFO:tasks.workunit.client.0.vm04.stdout:4/696: write d2/d32/d5c/f6a [585198,93392] 0 2026-03-10T06:23:07.662 INFO:tasks.workunit.client.0.vm04.stdout:9/723: creat d2/d23/d94/f10e x:0 0 0 2026-03-10T06:23:07.663 INFO:tasks.workunit.client.0.vm04.stdout:2/644: sync 2026-03-10T06:23:07.664 INFO:tasks.workunit.client.0.vm04.stdout:1/675: creat d0/d3/d41/d99/def/ff8 x:0 0 0 2026-03-10T06:23:07.665 INFO:tasks.workunit.client.0.vm04.stdout:9/724: write d2/d3/d18/de9/d5a/fee [372768,85399] 0 2026-03-10T06:23:07.666 INFO:tasks.workunit.client.0.vm04.stdout:8/693: readlink df/d20/l6a 0 2026-03-10T06:23:07.686 INFO:tasks.workunit.client.0.vm04.stdout:6/704: getdents d2/d8/d78 0 2026-03-10T06:23:07.689 INFO:tasks.workunit.client.0.vm04.stdout:2/645: fdatasync d1/db/d20/d8f/d35/d54/d5d/f93 0 2026-03-10T06:23:07.690 INFO:tasks.workunit.client.0.vm04.stdout:9/725: symlink d2/d3/d18/d39/d11/l10f 0 2026-03-10T06:23:07.691 INFO:tasks.workunit.client.0.vm04.stdout:9/726: chown d2/d8/d53/d6e/c6f 306288 1 2026-03-10T06:23:07.692 INFO:tasks.workunit.client.0.vm04.stdout:3/663: write d4/d6/d99/d7b/f2b [1169072,70462] 0 2026-03-10T06:23:07.710 INFO:tasks.workunit.client.0.vm04.stdout:7/650: dread d4/df/f56 [0,4194304] 0 2026-03-10T06:23:07.725 INFO:tasks.workunit.client.0.vm04.stdout:1/676: symlink d0/d8/d46/db3/lf9 0 2026-03-10T06:23:07.726 INFO:tasks.workunit.client.0.vm04.stdout:6/705: mkdir d2/d37/d6e/de6 0 2026-03-10T06:23:07.726 INFO:tasks.workunit.client.0.vm04.stdout:9/727: dwrite d2/d3/d18/de9/d5a/fee [0,4194304] 0 2026-03-10T06:23:07.753 
INFO:tasks.workunit.client.0.vm04.stdout:5/644: dwrite d4/d11/d7d/d38/d91/d55/f7a [0,4194304] 0 2026-03-10T06:23:07.767 INFO:tasks.workunit.client.0.vm04.stdout:8/694: rename df/d20/d25/c61 to df/d15/d29/da3/cdb 0 2026-03-10T06:23:07.781 INFO:tasks.workunit.client.0.vm04.stdout:5/645: creat d4/d6/d80/d84/fe2 x:0 0 0 2026-03-10T06:23:07.786 INFO:tasks.workunit.client.0.vm04.stdout:0/746: dwrite d0/d5/d25/dd/d1d/f26 [0,4194304] 0 2026-03-10T06:23:07.794 INFO:tasks.workunit.client.0.vm04.stdout:2/646: mknod d1/dae/d11/d14/cc7 0 2026-03-10T06:23:07.803 INFO:tasks.workunit.client.0.vm04.stdout:9/728: rename d2/d8/d53/fc8 to d2/d3/d18/d39/d46/f110 0 2026-03-10T06:23:07.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.806+0000 7fc91ce10700 1 -- 192.168.123.104:0/12245446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a5450 msgr2=0x7fc9100a58c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:07.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.806+0000 7fc91ce10700 1 --2- 192.168.123.104:0/12245446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a5450 0x7fc9100a58c0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fc904009b00 tx=0x7fc904009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:07.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.809+0000 7fc91ce10700 1 -- 192.168.123.104:0/12245446 shutdown_connections 2026-03-10T06:23:07.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.809+0000 7fc91ce10700 1 --2- 192.168.123.104:0/12245446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a5450 0x7fc9100a58c0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:07.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.809+0000 7fc91ce10700 1 --2- 192.168.123.104:0/12245446 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9100a4310 0x7fc9100a4720 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:07.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.809+0000 7fc91ce10700 1 -- 192.168.123.104:0/12245446 >> 192.168.123.104:0/12245446 conn(0x7fc91009f7e0 msgr2=0x7fc9100a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:07.814 INFO:tasks.workunit.client.0.vm04.stdout:4/697: truncate d2/d32/d5c/d76/dd7/d56/fa7 114803 0 2026-03-10T06:23:07.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.817+0000 7fc91ce10700 1 -- 192.168.123.104:0/12245446 shutdown_connections 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 -- 192.168.123.104:0/12245446 wait complete. 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 Processor -- start 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 -- start start 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9100a5450 0x7fc9100b3b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9100b4180 con 0x7fc9100a5450 2026-03-10T06:23:07.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.818+0000 7fc91ce10700 1 -- --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc91013ee20 con 0x7fc9100a4310 2026-03-10T06:23:07.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.819+0000 7fc9177fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:07.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.819+0000 7fc9177fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:47998/0 (socket says 192.168.123.104:47998) 2026-03-10T06:23:07.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.819+0000 7fc9177fe700 1 -- 192.168.123.104:0/4246774702 learned_addr learned my addr 192.168.123.104:0/4246774702 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:07.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.819+0000 7fc916ffd700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9100a5450 0x7fc9100b3b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:07.821 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.821+0000 7fc9177fe700 1 -- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9100a5450 msgr2=0x7fc9100b3b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:07.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.821+0000 7fc9177fe700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fc9100a5450 0x7fc9100b3b40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:07.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.821+0000 7fc9177fe700 1 -- 192.168.123.104:0/4246774702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9040097e0 con 0x7fc9100a4310 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.822+0000 7fc9177fe700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fc90c00b770 tx=0x7fc90c00ba80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.822+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc90c010840 con 0x7fc9100a4310 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.823+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc91013f100 con 0x7fc9100a4310 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.823+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc91013f650 con 0x7fc9100a4310 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.823+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc90c010e80 con 0x7fc9100a4310 2026-03-10T06:23:07.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.823+0000 7fc914ff9700 1 -- 
192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc90c00d590 con 0x7fc9100a4310 2026-03-10T06:23:07.825 INFO:tasks.workunit.client.0.vm04.stdout:6/706: dwrite d2/d37/d6e/f82 [0,4194304] 0 2026-03-10T06:23:07.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.827+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc90c0109a0 con 0x7fc9100a4310 2026-03-10T06:23:07.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.828+0000 7fc914ff9700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 0x7fc908079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:07.832 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.830+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fc90c098f70 con 0x7fc9100a4310 2026-03-10T06:23:07.832 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.830+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc8fc005320 con 0x7fc9100a4310 2026-03-10T06:23:07.832 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.832+0000 7fc916ffd700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 0x7fc908079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:07.833 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.833+0000 7fc916ffd700 1 --2- 192.168.123.104:0/4246774702 >> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 0x7fc908079d00 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc9100b46c0 tx=0x7fc90401a040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:07.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:07.834+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fc90c061b00 con 0x7fc9100a4310 2026-03-10T06:23:07.834 INFO:tasks.workunit.client.0.vm04.stdout:0/747: symlink d0/d1a/d20/df5/d79/lfa 0 2026-03-10T06:23:07.835 INFO:tasks.workunit.client.0.vm04.stdout:7/651: write d4/df/d12/d13/d25/d30/d40/f52 [2937195,71991] 0 2026-03-10T06:23:07.838 INFO:tasks.workunit.client.0.vm04.stdout:2/647: mknod d1/db/d69/cc8 0 2026-03-10T06:23:07.843 INFO:tasks.workunit.client.0.vm04.stdout:1/677: rename d0/d3/l30 to d0/d3/d41/dcb/dee/lfa 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:1/678: dread d0/d8/d46/f93 [0,4194304] 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:1/679: write d0/d3/f33 [4230741,74370] 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:1/680: write d0/d3/fd4 [617568,89062] 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:9/729: fdatasync d2/d3/d18/de9/d5a/fa7 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:9/730: write d2/d3/d18/d34/ffa [148674,41610] 0 2026-03-10T06:23:07.850 INFO:tasks.workunit.client.0.vm04.stdout:9/731: fdatasync d2/d3/f57 0 2026-03-10T06:23:07.854 INFO:tasks.workunit.client.0.vm04.stdout:6/707: truncate d2/d8/f11 3638318 0 2026-03-10T06:23:07.857 INFO:tasks.workunit.client.0.vm04.stdout:5/646: creat d4/d6/dc2/dd3/fe3 x:0 0 0 2026-03-10T06:23:07.859 INFO:tasks.workunit.client.0.vm04.stdout:0/748: mkdir d0/d5/d25/dd/d1d/d59/d63/dfb 0 
2026-03-10T06:23:07.861 INFO:tasks.workunit.client.0.vm04.stdout:2/648: stat d1/db/d20/d8f/f53 0 2026-03-10T06:23:07.864 INFO:tasks.workunit.client.0.vm04.stdout:1/681: sync 2026-03-10T06:23:07.868 INFO:tasks.workunit.client.0.vm04.stdout:8/695: rename df/d20/d25/d30/f6b to df/d15/d29/da3/db8/fdc 0 2026-03-10T06:23:07.887 INFO:tasks.workunit.client.0.vm04.stdout:4/698: mkdir d2/d32/d5c/de2 0 2026-03-10T06:23:07.935 INFO:tasks.workunit.client.0.vm04.stdout:7/652: mknod d4/df/d12/d13/db3/ded/cf5 0 2026-03-10T06:23:07.971 INFO:tasks.workunit.client.0.vm04.stdout:0/749: rename d0/d5/d25/l5d to d0/lfc 0 2026-03-10T06:23:07.971 INFO:tasks.workunit.client.0.vm04.stdout:0/750: readlink d0/d5/d25/ld7 0 2026-03-10T06:23:07.982 INFO:tasks.workunit.client.0.vm04.stdout:8/696: mknod df/d20/d25/d73/cdd 0 2026-03-10T06:23:07.983 INFO:tasks.workunit.client.0.vm04.stdout:4/699: mkdir d2/d32/d94/d99/de3 0 2026-03-10T06:23:07.986 INFO:tasks.workunit.client.0.vm04.stdout:9/732: getdents d2/d3/d18/ddd 0 2026-03-10T06:23:07.987 INFO:tasks.workunit.client.0.vm04.stdout:9/733: chown d2/d3/d18/d39/d46/fbc 1869 1 2026-03-10T06:23:07.987 INFO:tasks.workunit.client.0.vm04.stdout:9/734: stat d2/d3/d18/d39/d107 0 2026-03-10T06:23:07.990 INFO:tasks.workunit.client.0.vm04.stdout:3/664: link d4/d6/d99/d7b/d21/d32/cdd d4/da/df/cde 0 2026-03-10T06:23:08.000 INFO:tasks.workunit.client.0.vm04.stdout:7/653: unlink d4/df/d12/d13/l3c 0 2026-03-10T06:23:08.025 INFO:tasks.workunit.client.0.vm04.stdout:6/708: write d2/d43/d2d/d30/d1f/f3f [903455,76012] 0 2026-03-10T06:23:08.027 INFO:tasks.workunit.client.0.vm04.stdout:5/647: link d4/d11/d7d/f5b d4/d11/d7d/d38/d91/dda/fe4 0 2026-03-10T06:23:08.035 INFO:tasks.workunit.client.0.vm04.stdout:6/709: truncate d2/d3a/d5e/f99 833333 0 2026-03-10T06:23:08.035 INFO:tasks.workunit.client.0.vm04.stdout:2/649: write d1/dae/d11/f7e [1115077,108862] 0 2026-03-10T06:23:08.035 INFO:tasks.workunit.client.0.vm04.stdout:1/682: truncate d0/d8/f43 310211 0 2026-03-10T06:23:08.036 
INFO:tasks.workunit.client.0.vm04.stdout:1/683: dread d0/d3/d41/f75 [0,4194304] 0 2026-03-10T06:23:08.039 INFO:tasks.workunit.client.0.vm04.stdout:1/684: dwrite d0/d8/d46/d7a/fa8 [0,4194304] 0 2026-03-10T06:23:08.045 INFO:tasks.workunit.client.0.vm04.stdout:4/700: dwrite d2/d32/d5c/d76/dd7/d31/d3f/d93/fb4 [0,4194304] 0 2026-03-10T06:23:08.049 INFO:tasks.workunit.client.0.vm04.stdout:5/648: mkdir d4/d6/d80/de5 0 2026-03-10T06:23:08.049 INFO:tasks.workunit.client.0.vm04.stdout:5/649: chown d4/d11/d7d/f31 6 1 2026-03-10T06:23:08.049 INFO:tasks.workunit.client.0.vm04.stdout:5/650: chown d4/d11/d7d/f31 18 1 2026-03-10T06:23:08.064 INFO:tasks.workunit.client.0.vm04.stdout:7/654: write d4/df/d12/d13/d25/d30/d40/f75 [13943,78868] 0 2026-03-10T06:23:08.067 INFO:tasks.workunit.client.0.vm04.stdout:0/751: truncate d0/d5/d25/dd/d5c/f8f 2298142 0 2026-03-10T06:23:08.067 INFO:tasks.workunit.client.0.vm04.stdout:9/735: truncate d2/d3/d18/d39/d46/f110 1629016 0 2026-03-10T06:23:08.072 INFO:tasks.workunit.client.0.vm04.stdout:9/736: dwrite d2/d3/d18/de9/d5a/fee [0,4194304] 0 2026-03-10T06:23:08.072 INFO:tasks.workunit.client.0.vm04.stdout:2/650: chown d1/db/d69/f77 2033424 1 2026-03-10T06:23:08.094 INFO:tasks.workunit.client.0.vm04.stdout:5/651: mkdir d4/d6/d80/d84/d99/de6 0 2026-03-10T06:23:08.096 INFO:tasks.workunit.client.0.vm04.stdout:7/655: mkdir d4/df/d12/d13/d25/d30/d40/d50/df6 0 2026-03-10T06:23:08.097 INFO:tasks.workunit.client.0.vm04.stdout:8/697: getdents df 0 2026-03-10T06:23:08.104 INFO:tasks.workunit.client.0.vm04.stdout:0/752: creat d0/d5/d25/dd/d1d/d9c/dbf/ffd x:0 0 0 2026-03-10T06:23:08.105 INFO:tasks.workunit.client.0.vm04.stdout:2/651: mknod d1/db/d69/d74/d87/cc9 0 2026-03-10T06:23:08.106 INFO:tasks.workunit.client.0.vm04.stdout:2/652: truncate d1/dae/d11/f7e 2628613 0 2026-03-10T06:23:08.114 INFO:tasks.workunit.client.0.vm04.stdout:2/653: dwrite d1/db/d69/fc6 [0,4194304] 0 2026-03-10T06:23:08.119 INFO:tasks.workunit.client.0.vm04.stdout:9/737: dread 
d2/de0/d1d/f6a [0,4194304] 0 2026-03-10T06:23:08.119 INFO:tasks.workunit.client.0.vm04.stdout:9/738: read d2/d3/d18/de9/d5a/fee [1071181,63601] 0 2026-03-10T06:23:08.125 INFO:tasks.workunit.client.0.vm04.stdout:1/685: link d0/d8/fe1 d0/d3/d41/dcb/ffb 0 2026-03-10T06:23:08.131 INFO:tasks.workunit.client.0.vm04.stdout:3/665: dwrite d4/d6/d99/d7b/d21/d32/d4e/d8f/fb0 [0,4194304] 0 2026-03-10T06:23:08.132 INFO:tasks.workunit.client.0.vm04.stdout:3/666: readlink d4/d6/d38/l8a 0 2026-03-10T06:23:08.133 INFO:tasks.workunit.client.0.vm04.stdout:3/667: fsync d4/f42 0 2026-03-10T06:23:08.153 INFO:tasks.workunit.client.0.vm04.stdout:6/710: dwrite d2/d43/d2d/d30/d1f/d3c/f6a [0,4194304] 0 2026-03-10T06:23:08.167 INFO:tasks.workunit.client.0.vm04.stdout:5/652: rename d4/d11/d7d/d38/d91/ca9 to d4/d6/d80/dd9/ce7 0 2026-03-10T06:23:08.183 INFO:tasks.workunit.client.0.vm04.stdout:2/654: mkdir d1/dae/d2c/d37/dca 0 2026-03-10T06:23:08.187 INFO:tasks.workunit.client.0.vm04.stdout:7/656: write d4/fa [398406,102883] 0 2026-03-10T06:23:08.189 INFO:tasks.workunit.client.0.vm04.stdout:7/657: truncate d4/df/d12/d13/d25/d28/d3a/d58/fcc 4584694 0 2026-03-10T06:23:08.189 INFO:tasks.workunit.client.0.vm04.stdout:7/658: write d4/df/dd8/f64 [4542360,41536] 0 2026-03-10T06:23:08.195 INFO:tasks.workunit.client.0.vm04.stdout:1/686: creat d0/d3/d41/dc2/ffc x:0 0 0 2026-03-10T06:23:08.201 INFO:tasks.workunit.client.0.vm04.stdout:6/711: creat d2/d43/fe7 x:0 0 0 2026-03-10T06:23:08.208 INFO:tasks.workunit.client.0.vm04.stdout:6/712: dread d2/d43/d2d/d30/d1f/d3c/d85/fd0 [0,4194304] 0 2026-03-10T06:23:08.209 INFO:tasks.workunit.client.0.vm04.stdout:8/698: rename df/d15/c78 to df/d20/d25/d87/cde 0 2026-03-10T06:23:08.221 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.219+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7fc8fc000bf0 con 0x7fc908077850 2026-03-10T06:23:08.222 INFO:tasks.workunit.client.0.vm04.stdout:7/659: dwrite d4/df/d12/d13/d25/d30/d40/d79/f89 [4194304,4194304] 0 2026-03-10T06:23:08.223 INFO:tasks.workunit.client.0.vm04.stdout:7/660: fdatasync d4/df/d12/d13/d25/d28/d3a/d58/fcc 0 2026-03-10T06:23:08.225 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.225+0000 7fc914ff9700 1 -- 192.168.123.104:0/4246774702 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+367 (secure 0 0 0) 0x7fc8fc000bf0 con 0x7fc908077850 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 msgr2=0x7fc908079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 0x7fc908079d00 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc9100b46c0 tx=0x7fc90401a040 comp rx=0 tx=0).stop 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 msgr2=0x7fc9100b35e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fc90c00b770 tx=0x7fc90c00ba80 comp rx=0 tx=0).stop 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 
1 -- 192.168.123.104:0/4246774702 shutdown_connections 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9100a4310 0x7fc9100b35e0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc908077850 0x7fc908079d00 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 --2- 192.168.123.104:0/4246774702 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc9100a5450 0x7fc9100b3b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 >> 192.168.123.104:0/4246774702 conn(0x7fc91009f7e0 msgr2=0x7fc9100a8680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.230+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 shutdown_connections 2026-03-10T06:23:08.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.231+0000 7fc91ce10700 1 -- 192.168.123.104:0/4246774702 wait complete. 
2026-03-10T06:23:08.235 INFO:tasks.workunit.client.0.vm04.stdout:7/661: dread d4/df/d12/d13/d25/d30/d40/d79/fab [0,4194304] 0 2026-03-10T06:23:08.237 INFO:tasks.workunit.client.0.vm04.stdout:4/701: getdents d2/d32 0 2026-03-10T06:23:08.241 INFO:tasks.workunit.client.0.vm04.stdout:6/713: symlink d2/d8/le8 0 2026-03-10T06:23:08.251 INFO:tasks.workunit.client.0.vm04.stdout:9/739: write d2/d3/d18/d39/d46/f110 [2518784,28930] 0 2026-03-10T06:23:08.252 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:23:08.253 INFO:tasks.workunit.client.0.vm04.stdout:5/653: dwrite d4/fb0 [0,4194304] 0 2026-03-10T06:23:08.259 INFO:tasks.workunit.client.0.vm04.stdout:8/699: write df/f77 [4283357,7903] 0 2026-03-10T06:23:08.259 INFO:tasks.workunit.client.0.vm04.stdout:0/753: rename d0/d1a/d20/fcb to d0/d5/d25/dd/d92/ffe 0 2026-03-10T06:23:08.267 INFO:tasks.workunit.client.0.vm04.stdout:7/662: read d4/fa2 [2760297,14923] 0 2026-03-10T06:23:08.270 INFO:tasks.workunit.client.0.vm04.stdout:6/714: truncate d2/d43/d2d/d30/d34/f4d 2725467 0 2026-03-10T06:23:08.277 INFO:tasks.workunit.client.0.vm04.stdout:2/655: rmdir d1/dae/d2c/d37/d59/db2 0 2026-03-10T06:23:08.281 INFO:tasks.workunit.client.0.vm04.stdout:3/668: rename d4/d6/d99/d7b to d4/da/df/d11/d5a/d5b/ddf 0 2026-03-10T06:23:08.294 INFO:tasks.workunit.client.0.vm04.stdout:1/687: link d0/d3/d80/lbd d0/d8/d46/db3/dd2/lfd 0 2026-03-10T06:23:08.296 INFO:tasks.workunit.client.0.vm04.stdout:0/754: sync 2026-03-10T06:23:08.296 INFO:tasks.workunit.client.0.vm04.stdout:0/755: chown d0/ff8 263 1 2026-03-10T06:23:08.303 INFO:tasks.workunit.client.0.vm04.stdout:4/702: dwrite d2/d32/d5c/d76/dd7/d56/f7f [0,4194304] 0 2026-03-10T06:23:08.305 INFO:tasks.workunit.client.0.vm04.stdout:4/703: read - d2/d32/d5c/d76/dd7/da3/fd5 zero size 2026-03-10T06:23:08.311 INFO:tasks.workunit.client.0.vm04.stdout:9/740: write d2/d8/d53/d6e/d89/fba [681634,39367] 0 2026-03-10T06:23:08.314 INFO:tasks.workunit.client.0.vm04.stdout:8/700: dwrite df/f46 [0,4194304] 0 
2026-03-10T06:23:08.331 INFO:tasks.workunit.client.0.vm04.stdout:2/656: dwrite d1/fc2 [0,4194304] 0 2026-03-10T06:23:08.332 INFO:tasks.workunit.client.0.vm04.stdout:2/657: write d1/dae/d11/f7e [3413112,5163] 0 2026-03-10T06:23:08.344 INFO:tasks.workunit.client.0.vm04.stdout:3/669: chown d4/d6/l74 269 1 2026-03-10T06:23:08.345 INFO:tasks.workunit.client.0.vm04.stdout:1/688: symlink d0/d8/d46/db3/dd2/lfe 0 2026-03-10T06:23:08.348 INFO:tasks.workunit.client.0.vm04.stdout:4/704: creat d2/d32/d5c/d98/fe4 x:0 0 0 2026-03-10T06:23:08.350 INFO:tasks.workunit.client.0.vm04.stdout:0/756: dread d0/d1a/d20/d38/fb4 [0,4194304] 0 2026-03-10T06:23:08.355 INFO:tasks.workunit.client.0.vm04.stdout:8/701: dwrite df/d20/f5e [0,4194304] 0 2026-03-10T06:23:08.355 INFO:tasks.workunit.client.0.vm04.stdout:2/658: sync 2026-03-10T06:23:08.365 INFO:tasks.workunit.client.0.vm04.stdout:2/659: dwrite d1/dae/d2c/d37/d59/f8b [0,4194304] 0 2026-03-10T06:23:08.367 INFO:tasks.workunit.client.0.vm04.stdout:6/715: unlink d2/d43/d2d/d30/d34/ca7 0 2026-03-10T06:23:08.382 INFO:tasks.workunit.client.0.vm04.stdout:1/689: unlink d0/l7 0 2026-03-10T06:23:08.384 INFO:tasks.workunit.client.0.vm04.stdout:7/663: creat d4/df/dd8/d9c/db1/ff7 x:0 0 0 2026-03-10T06:23:08.386 INFO:tasks.workunit.client.0.vm04.stdout:4/705: mknod d2/d8/ce5 0 2026-03-10T06:23:08.389 INFO:tasks.workunit.client.0.vm04.stdout:0/757: truncate d0/d5/d25/dd/d5c/d73/fae 377844 0 2026-03-10T06:23:08.397 INFO:tasks.workunit.client.0.vm04.stdout:2/660: creat d1/db/d20/d8f/d48/d67/fcb x:0 0 0 2026-03-10T06:23:08.403 INFO:tasks.workunit.client.0.vm04.stdout:5/654: rename d4/d3b/l45 to d4/d6/le8 0 2026-03-10T06:23:08.406 INFO:tasks.workunit.client.0.vm04.stdout:3/670: unlink f0 0 2026-03-10T06:23:08.412 INFO:tasks.workunit.client.0.vm04.stdout:1/690: mknod d0/d3/d41/d99/cff 0 2026-03-10T06:23:08.418 INFO:tasks.workunit.client.0.vm04.stdout:7/664: mknod d4/df/d12/d13/d25/d28/d3a/d58/cf8 0 2026-03-10T06:23:08.421 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 -- 192.168.123.104:0/494431276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c072360 msgr2=0x7fc07c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 --2- 192.168.123.104:0/494431276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c072360 0x7fc07c0770e0 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7fc07400b3a0 tx=0x7fc07400b6b0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 -- 192.168.123.104:0/494431276 shutdown_connections 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 --2- 192.168.123.104:0/494431276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c072360 0x7fc07c0770e0 unknown :-1 s=CLOSED pgs=336 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 --2- 192.168.123.104:0/494431276 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c071980 0x7fc07c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 -- 192.168.123.104:0/494431276 >> 192.168.123.104:0/494431276 conn(0x7fc07c06d1a0 msgr2=0x7fc07c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.418+0000 7fc0820b0700 1 -- 192.168.123.104:0/494431276 shutdown_connections 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.419+0000 7fc0820b0700 1 -- 192.168.123.104:0/494431276 wait complete. 
2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 Processor -- start 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 -- start start 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 0x7fc07c082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 0x7fc07c082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc07c1b2a90 con 0x7fc07c071980 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc0820b0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc07c1b2bd0 con 0x7fc07c082a70 2026-03-10T06:23:08.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc07affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 0x7fc07c082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:08.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc07affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 0x7fc07c082ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:48020/0 (socket says 192.168.123.104:48020) 2026-03-10T06:23:08.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.420+0000 7fc07affd700 1 -- 192.168.123.104:0/3972149144 learned_addr learned my addr 192.168.123.104:0/3972149144 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:08.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.421+0000 7fc07b7fe700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 0x7fc07c082530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:08.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.421+0000 7fc07b7fe700 1 -- 192.168.123.104:0/3972149144 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 msgr2=0x7fc07c082ee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.421+0000 7fc07b7fe700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 0x7fc07c082ee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.421+0000 7fc07b7fe700 1 -- 192.168.123.104:0/3972149144 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc07400b050 con 0x7fc07c071980 2026-03-10T06:23:08.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.422+0000 7fc07b7fe700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 0x7fc07c082530 secure :-1 s=READY pgs=337 cs=0 l=1 rev1=1 crypto rx=0x7fc06c007ae0 tx=0x7fc06c007df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:23:08.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.423+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc06c013070 con 0x7fc07c071980 2026-03-10T06:23:08.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.423+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc06c00e490 con 0x7fc07c071980 2026-03-10T06:23:08.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.423+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc06c00c850 con 0x7fc07c071980 2026-03-10T06:23:08.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.423+0000 7fc0820b0700 1 -- 192.168.123.104:0/3972149144 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc07c1b2d70 con 0x7fc07c071980 2026-03-10T06:23:08.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.425+0000 7fc0820b0700 1 -- 192.168.123.104:0/3972149144 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc07c1b3230 con 0x7fc07c071980 2026-03-10T06:23:08.431 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.430+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc06c00c9b0 con 0x7fc07c071980 2026-03-10T06:23:08.431 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.431+0000 7fc078ff9700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 0x7fc064079b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:08.431 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.431+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fc06c09dce0 con 0x7fc07c071980 2026-03-10T06:23:08.432 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.431+0000 7fc0820b0700 1 -- 192.168.123.104:0/3972149144 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc068005320 con 0x7fc07c071980 2026-03-10T06:23:08.432 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.431+0000 7fc07affd700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 0x7fc064079b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:08.435 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.435+0000 7fc07affd700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 0x7fc064079b60 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fc07400bb30 tx=0x7fc074014040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:08.436 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.436+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fc06c0668b0 con 0x7fc07c071980 2026-03-10T06:23:08.444 INFO:tasks.workunit.client.0.vm04.stdout:0/758: dwrite d0/fb0 [0,4194304] 0 2026-03-10T06:23:08.462 INFO:tasks.workunit.client.0.vm04.stdout:1/691: mkdir d0/d8/d46/db3/dd2/d100 0 2026-03-10T06:23:08.462 INFO:tasks.workunit.client.0.vm04.stdout:7/665: readlink 
d4/df/d12/d13/d25/d28/l31 0 2026-03-10T06:23:08.465 INFO:tasks.workunit.client.0.vm04.stdout:9/741: getdents d2/d3/d18/d39/d46/d55 0 2026-03-10T06:23:08.474 INFO:tasks.workunit.client.0.vm04.stdout:0/759: rename d0/d5/d25/dd/d1d to d0/d5/d97/dc0/dd8/dff 0 2026-03-10T06:23:08.477 INFO:tasks.workunit.client.0.vm04.stdout:8/702: link df/d15/d29/da3/fa7 df/d15/d2b/d8a/dab/fdf 0 2026-03-10T06:23:08.478 INFO:tasks.workunit.client.0.vm04.stdout:0/760: fdatasync d0/d5/d25/dd/f13 0 2026-03-10T06:23:08.479 INFO:tasks.workunit.client.0.vm04.stdout:0/761: dread - d0/d5/d97/dc0/dd8/dff/d59/fed zero size 2026-03-10T06:23:08.482 INFO:tasks.workunit.client.0.vm04.stdout:6/716: creat d2/d43/d2d/d30/d34/dae/fe9 x:0 0 0 2026-03-10T06:23:08.485 INFO:tasks.workunit.client.0.vm04.stdout:9/742: mknod d2/d8/d3a/dcb/d108/c111 0 2026-03-10T06:23:08.486 INFO:tasks.workunit.client.0.vm04.stdout:8/703: creat df/d20/d25/fe0 x:0 0 0 2026-03-10T06:23:08.487 INFO:tasks.workunit.client.0.vm04.stdout:8/704: fdatasync df/d15/d2b/d81/fc4 0 2026-03-10T06:23:08.494 INFO:tasks.workunit.client.0.vm04.stdout:2/661: write d1/dae/f63 [1458173,4853] 0 2026-03-10T06:23:08.498 INFO:tasks.workunit.client.0.vm04.stdout:5/655: write d4/d6/d37/f7e [1158792,102328] 0 2026-03-10T06:23:08.499 INFO:tasks.workunit.client.0.vm04.stdout:5/656: write d4/d11/d7d/d38/d91/d4c/f88 [4808146,28261] 0 2026-03-10T06:23:08.503 INFO:tasks.workunit.client.0.vm04.stdout:3/671: write d4/da/df/d11/d5a/d5b/ddf/d21/f3a [2862154,84455] 0 2026-03-10T06:23:08.509 INFO:tasks.workunit.client.0.vm04.stdout:4/706: truncate d2/d32/d5c/f6a 141847 0 2026-03-10T06:23:08.510 INFO:tasks.workunit.client.0.vm04.stdout:4/707: write d2/d46/fa8 [5153433,53610] 0 2026-03-10T06:23:08.515 INFO:tasks.workunit.client.0.vm04.stdout:7/666: write d4/df/d12/f18 [2615828,86222] 0 2026-03-10T06:23:08.517 INFO:tasks.workunit.client.0.vm04.stdout:1/692: write d0/d3/f50 [447325,45562] 0 2026-03-10T06:23:08.523 INFO:tasks.workunit.client.0.vm04.stdout:6/717: dread - 
d2/d43/d2d/d30/f91 zero size 2026-03-10T06:23:08.529 INFO:tasks.workunit.client.0.vm04.stdout:2/662: rename d1/dae/d11/d14/d4e/l85 to d1/db/d69/lcc 0 2026-03-10T06:23:08.534 INFO:tasks.workunit.client.0.vm04.stdout:5/657: creat d4/d6/d80/d84/d99/fe9 x:0 0 0 2026-03-10T06:23:08.538 INFO:tasks.workunit.client.0.vm04.stdout:4/708: truncate d2/d32/d5c/d4f/f60 351141 0 2026-03-10T06:23:08.540 INFO:tasks.workunit.client.0.vm04.stdout:7/667: truncate d4/df/d12/faf 332243 0 2026-03-10T06:23:08.541 INFO:tasks.workunit.client.0.vm04.stdout:6/718: sync 2026-03-10T06:23:08.544 INFO:tasks.workunit.client.0.vm04.stdout:7/668: dwrite d4/df/dd8/d9c/db1/fbe [0,4194304] 0 2026-03-10T06:23:08.546 INFO:tasks.workunit.client.0.vm04.stdout:1/693: dread - d0/d8/d46/fb7 zero size 2026-03-10T06:23:08.547 INFO:tasks.workunit.client.0.vm04.stdout:0/762: symlink d0/d1a/d20/df5/de9/l100 0 2026-03-10T06:23:08.554 INFO:tasks.workunit.client.0.vm04.stdout:0/763: dwrite d0/d5/d25/dd/d3a/f50 [0,4194304] 0 2026-03-10T06:23:08.559 INFO:tasks.workunit.client.0.vm04.stdout:8/705: mkdir df/d15/d2b/d81/de1 0 2026-03-10T06:23:08.559 INFO:tasks.workunit.client.0.vm04.stdout:8/706: chown df/d15/d29/da3/faa 15586 1 2026-03-10T06:23:08.566 INFO:tasks.workunit.client.0.vm04.stdout:7/669: dread d4/f6 [0,4194304] 0 2026-03-10T06:23:08.570 INFO:tasks.workunit.client.0.vm04.stdout:7/670: fdatasync d4/df/d12/d13/d25/d28/f7d 0 2026-03-10T06:23:08.582 INFO:tasks.workunit.client.0.vm04.stdout:3/672: creat d4/d6/d91/da1/fe0 x:0 0 0 2026-03-10T06:23:08.582 INFO:tasks.workunit.client.0.vm04.stdout:3/673: stat d4/d6/d91/da1 0 2026-03-10T06:23:08.586 INFO:tasks.workunit.client.0.vm04.stdout:6/719: truncate d2/d43/d2d/d30/d1f/d3c/d75/f92 1006778 0 2026-03-10T06:23:08.592 INFO:tasks.workunit.client.0.vm04.stdout:9/743: truncate d2/d8/f4a 858813 0 2026-03-10T06:23:08.594 INFO:tasks.workunit.client.0.vm04.stdout:3/674: dread d4/da/df/d11/d50/fa9 [0,4194304] 0 2026-03-10T06:23:08.598 
INFO:tasks.workunit.client.0.vm04.stdout:1/694: chown d0/d8/d46/c8d 3272 1 2026-03-10T06:23:08.600 INFO:tasks.workunit.client.0.vm04.stdout:3/675: sync 2026-03-10T06:23:08.607 INFO:tasks.workunit.client.0.vm04.stdout:2/663: write d1/dae/d11/d14/d4e/f9d [1373385,26326] 0 2026-03-10T06:23:08.614 INFO:tasks.workunit.client.0.vm04.stdout:4/709: dwrite d2/d32/d5c/f41 [0,4194304] 0 2026-03-10T06:23:08.616 INFO:tasks.workunit.client.0.vm04.stdout:7/671: fsync d4/df/d12/d13/d25/d28/fc0 0 2026-03-10T06:23:08.617 INFO:tasks.workunit.client.0.vm04.stdout:7/672: readlink d4/df/d12/d13/d25/d30/d40/l6c 0 2026-03-10T06:23:08.618 INFO:tasks.workunit.client.0.vm04.stdout:7/673: chown d4/df/d12/d13/d8b/ca8 181 1 2026-03-10T06:23:08.637 INFO:tasks.workunit.client.0.vm04.stdout:2/664: mkdir d1/db/d69/dcd 0 2026-03-10T06:23:08.638 INFO:tasks.workunit.client.0.vm04.stdout:3/676: dread - d4/da/df/d11/f9f zero size 2026-03-10T06:23:08.638 INFO:tasks.workunit.client.0.vm04.stdout:0/764: dwrite d0/d5/d25/f23 [0,4194304] 0 2026-03-10T06:23:08.638 INFO:tasks.workunit.client.0.vm04.stdout:3/677: chown d4/d6/f30 8175836 1 2026-03-10T06:23:08.640 INFO:tasks.workunit.client.0.vm04.stdout:2/665: read - d1/db/d20/d8f/d35/d54/d5d/faa zero size 2026-03-10T06:23:08.670 INFO:tasks.workunit.client.0.vm04.stdout:8/707: truncate df/d20/d25/d30/d55/f95 922014 0 2026-03-10T06:23:08.676 INFO:tasks.workunit.client.0.vm04.stdout:0/765: dread d0/d5/fb [0,4194304] 0 2026-03-10T06:23:08.685 INFO:tasks.workunit.client.0.vm04.stdout:6/720: link d2/d43/d2d/d30/d34/da8/fb4 d2/d43/d9b/fea 0 2026-03-10T06:23:08.689 INFO:tasks.workunit.client.0.vm04.stdout:6/721: dread d2/d43/d2d/fcf [0,4194304] 0 2026-03-10T06:23:08.690 INFO:tasks.workunit.client.0.vm04.stdout:4/710: dwrite d2/d32/d5c/d4f/f85 [0,4194304] 0 2026-03-10T06:23:08.692 INFO:tasks.workunit.client.0.vm04.stdout:4/711: readlink d2/d8/l81 0 2026-03-10T06:23:08.692 INFO:tasks.workunit.client.0.vm04.stdout:7/674: dwrite d4/df/d12/d34/f46 [0,4194304] 0 
2026-03-10T06:23:08.704 INFO:tasks.workunit.client.0.vm04.stdout:9/744: symlink d2/d3/d18/de9/d5a/d92/l112 0 2026-03-10T06:23:08.710 INFO:tasks.workunit.client.0.vm04.stdout:1/695: dwrite d0/d3/d41/d4b/d5b/f5c [0,4194304] 0 2026-03-10T06:23:08.717 INFO:tasks.workunit.client.0.vm04.stdout:2/666: fdatasync d1/db/f27 0 2026-03-10T06:23:08.718 INFO:tasks.workunit.client.0.vm04.stdout:5/658: getdents d4/d6/d80 0 2026-03-10T06:23:08.718 INFO:tasks.workunit.client.0.vm04.stdout:2/667: readlink d1/dae/d2c/d37/d40/l66 0 2026-03-10T06:23:08.719 INFO:tasks.workunit.client.0.vm04.stdout:2/668: write d1/db/d9b/fa3 [5194242,38858] 0 2026-03-10T06:23:08.727 INFO:tasks.workunit.client.0.vm04.stdout:8/708: symlink df/d15/d29/da3/db8/dc1/d97/d67/le2 0 2026-03-10T06:23:08.741 INFO:tasks.workunit.client.0.vm04.stdout:0/766: creat d0/d1a/f101 x:0 0 0 2026-03-10T06:23:08.745 INFO:tasks.workunit.client.0.vm04.stdout:1/696: mkdir d0/d3/d41/dc2/d101 0 2026-03-10T06:23:08.748 INFO:tasks.workunit.client.0.vm04.stdout:1/697: dwrite d0/d8/f27 [0,4194304] 0 2026-03-10T06:23:08.752 INFO:tasks.workunit.client.0.vm04.stdout:3/678: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/fe1 x:0 0 0 2026-03-10T06:23:08.767 INFO:tasks.workunit.client.0.vm04.stdout:8/709: rmdir df/d15/d29/da3/db8/dc1/d97/d67 39 2026-03-10T06:23:08.767 INFO:tasks.workunit.client.0.vm04.stdout:4/712: creat d2/dde/fe6 x:0 0 0 2026-03-10T06:23:08.768 INFO:tasks.workunit.client.0.vm04.stdout:1/698: rmdir d0/d8/d46 39 2026-03-10T06:23:08.769 INFO:tasks.workunit.client.0.vm04.stdout:3/679: dread - d4/da/df/fb6 zero size 2026-03-10T06:23:08.771 INFO:tasks.workunit.client.0.vm04.stdout:8/710: dread df/f40 [0,4194304] 0 2026-03-10T06:23:08.772 INFO:tasks.workunit.client.0.vm04.stdout:8/711: readlink df/d20/d25/d30/l5f 0 2026-03-10T06:23:08.775 INFO:tasks.workunit.client.0.vm04.stdout:4/713: dread d2/d8/f89 [0,4194304] 0 2026-03-10T06:23:08.780 INFO:tasks.workunit.client.0.vm04.stdout:5/659: mknod d4/d11/d7d/d38/d91/cea 0 
2026-03-10T06:23:08.788 INFO:tasks.workunit.client.0.vm04.stdout:0/767: write d0/d1a/d20/fe6 [3205,17662] 0 2026-03-10T06:23:08.795 INFO:tasks.workunit.client.0.vm04.stdout:0/768: dwrite d0/d5/d25/dd/d5c/fb2 [0,4194304] 0 2026-03-10T06:23:08.802 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.802+0000 7fc0820b0700 1 -- 192.168.123.104:0/3972149144 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc068000bf0 con 0x7fc0640776b0 2026-03-10T06:23:08.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.806+0000 7fc078ff9700 1 -- 192.168.123.104:0/3972149144 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+367 (secure 0 0 0) 0x7fc068000bf0 con 0x7fc0640776b0 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.810+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 msgr2=0x7fc064079b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.810+0000 7fc0627fc700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 0x7fc064079b60 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fc07400bb30 tx=0x7fc074014040 comp rx=0 tx=0).stop 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.810+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 msgr2=0x7fc07c082530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.810+0000 7fc0627fc700 1 --2- 192.168.123.104:0/3972149144 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 0x7fc07c082530 secure :-1 s=READY pgs=337 cs=0 l=1 rev1=1 crypto rx=0x7fc06c007ae0 tx=0x7fc06c007df0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.811+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 shutdown_connections 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.811+0000 7fc0627fc700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fc0640776b0 0x7fc064079b60 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.811+0000 7fc0627fc700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc07c071980 0x7fc07c082530 unknown :-1 s=CLOSED pgs=337 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.811+0000 7fc0627fc700 1 --2- 192.168.123.104:0/3972149144 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc07c082a70 0x7fc07c082ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.811+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 >> 192.168.123.104:0/3972149144 conn(0x7fc07c06d1a0 msgr2=0x7fc07c0764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.812+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 shutdown_connections 2026-03-10T06:23:08.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.812+0000 7fc0627fc700 1 -- 192.168.123.104:0/3972149144 wait complete. 
2026-03-10T06:23:08.819 INFO:tasks.workunit.client.0.vm04.stdout:9/745: write d2/d3/d18/de9/d5a/fa7 [735483,129288] 0 2026-03-10T06:23:08.819 INFO:tasks.workunit.client.0.vm04.stdout:0/769: sync 2026-03-10T06:23:08.820 INFO:tasks.workunit.client.0.vm04.stdout:6/722: dwrite d2/d3a/f57 [0,4194304] 0 2026-03-10T06:23:08.823 INFO:tasks.workunit.client.0.vm04.stdout:9/746: dread d2/d3/d18/de9/da2/fc5 [4194304,4194304] 0 2026-03-10T06:23:08.826 INFO:tasks.workunit.client.0.vm04.stdout:9/747: read d2/d3/d18/d34/f97 [2645981,46923] 0 2026-03-10T06:23:08.826 INFO:tasks.workunit.client.0.vm04.stdout:8/712: mknod df/ce3 0 2026-03-10T06:23:08.830 INFO:tasks.workunit.client.0.vm04.stdout:9/748: dwrite d2/fb4 [0,4194304] 0 2026-03-10T06:23:08.831 INFO:tasks.workunit.client.0.vm04.stdout:9/749: chown d2/d3/d18/d39/d11/f35 74891995 1 2026-03-10T06:23:08.850 INFO:tasks.workunit.client.0.vm04.stdout:7/675: getdents d4/df/d12/d13/d25/d28/d3a/d58 0 2026-03-10T06:23:08.850 INFO:tasks.workunit.client.0.vm04.stdout:7/676: chown d4/df/d12/d13/d25/d30/d40/c96 0 1 2026-03-10T06:23:08.851 INFO:tasks.workunit.client.0.vm04.stdout:7/677: write d4/df/d12/d13/d8b/fdd [222128,108794] 0 2026-03-10T06:23:08.905 INFO:tasks.workunit.client.0.vm04.stdout:2/669: truncate d1/dae/d11/d14/d4e/f9d 7497687 0 2026-03-10T06:23:08.907 INFO:tasks.workunit.client.0.vm04.stdout:2/670: dwrite d1/db/f36 [4194304,4194304] 0 2026-03-10T06:23:08.909 INFO:tasks.workunit.client.0.vm04.stdout:2/671: chown d1/db/d69/dcd 216 1 2026-03-10T06:23:08.911 INFO:tasks.workunit.client.0.vm04.stdout:6/723: mkdir d2/d43/d2d/d30/d34/d76/d8a/deb 0 2026-03-10T06:23:08.922 INFO:tasks.workunit.client.0.vm04.stdout:3/680: write d4/d6/dc/f22 [428747,81113] 0 2026-03-10T06:23:08.957 INFO:tasks.workunit.client.0.vm04.stdout:5/660: dwrite d4/f26 [0,4194304] 0 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 -- 192.168.123.104:0/1415336631 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140072360 msgr2=0x7f11400770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 --2- 192.168.123.104:0/1415336631 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140072360 0x7f11400770e0 secure :-1 s=READY pgs=338 cs=0 l=1 rev1=1 crypto rx=0x7f113800d3f0 tx=0x7f113800d700 comp rx=0 tx=0).stop 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 -- 192.168.123.104:0/1415336631 shutdown_connections 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 --2- 192.168.123.104:0/1415336631 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140072360 0x7f11400770e0 unknown :-1 s=CLOSED pgs=338 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 --2- 192.168.123.104:0/1415336631 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1140071980 0x7f1140071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 -- 192.168.123.104:0/1415336631 >> 192.168.123.104:0/1415336631 conn(0x7f114006d1a0 msgr2=0x7f114006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:08.971 INFO:tasks.workunit.client.0.vm04.stdout:1/699: chown d0/d8/d46/c94 149111832 1 2026-03-10T06:23:08.971 INFO:tasks.workunit.client.0.vm04.stdout:1/700: fdatasync d0/f9a 0 2026-03-10T06:23:08.972 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 -- 192.168.123.104:0/1415336631 shutdown_connections 2026-03-10T06:23:08.972 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.970+0000 7f1147ae3700 1 -- 192.168.123.104:0/1415336631 wait complete. 2026-03-10T06:23:08.972 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.971+0000 7f1147ae3700 1 Processor -- start 2026-03-10T06:23:08.972 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.972+0000 7f1147ae3700 1 -- start start 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.972+0000 7f1147ae3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140071980 0x7f1140131390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.972+0000 7f1147ae3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.972+0000 7f1147ae3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1140131dd0 con 0x7f1140071980 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.972+0000 7f1147ae3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1140131f40 con 0x7f11401318d0 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:48046/0 (socket says 192.168.123.104:48046) 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 -- 192.168.123.104:0/3275201691 learned_addr learned my addr 192.168.123.104:0/3275201691 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:08.973 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114587f700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140071980 0x7f1140131390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:08.974 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 -- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140071980 msgr2=0x7f1140131390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:08.974 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140071980 0x7f1140131390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:08.974 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 -- 192.168.123.104:0/3275201691 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1138007ed0 con 0x7f11401318d0 2026-03-10T06:23:08.974 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:08.973+0000 7f114507e700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto 
rx=0x7f1138000f80 tx=0x7f1138004b40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:08.975 INFO:tasks.workunit.client.0.vm04.stdout:0/770: creat d0/d1a/d20/df5/d47/d8a/de3/f102 x:0 0 0 2026-03-10T06:23:08.987 INFO:tasks.workunit.client.0.vm04.stdout:3/681: write d4/dba/fda [722940,51153] 0 2026-03-10T06:23:08.987 INFO:tasks.workunit.client.0.vm04.stdout:3/682: stat d4/d6/d38/l8a 0 2026-03-10T06:23:08.993 INFO:tasks.workunit.client.0.vm04.stdout:5/661: fsync d4/d11/d7d/d52/f9a 0 2026-03-10T06:23:08.994 INFO:tasks.workunit.client.0.vm04.stdout:5/662: readlink d4/d6/d50/l8c 0 2026-03-10T06:23:09.008 INFO:tasks.workunit.client.0.vm04.stdout:8/713: fdatasync df/d15/d2b/f33 0 2026-03-10T06:23:09.009 INFO:tasks.workunit.client.0.vm04.stdout:9/750: link d2/d8/cce d2/de0/c113 0 2026-03-10T06:23:09.010 INFO:tasks.workunit.client.0.vm04.stdout:9/751: read - d2/d8/d22/daa/ff9 zero size 2026-03-10T06:23:09.011 INFO:tasks.workunit.client.0.vm04.stdout:4/714: getdents d2/d32/d94/d99 0 2026-03-10T06:23:09.019 INFO:tasks.workunit.client.0.vm04.stdout:9/752: symlink d2/d3/d18/de9/d5a/l114 0 2026-03-10T06:23:09.019 INFO:tasks.workunit.client.0.vm04.stdout:9/753: chown d2/d3/d18/de9/de7/fec 21953 1 2026-03-10T06:23:09.019 INFO:tasks.workunit.client.0.vm04.stdout:8/714: fsync df/d20/d25/d30/d65/f82 0 2026-03-10T06:23:09.019 INFO:tasks.workunit.client.0.vm04.stdout:5/663: creat d4/d6/dc2/feb x:0 0 0 2026-03-10T06:23:09.020 INFO:tasks.workunit.client.0.vm04.stdout:8/715: truncate df/d20/d25/f39 818476 0 2026-03-10T06:23:09.021 INFO:tasks.workunit.client.0.vm04.stdout:8/716: truncate df/d20/d25/d30/d65/d8f/fc9 999192 0 2026-03-10T06:23:09.022 INFO:tasks.workunit.client.0.vm04.stdout:5/664: creat d4/d11/d7d/fec x:0 0 0 2026-03-10T06:23:09.023 INFO:tasks.workunit.client.0.vm04.stdout:8/717: rmdir df/d15 39 2026-03-10T06:23:09.024 INFO:tasks.workunit.client.0.vm04.stdout:5/665: write d4/f13 [320589,33801] 0 2026-03-10T06:23:09.024 
INFO:tasks.workunit.client.0.vm04.stdout:5/666: chown d4/f35 49172 1 2026-03-10T06:23:09.026 INFO:tasks.workunit.client.0.vm04.stdout:8/718: chown df/d20/d25/d30/d55/f95 1478 1 2026-03-10T06:23:09.027 INFO:tasks.workunit.client.0.vm04.stdout:5/667: unlink d4/d11/l60 0 2026-03-10T06:23:09.029 INFO:tasks.workunit.client.0.vm04.stdout:5/668: symlink d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/led 0 2026-03-10T06:23:09.037 INFO:tasks.workunit.client.0.vm04.stdout:1/701: sync 2026-03-10T06:23:09.037 INFO:tasks.workunit.client.0.vm04.stdout:5/669: dread d4/d11/d7d/d38/d91/d4c/fa3 [0,4194304] 0 2026-03-10T06:23:09.039 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.037+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f113801c070 con 0x7f11401318d0 2026-03-10T06:23:09.039 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.037+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f114007fa70 con 0x7f11401318d0 2026-03-10T06:23:09.039 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.037+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f114007ff30 con 0x7f11401318d0 2026-03-10T06:23:09.039 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.038+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f113800deb0 con 0x7f11401318d0 2026-03-10T06:23:09.039 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.038+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1138017b30 con 0x7f11401318d0 2026-03-10T06:23:09.040 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.040+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f1138017c90 con 0x7f11401318d0 2026-03-10T06:23:09.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.041+0000 7f1136ffd700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 0x7f112c079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.042+0000 7f114587f700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 0x7f112c079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.042+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f1138013070 con 0x7f11401318d0 2026-03-10T06:23:09.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.043+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1124005320 con 0x7f11401318d0 2026-03-10T06:23:09.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.046+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f1138064c50 con 0x7f11401318d0 2026-03-10T06:23:09.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.050+0000 7f114587f700 1 --2- 
192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 0x7f112c079c40 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f114007b4a0 tx=0x7f113c00b410 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:09.058 INFO:tasks.workunit.client.0.vm04.stdout:1/702: mknod d0/d3/d41/dcb/c102 0 2026-03-10T06:23:09.087 INFO:tasks.workunit.client.0.vm04.stdout:1/703: unlink d0/f5 0 2026-03-10T06:23:09.101 INFO:tasks.workunit.client.0.vm04.stdout:5/670: rename d4/d11/d7d/d38/d91/d55/f5d to d4/d11/d7d/fee 0 2026-03-10T06:23:09.108 INFO:tasks.workunit.client.0.vm04.stdout:1/704: rename d0/d3/d80/de9 to d0/d3/d41/d99/d103 0 2026-03-10T06:23:09.143 INFO:tasks.workunit.client.0.vm04.stdout:7/678: write d4/df/d12/d13/d25/d28/fc0 [852093,50518] 0 2026-03-10T06:23:09.143 INFO:tasks.workunit.client.0.vm04.stdout:2/672: write d1/dae/d2c/f5f [752531,89791] 0 2026-03-10T06:23:09.143 INFO:tasks.workunit.client.0.vm04.stdout:3/683: write d4/d6/f12 [7423237,5656] 0 2026-03-10T06:23:09.145 INFO:tasks.workunit.client.0.vm04.stdout:0/771: write d0/d1a/d20/df5/d47/f54 [331545,96054] 0 2026-03-10T06:23:09.151 INFO:tasks.workunit.client.0.vm04.stdout:2/673: write d1/dae/d2c/d37/d59/f8b [994991,39349] 0 2026-03-10T06:23:09.152 INFO:tasks.workunit.client.0.vm04.stdout:6/724: dwrite d2/d43/d2d/d30/f4a [0,4194304] 0 2026-03-10T06:23:09.169 INFO:tasks.workunit.client.0.vm04.stdout:4/715: write d2/f47 [1333602,44229] 0 2026-03-10T06:23:09.177 INFO:tasks.workunit.client.0.vm04.stdout:9/754: write d2/de0/da3/f102 [3101333,115769] 0 2026-03-10T06:23:09.186 INFO:tasks.workunit.client.0.vm04.stdout:8/719: truncate df/d20/d25/d30/d65/f82 153326 0 2026-03-10T06:23:09.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.218+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: 
{"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1124000bf0 con 0x7f112c077790 2026-03-10T06:23:09.219 INFO:tasks.workunit.client.0.vm04.stdout:5/671: write d4/d11/d7d/dae/fb2 [376514,7184] 0 2026-03-10T06:23:09.219 INFO:tasks.workunit.client.0.vm04.stdout:3/684: dread - d4/d6/d38/f78 zero size 2026-03-10T06:23:09.227 INFO:tasks.workunit.client.0.vm04.stdout:5/672: dwrite d4/d6/dc2/dd3/fe3 [0,4194304] 0 2026-03-10T06:23:09.229 INFO:tasks.workunit.client.0.vm04.stdout:3/685: dwrite d4/d6/d91/da1/fc4 [0,4194304] 0 2026-03-10T06:23:09.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.231+0000 7f1136ffd700 1 -- 192.168.123.104:0/3275201691 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f1124000bf0 con 0x7f112c077790 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (4m) 3s ago 5m 25.4M - 0.25.0 c8568f914cd2 3d98d9c97afc 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (5m) 3s ago 5m 8392k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (4m) 24s ago 4m 8648k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (5m) 3s ago 5m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (4m) 24s ago 4m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (4m) 3s ago 5m 90.5M - 9.4.7 954c08fa6188 888c399470c8 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (3m) 
3s ago 3m 259M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (3m) 3s ago 3m 16.4M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:23:09.234 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (3m) 24s ago 3m 16.5M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (3m) 24s ago 3m 263M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (68s) 3s ago 6m 604M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (45s) 24s ago 4m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (6m) 3s ago 6m 56.4M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (4m) 24s ago 4m 42.8M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (31s) 3s ago 5m 8530k - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (26s) 24s ago 4m 5368k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (4m) 3s ago 4m 265M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (4m) 3s ago 4m 280M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (4m) 3s ago 4m 232M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:23:09.235 
INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (4m) 24s ago 4m 320M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (3m) 24s ago 3m 263M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (3m) 24s ago 3m 313M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (7s) 3s ago 5m 33.6M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 msgr2=0x7f112c079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 0x7f112c079c40 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f114007b4a0 tx=0x7f113c00b410 comp rx=0 tx=0).stop 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 msgr2=0x7f114007f530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f1138000f80 tx=0x7f1138004b40 comp rx=0 tx=0).stop 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 
192.168.123.104:0/3275201691 shutdown_connections 2026-03-10T06:23:09.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f112c077790 0x7f112c079c40 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.236 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1140071980 0x7f1140131390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.236 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 --2- 192.168.123.104:0/3275201691 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f11401318d0 0x7f114007f530 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.236 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 >> 192.168.123.104:0/3275201691 conn(0x7f114006d1a0 msgr2=0x7f11400764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:09.236 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 shutdown_connections 2026-03-10T06:23:09.236 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.235+0000 7f1147ae3700 1 -- 192.168.123.104:0/3275201691 wait complete. 
2026-03-10T06:23:09.247 INFO:tasks.workunit.client.0.vm04.stdout:1/705: write d0/d8/d46/f82 [1040171,73181] 0 2026-03-10T06:23:09.252 INFO:tasks.workunit.client.0.vm04.stdout:7/679: rmdir d4/df/d12/d13/d25/d28/d3a/db0 39 2026-03-10T06:23:09.262 INFO:tasks.workunit.client.0.vm04.stdout:6/725: dread - d2/d3a/d5e/f64 zero size 2026-03-10T06:23:09.262 INFO:tasks.workunit.client.0.vm04.stdout:9/755: mknod d2/d3/d18/de9/da2/c115 0 2026-03-10T06:23:09.286 INFO:tasks.workunit.client.0.vm04.stdout:2/674: dwrite d1/dae/d11/d14/f45 [0,4194304] 0 2026-03-10T06:23:09.287 INFO:tasks.workunit.client.0.vm04.stdout:3/686: mknod d4/d6/d91/da1/ce2 0 2026-03-10T06:23:09.289 INFO:tasks.workunit.client.0.vm04.stdout:4/716: dwrite d2/f4c [0,4194304] 0 2026-03-10T06:23:09.292 INFO:tasks.workunit.client.0.vm04.stdout:4/717: write d2/d32/d5c/d76/dd7/d2c/d6b/f96 [1025785,25935] 0 2026-03-10T06:23:09.293 INFO:tasks.workunit.client.0.vm04.stdout:4/718: dread - d2/d32/d5c/d76/dd7/d56/fca zero size 2026-03-10T06:23:09.298 INFO:tasks.workunit.client.0.vm04.stdout:0/772: mkdir d0/d1a/d20/df5/d47/ddd/d103 0 2026-03-10T06:23:09.311 INFO:tasks.workunit.client.0.vm04.stdout:1/706: rmdir d0/d8/d46/d7a/d95/dc5/dcc 39 2026-03-10T06:23:09.321 INFO:tasks.workunit.client.0.vm04.stdout:7/680: dread - d4/df/d12/d13/fac zero size 2026-03-10T06:23:09.324 INFO:tasks.workunit.client.0.vm04.stdout:9/756: mkdir d2/d3/d18/de9/d116 0 2026-03-10T06:23:09.327 INFO:tasks.workunit.client.0.vm04.stdout:6/726: rename d2/d43/d2d/d30/d34/d76/d8a/fb9 to d2/d43/d2d/d30/d34/d76/d7e/ddc/fec 0 2026-03-10T06:23:09.339 INFO:tasks.workunit.client.0.vm04.stdout:6/727: dread d2/d43/d2d/d30/f93 [0,4194304] 0 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- 192.168.123.104:0/2264578783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764072470 msgr2=0x7f176410beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.361 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 --2- 192.168.123.104:0/2264578783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764072470 0x7f176410beb0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f175c00b3a0 tx=0x7f175c00b6b0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- 192.168.123.104:0/2264578783 shutdown_connections 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 --2- 192.168.123.104:0/2264578783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764072470 0x7f176410beb0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 --2- 192.168.123.104:0/2264578783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764071a90 0x7f1764071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- 192.168.123.104:0/2264578783 >> 192.168.123.104:0/2264578783 conn(0x7f176406d1a0 msgr2=0x7f176406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- 192.168.123.104:0/2264578783 shutdown_connections 2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- 192.168.123.104:0/2264578783 wait complete. 
2026-03-10T06:23:09.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 Processor -- start 2026-03-10T06:23:09.362 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:09 vm04.local ceph-mon[51058]: pgmap v33: 65 pgs: 65 active+clean; 2.8 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 31 MiB/s rd, 81 MiB/s wr, 188 op/s 2026-03-10T06:23:09.362 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:09 vm04.local ceph-mon[51058]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:09.362 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:09 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:09.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.360+0000 7f176b046700 1 -- start start 2026-03-10T06:23:09.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.361+0000 7f176b046700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.361+0000 7f176b046700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764117010 0x7f17641b2800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.361+0000 7f176b046700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1764117510 con 0x7f1764117010 2026-03-10T06:23:09.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.361+0000 7f176b046700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1764117680 con 
0x7f1764071a90 2026-03-10T06:23:09.364 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.363+0000 7f1768de2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.364+0000 7f1768de2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:48068/0 (socket says 192.168.123.104:48068) 2026-03-10T06:23:09.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.364+0000 7f1768de2700 1 -- 192.168.123.104:0/3598138886 learned_addr learned my addr 192.168.123.104:0/3598138886 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:09.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.365+0000 7f1763fff700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764117010 0x7f17641b2800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:09 vm06.local ceph-mon[58974]: pgmap v33: 65 pgs: 65 active+clean; 2.8 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 31 MiB/s rd, 81 MiB/s wr, 188 op/s 2026-03-10T06:23:09.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:09 vm06.local ceph-mon[58974]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:09.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:09 vm06.local ceph-mon[58974]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:09.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1768de2700 1 -- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764117010 msgr2=0x7f17641b2800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1768de2700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764117010 0x7f17641b2800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1768de2700 1 -- 192.168.123.104:0/3598138886 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f175c00b050 con 0x7f1764071a90 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1768de2700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f175400eb10 tx=0x7f175400ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f175400cc40 con 0x7f1764071a90 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f175400cda0 con 0x7f1764071a90 2026-03-10T06:23:09.368 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.366+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1754018810 con 0x7f1764071a90 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.367+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17641b2da0 con 0x7f1764071a90 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.367+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17641b3260 con 0x7f1764071a90 2026-03-10T06:23:09.368 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.367+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1764110c20 con 0x7f1764071a90 2026-03-10T06:23:09.371 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.370+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f1754010ba0 con 0x7f1764071a90 2026-03-10T06:23:09.373 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.370+0000 7f1761ffb700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f174c0776b0 0x7f174c079b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.373 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.371+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f1754014070 con 0x7f1764071a90 2026-03-10T06:23:09.373 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.371+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f175409a950 con 0x7f1764071a90 2026-03-10T06:23:09.373 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.371+0000 7f1763fff700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f174c0776b0 0x7f174c079b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.378 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.378+0000 7f1763fff700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f174c0776b0 0x7f174c079b60 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f175c00ba80 tx=0x7f175c00bee0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:09.397 INFO:tasks.workunit.client.0.vm04.stdout:8/720: symlink df/d15/d2b/d81/d9a/dbe/le4 0 2026-03-10T06:23:09.399 INFO:tasks.workunit.client.0.vm04.stdout:5/673: truncate d4/d11/d7d/d52/f96 3708376 0 2026-03-10T06:23:09.405 INFO:tasks.workunit.client.0.vm04.stdout:2/675: fdatasync d1/f91 0 2026-03-10T06:23:09.425 INFO:tasks.workunit.client.0.vm04.stdout:4/719: creat d2/d32/d5c/d76/dd7/d31/d3f/da1/fe7 x:0 0 0 2026-03-10T06:23:09.425 INFO:tasks.workunit.client.0.vm04.stdout:3/687: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d2c/fac [0,4194304] 0 2026-03-10T06:23:09.425 INFO:tasks.workunit.client.0.vm04.stdout:4/720: write d2/f47 [2602377,8138] 0 2026-03-10T06:23:09.441 INFO:tasks.workunit.client.0.vm04.stdout:0/773: truncate d0/d5/d25/dd/d5c/d73/f53 714700 0 2026-03-10T06:23:09.442 INFO:tasks.workunit.client.0.vm04.stdout:7/681: rmdir d4/df/d12/d13/d25/dcb 39 
2026-03-10T06:23:09.446 INFO:tasks.workunit.client.0.vm04.stdout:9/757: symlink d2/d8/d22/daa/l117 0 2026-03-10T06:23:09.451 INFO:tasks.workunit.client.0.vm04.stdout:9/758: dwrite d2/d8/d22/daa/ff9 [0,4194304] 0 2026-03-10T06:23:09.457 INFO:tasks.workunit.client.0.vm04.stdout:9/759: stat d2/d3/d18/d39/d46/d55/dc3 0 2026-03-10T06:23:09.471 INFO:tasks.workunit.client.0.vm04.stdout:5/674: truncate d4/d11/d7d/d38/d91/f5e 917041 0 2026-03-10T06:23:09.476 INFO:tasks.workunit.client.0.vm04.stdout:2/676: mkdir d1/d76/dce 0 2026-03-10T06:23:09.483 INFO:tasks.workunit.client.0.vm04.stdout:2/677: dwrite d1/dae/d2c/d37/d59/f8b [0,4194304] 0 2026-03-10T06:23:09.503 INFO:tasks.workunit.client.0.vm04.stdout:4/721: mknod d2/d32/ce8 0 2026-03-10T06:23:09.503 INFO:tasks.workunit.client.0.vm04.stdout:9/760: sync 2026-03-10T06:23:09.503 INFO:tasks.workunit.client.0.vm04.stdout:1/707: creat d0/d8/d46/d7a/d95/dc5/dcc/f104 x:0 0 0 2026-03-10T06:23:09.504 INFO:tasks.workunit.client.0.vm04.stdout:1/708: chown d0/d3/d41/cc7 1012 1 2026-03-10T06:23:09.504 INFO:tasks.workunit.client.0.vm04.stdout:4/722: write d2/d32/d5c/d76/dd7/f3a [4180672,13435] 0 2026-03-10T06:23:09.510 INFO:tasks.workunit.client.0.vm04.stdout:0/774: unlink d0/d5/d25/c10 0 2026-03-10T06:23:09.527 INFO:tasks.workunit.client.0.vm04.stdout:3/688: dwrite d4/d6/d38/f53 [4194304,4194304] 0 2026-03-10T06:23:09.533 INFO:tasks.workunit.client.0.vm04.stdout:7/682: dwrite d4/df/d12/dd4/fe1 [0,4194304] 0 2026-03-10T06:23:09.534 INFO:tasks.workunit.client.0.vm04.stdout:3/689: write d4/da/df/d11/d5a/d5b/ddf/d21/d2c/fac [3458440,37184] 0 2026-03-10T06:23:09.553 INFO:tasks.workunit.client.0.vm04.stdout:8/721: truncate df/d15/d2b/f56 932903 0 2026-03-10T06:23:09.553 INFO:tasks.workunit.client.0.vm04.stdout:6/728: getdents d2/d37/d83/dc1 0 2026-03-10T06:23:09.556 INFO:tasks.workunit.client.0.vm04.stdout:2/678: rmdir d1/db/d69/d74/d87 39 2026-03-10T06:23:09.556 INFO:tasks.workunit.client.0.vm04.stdout:2/679: stat d1/dae/d11/l34 0 
2026-03-10T06:23:09.557 INFO:tasks.workunit.client.0.vm04.stdout:2/680: write d1/db/d72/f7a [1749356,99957] 0 2026-03-10T06:23:09.557 INFO:tasks.workunit.client.0.vm04.stdout:2/681: chown d1/db/d72/l96 58674 1 2026-03-10T06:23:09.558 INFO:tasks.workunit.client.0.vm04.stdout:2/682: fsync d1/db/d20/d8f/d35/d54/d5d/faa 0 2026-03-10T06:23:09.558 INFO:tasks.workunit.client.0.vm04.stdout:2/683: chown d1/db/d9b 1766807 1 2026-03-10T06:23:09.566 INFO:tasks.workunit.client.0.vm04.stdout:9/761: symlink d2/d3/d18/de9/de7/l118 0 2026-03-10T06:23:09.582 INFO:tasks.workunit.client.0.vm04.stdout:8/722: sync 2026-03-10T06:23:09.583 INFO:tasks.workunit.client.0.vm04.stdout:7/683: creat d4/df/dd8/d9c/db1/dde/ddf/ff9 x:0 0 0 2026-03-10T06:23:09.585 INFO:tasks.workunit.client.0.vm04.stdout:3/690: dread d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/f95 [0,4194304] 0 2026-03-10T06:23:09.595 INFO:tasks.workunit.client.0.vm04.stdout:6/729: truncate d2/d43/d2d/d30/f93 1259734 0 2026-03-10T06:23:09.604 INFO:tasks.workunit.client.0.vm04.stdout:9/762: dread - d2/d3/d18/ddd/fb1 zero size 2026-03-10T06:23:09.609 INFO:tasks.workunit.client.0.vm04.stdout:9/763: dwrite d2/d3/d18/d39/d11/da5/df5/ffc [0,4194304] 0 2026-03-10T06:23:09.618 INFO:tasks.workunit.client.0.vm04.stdout:8/723: creat df/d15/d2b/d8a/fe5 x:0 0 0 2026-03-10T06:23:09.622 INFO:tasks.workunit.client.0.vm04.stdout:5/675: rename d4/d6/dc2 to d4/d11/d7d/d38/d91/d4c/def 0 2026-03-10T06:23:09.622 INFO:tasks.workunit.client.0.vm04.stdout:5/676: fdatasync d4/d11/d7d/d38/d91/d4c/f88 0 2026-03-10T06:23:09.622 INFO:tasks.workunit.client.0.vm04.stdout:7/684: unlink d4/df/dd8/d9c/lf3 0 2026-03-10T06:23:09.626 INFO:tasks.workunit.client.0.vm04.stdout:4/723: creat d2/d32/d5c/d76/dd7/d31/fe9 x:0 0 0 2026-03-10T06:23:09.627 INFO:tasks.workunit.client.0.vm04.stdout:3/691: write d4/da/df/d11/d5a/d5b/ddf/d21/d2c/fb2 [2481582,99850] 0 2026-03-10T06:23:09.634 INFO:tasks.workunit.client.0.vm04.stdout:8/724: sync 2026-03-10T06:23:09.638 
INFO:tasks.workunit.client.0.vm04.stdout:9/764: rename d2/d3/d18/d39/d46/fc4 to d2/d8/d53/d6e/d8d/f119 0 2026-03-10T06:23:09.638 INFO:tasks.workunit.client.0.vm04.stdout:5/677: dwrite d4/d6/d37/f7e [0,4194304] 0 2026-03-10T06:23:09.650 INFO:tasks.workunit.client.0.vm04.stdout:9/765: chown d2/d3/d18/d39/d11/da5/df5 2 1 2026-03-10T06:23:09.678 INFO:tasks.workunit.client.0.vm04.stdout:6/730: mkdir d2/d3a/ded 0 2026-03-10T06:23:09.679 INFO:tasks.workunit.client.0.vm04.stdout:3/692: fsync d4/da/df/d11/d5a/d5b/ddf/f4b 0 2026-03-10T06:23:09.682 INFO:tasks.workunit.client.0.vm04.stdout:8/725: creat df/d15/d29/d89/fe6 x:0 0 0 2026-03-10T06:23:09.682 INFO:tasks.workunit.client.0.vm04.stdout:0/775: unlink d0/d5/d25/dd/d5c/d73/f53 0 2026-03-10T06:23:09.689 INFO:tasks.workunit.client.0.vm04.stdout:5/678: dread - d4/f69 zero size 2026-03-10T06:23:09.691 INFO:tasks.workunit.client.0.vm04.stdout:4/724: dwrite d2/d32/d5c/d76/dd7/d31/d3f/d93/fa2 [0,4194304] 0 2026-03-10T06:23:09.697 INFO:tasks.workunit.client.0.vm04.stdout:9/766: write d2/d3/f2a [1833237,78629] 0 2026-03-10T06:23:09.698 INFO:tasks.workunit.client.0.vm04.stdout:6/731: mkdir d2/d37/d83/dee 0 2026-03-10T06:23:09.698 INFO:tasks.workunit.client.0.vm04.stdout:0/776: unlink d0/d5/d25/dd/d5c/d73/lef 0 2026-03-10T06:23:09.702 INFO:tasks.workunit.client.0.vm04.stdout:7/685: creat d4/df/ffa x:0 0 0 2026-03-10T06:23:09.705 INFO:tasks.workunit.client.0.vm04.stdout:9/767: sync 2026-03-10T06:23:09.706 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.703+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f176404ea50 con 0x7f1764071a90 2026-03-10T06:23:09.708 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef 
(stable)": 2 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:23:09.709 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.705+0000 7f1761ffb700 1 -- 192.168.123.104:0/3598138886 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f1754063290 con 0x7f1764071a90 2026-03-10T06:23:09.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.714+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 >> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f174c0776b0 msgr2=0x7f174c079b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.714+0000 7f176b046700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f174c0776b0 0x7f174c079b60 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f175c00ba80 tx=0x7f175c00bee0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.714+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 msgr2=0x7f1764116ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.714+0000 7f176b046700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f175400eb10 tx=0x7f175400ee20 comp rx=0 tx=0).stop 2026-03-10T06:23:09.717 INFO:tasks.workunit.client.0.vm04.stdout:0/777: dread d0/d5/d25/dd/d5c/f9a [0,4194304] 0 2026-03-10T06:23:09.718 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.715+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 shutdown_connections 2026-03-10T06:23:09.718 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.715+0000 7f176b046700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1764071a90 0x7f1764116ad0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.718 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.715+0000 7f176b046700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] 
conn(0x7f174c0776b0 0x7f174c079b60 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.718 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.715+0000 7f176b046700 1 --2- 192.168.123.104:0/3598138886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1764117010 0x7f17641b2800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.718 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.715+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 >> 192.168.123.104:0/3598138886 conn(0x7f176406d1a0 msgr2=0x7f176410b260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:09.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.717+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 shutdown_connections 2026-03-10T06:23:09.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.717+0000 7f176b046700 1 -- 192.168.123.104:0/3598138886 wait complete. 
2026-03-10T06:23:09.726 INFO:tasks.workunit.client.0.vm04.stdout:5/679: unlink d4/d6/d80/f97 0 2026-03-10T06:23:09.728 INFO:tasks.workunit.client.0.vm04.stdout:1/709: rename d0/d3/d41/d4b/d5b/f66 to d0/d8/f105 0 2026-03-10T06:23:09.728 INFO:tasks.workunit.client.0.vm04.stdout:6/732: creat d2/d43/d2d/d30/dc0/fef x:0 0 0 2026-03-10T06:23:09.728 INFO:tasks.workunit.client.0.vm04.stdout:4/725: dwrite d2/d32/d5c/d76/dd7/d31/d3f/f52 [0,4194304] 0 2026-03-10T06:23:09.736 INFO:tasks.workunit.client.0.vm04.stdout:3/693: creat d4/fe3 x:0 0 0 2026-03-10T06:23:09.736 INFO:tasks.workunit.client.0.vm04.stdout:1/710: fdatasync d0/d3/d41/d99/d103/fe5 0 2026-03-10T06:23:09.736 INFO:tasks.workunit.client.0.vm04.stdout:3/694: chown d4/da/df/d11/d5a/d5b/ddf/f23 855924 1 2026-03-10T06:23:09.744 INFO:tasks.workunit.client.0.vm04.stdout:9/768: creat d2/d8/d22/daa/f11a x:0 0 0 2026-03-10T06:23:09.745 INFO:tasks.workunit.client.0.vm04.stdout:8/726: dread df/d20/d25/d30/d65/f94 [0,4194304] 0 2026-03-10T06:23:09.748 INFO:tasks.workunit.client.0.vm04.stdout:4/726: sync 2026-03-10T06:23:09.749 INFO:tasks.workunit.client.0.vm04.stdout:4/727: chown d2/d32/d5c/d76/dd7/d31/d3f/f43 14 1 2026-03-10T06:23:09.764 INFO:tasks.workunit.client.0.vm04.stdout:2/684: rename d1/db/d20 to d1/db/d69/d74/d87/dcf 0 2026-03-10T06:23:09.764 INFO:tasks.workunit.client.0.vm04.stdout:5/680: rmdir d4/d6/d50 39 2026-03-10T06:23:09.765 INFO:tasks.workunit.client.0.vm04.stdout:3/695: rmdir d4/dba 39 2026-03-10T06:23:09.765 INFO:tasks.workunit.client.0.vm04.stdout:7/686: creat d4/df/d12/d13/d25/d30/d40/d50/ffb x:0 0 0 2026-03-10T06:23:09.765 INFO:tasks.workunit.client.0.vm04.stdout:2/685: chown d1/db/d9b 84831365 1 2026-03-10T06:23:09.769 INFO:tasks.workunit.client.0.vm04.stdout:2/686: sync 2026-03-10T06:23:09.773 INFO:tasks.workunit.client.0.vm04.stdout:8/727: mkdir df/d20/d25/d30/d55/de7 0 2026-03-10T06:23:09.775 INFO:tasks.workunit.client.0.vm04.stdout:0/778: mknod d0/d1a/d20/df5/d47/d8a/d8d/de1/c104 0 
2026-03-10T06:23:09.780 INFO:tasks.workunit.client.0.vm04.stdout:9/769: symlink d2/d8/d22/daa/l11b 0 2026-03-10T06:23:09.783 INFO:tasks.workunit.client.0.vm04.stdout:1/711: fsync d0/d8/f76 0 2026-03-10T06:23:09.783 INFO:tasks.workunit.client.0.vm04.stdout:3/696: creat d4/d6/d99/fe4 x:0 0 0 2026-03-10T06:23:09.786 INFO:tasks.workunit.client.0.vm04.stdout:5/681: mkdir d4/d6/d80/dd9/df0 0 2026-03-10T06:23:09.788 INFO:tasks.workunit.client.0.vm04.stdout:3/697: dread - d4/da/df/fc1 zero size 2026-03-10T06:23:09.798 INFO:tasks.workunit.client.0.vm04.stdout:8/728: creat df/d20/d25/d30/dc5/fe8 x:0 0 0 2026-03-10T06:23:09.801 INFO:tasks.workunit.client.0.vm04.stdout:4/728: write d2/d32/d5c/d76/dd7/f9d [8900719,33336] 0 2026-03-10T06:23:09.816 INFO:tasks.workunit.client.0.vm04.stdout:5/682: fdatasync d4/f21 0 2026-03-10T06:23:09.835 INFO:tasks.workunit.client.0.vm04.stdout:6/733: write d2/d43/d2d/d30/d34/f4d [816420,122493] 0 2026-03-10T06:23:09.841 INFO:tasks.workunit.client.0.vm04.stdout:0/779: write d0/d1a/d20/df5/d47/f89 [4062211,93046] 0 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.842+0000 7f2442c4c700 1 -- 192.168.123.104:0/4106523560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c072360 msgr2=0x7f243c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.842+0000 7f2442c4c700 1 --2- 192.168.123.104:0/4106523560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c072360 0x7f243c0770e0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f243400d3f0 tx=0x7f243400d700 comp rx=0 tx=0).stop 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.842+0000 7f2442c4c700 1 -- 192.168.123.104:0/4106523560 shutdown_connections 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.842+0000 7f2442c4c700 1 --2- 192.168.123.104:0/4106523560 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c072360 0x7f243c0770e0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.842+0000 7f2442c4c700 1 --2- 192.168.123.104:0/4106523560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.845 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.843+0000 7f2442c4c700 1 -- 192.168.123.104:0/4106523560 >> 192.168.123.104:0/4106523560 conn(0x7f243c06d1a0 msgr2=0x7f243c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:09.846 INFO:tasks.workunit.client.0.vm04.stdout:3/698: dread d4/da/df/d11/d5a/d5b/ddf/f9e [0,4194304] 0 2026-03-10T06:23:09.846 INFO:tasks.workunit.client.0.vm04.stdout:7/687: dwrite d4/df/d12/d34/f80 [0,4194304] 0 2026-03-10T06:23:09.849 INFO:tasks.workunit.client.0.vm04.stdout:0/780: dwrite d0/d5/d25/f23 [0,4194304] 0 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.843+0000 7f2442c4c700 1 -- 192.168.123.104:0/4106523560 shutdown_connections 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.843+0000 7f2442c4c700 1 -- 192.168.123.104:0/4106523560 wait complete. 
2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 Processor -- start 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 -- start start 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c131840 0x7f243c07f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f243c131cb0 con 0x7f243c071980 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.845+0000 7f2442c4c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f243c131e20 con 0x7f243c131840 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:47974/0 (socket says 192.168.123.104:47974) 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 -- 192.168.123.104:0/1310936313 learned_addr learned my addr 192.168.123.104:0/1310936313 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 -- 192.168.123.104:0/1310936313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c131840 msgr2=0x7f243c07f530 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c131840 0x7f243c07f530 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 -- 192.168.123.104:0/1310936313 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2434007ed0 con 0x7f243c071980 2026-03-10T06:23:09.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.850+0000 7f24409e8700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 secure :-1 s=READY pgs=339 cs=0 l=1 rev1=1 crypto rx=0x7f242c00ba70 tx=0x7f242c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:09.856 INFO:tasks.workunit.client.0.vm04.stdout:3/699: truncate d4/da/df/d11/d5a/d5b/ddf/f4b 1002892 0 2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.851+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f242c00c7e0 con 0x7f243c071980 
2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.851+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f242c00ce20 con 0x7f243c071980 2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.851+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f242c012550 con 0x7f243c071980 2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.851+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f243c07fa70 con 0x7f243c071980 2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.851+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f243c07ff60 con 0x7f243c071980 2026-03-10T06:23:09.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.852+0000 7f24237fe700 1 -- 192.168.123.104:0/1310936313 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f243c07a410 con 0x7f243c071980 2026-03-10T06:23:09.859 INFO:tasks.workunit.client.0.vm04.stdout:8/729: creat df/d20/d25/d30/d55/fe9 x:0 0 0 2026-03-10T06:23:09.861 INFO:tasks.workunit.client.0.vm04.stdout:3/700: stat d4/c3e 0 2026-03-10T06:23:09.862 INFO:tasks.workunit.client.0.vm04.stdout:8/730: chown df/d20/d25/d30/d65/f94 196786006 1 2026-03-10T06:23:09.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.857+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f242c014440 con 0x7f243c071980 2026-03-10T06:23:09.862 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.858+0000 7f2439ffb700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 0x7f2424079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:09.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.858+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f242c098f70 con 0x7f243c071980 2026-03-10T06:23:09.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.862+0000 7f243bfff700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 0x7f2424079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:09.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.867+0000 7f243bfff700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 0x7f2424079b70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f2434007550 tx=0x7f2434007480 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:09.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:09.870+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f242c061d20 con 0x7f243c071980 2026-03-10T06:23:09.890 INFO:tasks.workunit.client.0.vm04.stdout:1/712: truncate d0/f23 1381871 0 2026-03-10T06:23:09.891 INFO:tasks.workunit.client.0.vm04.stdout:6/734: mknod d2/d43/d2d/d30/d34/d76/cf0 0 2026-03-10T06:23:09.892 
INFO:tasks.workunit.client.0.vm04.stdout:6/735: dread - d2/d43/fe7 zero size 2026-03-10T06:23:09.892 INFO:tasks.workunit.client.0.vm04.stdout:6/736: write d2/d3a/d5e/fa4 [1179003,77228] 0 2026-03-10T06:23:09.893 INFO:tasks.workunit.client.0.vm04.stdout:6/737: chown d2/d43/c8f 223389 1 2026-03-10T06:23:09.893 INFO:tasks.workunit.client.0.vm04.stdout:6/738: chown d2/d43/d2d/d30/l33 6352132 1 2026-03-10T06:23:09.900 INFO:tasks.workunit.client.0.vm04.stdout:9/770: dwrite d2/d8/d3a/dcb/fe3 [0,4194304] 0 2026-03-10T06:23:09.903 INFO:tasks.workunit.client.0.vm04.stdout:5/683: dwrite d4/d11/d7d/f30 [0,4194304] 0 2026-03-10T06:23:09.904 INFO:tasks.workunit.client.0.vm04.stdout:5/684: chown d4/d3b/da8 11184 1 2026-03-10T06:23:09.910 INFO:tasks.workunit.client.0.vm04.stdout:2/687: rename d1/dae/d11/l98 to d1/dae/ld0 0 2026-03-10T06:23:09.918 INFO:tasks.workunit.client.0.vm04.stdout:0/781: rmdir d0/d5/d97/dc0/dd8/dff 39 2026-03-10T06:23:09.945 INFO:tasks.workunit.client.0.vm04.stdout:1/713: mknod d0/d8/d46/db3/dd2/c106 0 2026-03-10T06:23:09.961 INFO:tasks.workunit.client.0.vm04.stdout:6/739: fsync d2/f5f 0 2026-03-10T06:23:09.995 INFO:tasks.workunit.client.0.vm04.stdout:5/685: rmdir d4/d11/d7d/d38/d91 39 2026-03-10T06:23:09.998 INFO:tasks.workunit.client.0.vm04.stdout:5/686: dread d4/f79 [0,4194304] 0 2026-03-10T06:23:10.009 INFO:tasks.workunit.client.0.vm04.stdout:4/729: rename d2/d32/d5c/d76/dd7/d56/fca to d2/d32/d94/d99/ddc/fea 0 2026-03-10T06:23:10.013 INFO:tasks.workunit.client.0.vm04.stdout:2/688: creat d1/d76/fd1 x:0 0 0 2026-03-10T06:23:10.023 INFO:tasks.workunit.client.0.vm04.stdout:8/731: mkdir df/d15/d2b/d8a/dab/dea 0 2026-03-10T06:23:10.033 INFO:tasks.workunit.client.0.vm04.stdout:3/701: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f7d [0,4194304] 0 2026-03-10T06:23:10.055 INFO:tasks.workunit.client.0.vm04.stdout:6/740: mknod d2/d43/d86/cf1 0 2026-03-10T06:23:10.069 INFO:tasks.workunit.client.0.vm04.stdout:5/687: readlink d4/d11/d7d/d38/d91/d4c/d98/dc0/l89 0 
2026-03-10T06:23:10.069 INFO:tasks.workunit.client.0.vm04.stdout:5/688: write d4/fb0 [3241711,7645] 0 2026-03-10T06:23:10.069 INFO:tasks.workunit.client.0.vm04.stdout:5/689: stat d4/d3b 0 2026-03-10T06:23:10.076 INFO:tasks.workunit.client.0.vm04.stdout:7/688: rename d4/df/d12/d34/l7e to d4/df/d12/d13/db3/lfc 0 2026-03-10T06:23:10.081 INFO:tasks.workunit.client.0.vm04.stdout:2/689: fsync d1/dae/d2c/d37/f52 0 2026-03-10T06:23:10.081 INFO:tasks.workunit.client.0.vm04.stdout:7/689: chown d4/df/d12/d13/fb5 94094 1 2026-03-10T06:23:10.082 INFO:tasks.workunit.client.0.vm04.stdout:8/732: chown df/d20/d25/d87/l96 233251 1 2026-03-10T06:23:10.088 INFO:tasks.workunit.client.0.vm04.stdout:0/782: dread d0/d5/d97/dc0/dd8/dff/d59/f48 [0,4194304] 0 2026-03-10T06:23:10.094 INFO:tasks.workunit.client.0.vm04.stdout:3/702: creat d4/d6/d91/fe5 x:0 0 0 2026-03-10T06:23:10.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.119+0000 7f24237fe700 1 -- 192.168.123.104:0/1310936313 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f243c131fe0 con 0x7f24240776c0 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.122+0000 7f2439ffb700 1 -- 192.168.123.104:0/1310936313 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+369 (secure 0 0 0) 0x7f24340047f0 con 0x7f24240776c0 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: 
"services_complete": [ 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "mgr" 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "2/2 daemons upgraded", 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading alertmanager daemons", 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:23:10.123 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='client.14694 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:23:10.130 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='client.24463 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:10 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/3598138886' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:10.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.127+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 msgr2=0x7f2424079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:10.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.127+0000 7f2442c4c700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 0x7f2424079b70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f2434007550 tx=0x7f2434007480 comp rx=0 tx=0).stop 2026-03-10T06:23:10.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 msgr2=0x7f243c131300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:10.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 --2- 192.168.123.104:0/1310936313 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 secure :-1 s=READY pgs=339 cs=0 l=1 rev1=1 crypto rx=0x7f242c00ba70 tx=0x7f242c00be30 comp rx=0 tx=0).stop 2026-03-10T06:23:10.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 shutdown_connections 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f24240776c0 0x7f2424079b70 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f243c071980 0x7f243c131300 unknown :-1 s=CLOSED pgs=339 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 --2- 192.168.123.104:0/1310936313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f243c131840 0x7f243c07f530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.128+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 >> 192.168.123.104:0/1310936313 conn(0x7f243c06d1a0 msgr2=0x7f243c0763d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.129+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 shutdown_connections 2026-03-10T06:23:10.131 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:10.129+0000 7f2442c4c700 1 -- 192.168.123.104:0/1310936313 wait complete. 
2026-03-10T06:23:10.132 INFO:tasks.workunit.client.0.vm04.stdout:9/771: link d2/d3/d18/d34/cf2 d2/d8/d3a/dcb/d108/c11c 0 2026-03-10T06:23:10.147 INFO:tasks.workunit.client.0.vm04.stdout:5/690: chown d4/d6/d50/fdb 647639 1 2026-03-10T06:23:10.150 INFO:tasks.workunit.client.0.vm04.stdout:4/730: rename d2/d8/f89 to d2/d32/d5c/d76/dd7/da3/feb 0 2026-03-10T06:23:10.154 INFO:tasks.workunit.client.0.vm04.stdout:2/690: symlink d1/db/d69/d74/d87/ld2 0 2026-03-10T06:23:10.161 INFO:tasks.workunit.client.0.vm04.stdout:8/733: write df/f77 [4496630,122580] 0 2026-03-10T06:23:10.168 INFO:tasks.workunit.client.0.vm04.stdout:0/783: symlink d0/d1a/l105 0 2026-03-10T06:23:10.172 INFO:tasks.workunit.client.0.vm04.stdout:3/703: truncate d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f75 1267323 0 2026-03-10T06:23:10.174 INFO:tasks.workunit.client.0.vm04.stdout:7/690: dwrite d4/df/d12/f7f [0,4194304] 0 2026-03-10T06:23:10.177 INFO:tasks.workunit.client.0.vm04.stdout:3/704: chown d4/da/df/d11/d5a/d5b/ddf/f2b 16 1 2026-03-10T06:23:10.221 INFO:tasks.workunit.client.0.vm04.stdout:6/741: truncate d2/fa0 1721023 0 2026-03-10T06:23:10.231 INFO:tasks.workunit.client.0.vm04.stdout:9/772: fsync d2/d8/d53/d6e/d89/f95 0 2026-03-10T06:23:10.231 INFO:tasks.workunit.client.0.vm04.stdout:9/773: chown d2/d3/d18/d39/d46/d55/l9c 1007660 1 2026-03-10T06:23:10.237 INFO:tasks.workunit.client.0.vm04.stdout:5/691: creat d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/ff1 x:0 0 0 2026-03-10T06:23:10.241 INFO:tasks.workunit.client.0.vm04.stdout:4/731: symlink d2/d32/d5c/d76/dd7/da3/lec 0 2026-03-10T06:23:10.249 INFO:tasks.workunit.client.0.vm04.stdout:4/732: chown d2/d32/d94 795508243 1 2026-03-10T06:23:10.250 INFO:tasks.workunit.client.0.vm04.stdout:8/734: symlink df/d20/d25/d87/leb 0 2026-03-10T06:23:10.250 INFO:tasks.workunit.client.0.vm04.stdout:3/705: dread - d4/d6/d91/fad zero size 2026-03-10T06:23:10.252 INFO:tasks.workunit.client.0.vm04.stdout:6/742: sync 2026-03-10T06:23:10.252 
INFO:tasks.workunit.client.0.vm04.stdout:1/714: getdents d0/d8/d46/d7a/d95/dc5/dcc 0 2026-03-10T06:23:10.256 INFO:tasks.workunit.client.0.vm04.stdout:5/692: truncate d4/d6/d50/f59 379009 0 2026-03-10T06:23:10.269 INFO:tasks.workunit.client.0.vm04.stdout:4/733: fsync d2/d32/d5c/d76/dd7/d31/d3f/f64 0 2026-03-10T06:23:10.278 INFO:tasks.workunit.client.0.vm04.stdout:6/743: creat d2/d3a/d5e/db5/dd4/ff2 x:0 0 0 2026-03-10T06:23:10.327 INFO:tasks.workunit.client.0.vm04.stdout:4/734: mkdir d2/d32/d5c/d76/dd7/d2c/d9a/ded 0 2026-03-10T06:23:10.329 INFO:tasks.workunit.client.0.vm04.stdout:3/706: creat d4/da/df/d11/d5a/d5b/ddf/d21/fe6 x:0 0 0 2026-03-10T06:23:10.339 INFO:tasks.workunit.client.0.vm04.stdout:4/735: link d2/d32/d5c/d76/dd7/d31/d3f/ccd d2/d32/d5c/de2/cee 0 2026-03-10T06:23:10.362 INFO:tasks.workunit.client.0.vm04.stdout:4/736: dread d2/d8/f35 [0,4194304] 0 2026-03-10T06:23:10.363 INFO:tasks.workunit.client.0.vm04.stdout:3/707: symlink d4/d6/d91/le7 0 2026-03-10T06:23:10.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='client.14694 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:10.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:10.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:10.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='client.24463 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:10 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/3598138886' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:23:10.369 INFO:tasks.workunit.client.0.vm04.stdout:3/708: dread - d4/da/df/fb6 zero size
2026-03-10T06:23:10.382 INFO:tasks.workunit.client.0.vm04.stdout:3/709: truncate d4/da/df/d11/d50/fa9 1067245 0
2026-03-10T06:23:10.385 INFO:tasks.workunit.client.0.vm04.stdout:4/737: dwrite d2/d32/d94/d99/ddc/fea [0,4194304] 0
2026-03-10T06:23:10.399 INFO:tasks.workunit.client.0.vm04.stdout:4/738: stat d2/d32/d5c/d76/dd7/d2c/d6b/f96 0
2026-03-10T06:23:10.402 INFO:tasks.workunit.client.0.vm04.stdout:9/774: dwrite d2/d3/d18/d39/f20 [0,4194304] 0
2026-03-10T06:23:10.416 INFO:tasks.workunit.client.0.vm04.stdout:8/735: write df/d20/d25/d30/d65/f94 [2670080,84392] 0
2026-03-10T06:23:10.425 INFO:tasks.workunit.client.0.vm04.stdout:4/739: truncate d2/d32/d5c/f6a 636555 0
2026-03-10T06:23:10.427 INFO:tasks.workunit.client.0.vm04.stdout:8/736: dwrite df/d20/d25/fe0 [0,4194304] 0
2026-03-10T06:23:10.431 INFO:tasks.workunit.client.0.vm04.stdout:4/740: readlink d2/d32/d5c/d76/dd7/d2c/d9a/lb2 0
2026-03-10T06:23:10.441 INFO:tasks.workunit.client.0.vm04.stdout:4/741: fdatasync d2/d46/fa5 0
2026-03-10T06:23:10.446 INFO:tasks.workunit.client.0.vm04.stdout:8/737: creat df/d15/d29/da3/db8/fec x:0 0 0
2026-03-10T06:23:10.453 INFO:tasks.workunit.client.0.vm04.stdout:1/715: symlink d0/d8/d46/l107 0
2026-03-10T06:23:10.454 INFO:tasks.workunit.client.0.vm04.stdout:1/716: stat d0/d3/cd0 0
2026-03-10T06:23:10.464 INFO:tasks.workunit.client.0.vm04.stdout:1/717: readlink d0/d8/l65 0
2026-03-10T06:23:10.473 INFO:tasks.workunit.client.0.vm04.stdout:6/744: symlink d2/d43/lf3 0
2026-03-10T06:23:10.477 INFO:tasks.workunit.client.0.vm04.stdout:2/691: rename d1/d76 to d1/dae/d2c/dd3 0
2026-03-10T06:23:10.483 INFO:tasks.workunit.client.0.vm04.stdout:0/784: rename d0/d5/d25/dd/d3a/d56/ld5 to d0/d1a/d20/df5/d47/d8a/de3/l106 0
2026-03-10T06:23:10.485 INFO:tasks.workunit.client.0.vm04.stdout:2/692: truncate d1/db/d69/d74/d87/dcf/d8f/d48/d67/fbd 101394 0
2026-03-10T06:23:10.486 INFO:tasks.workunit.client.0.vm04.stdout:1/718: rmdir d0/d8/d46/d7a/d95/dc5 39
2026-03-10T06:23:10.494 INFO:tasks.workunit.client.0.vm04.stdout:6/745: creat d2/d43/d2d/ff4 x:0 0 0
2026-03-10T06:23:10.496 INFO:tasks.workunit.client.0.vm04.stdout:6/746: chown d2/l1b 9869757 1
2026-03-10T06:23:10.496 INFO:tasks.workunit.client.0.vm04.stdout:1/719: mkdir d0/d8/d46/d7a/d108 0
2026-03-10T06:23:10.499 INFO:tasks.workunit.client.0.vm04.stdout:2/693: mknod d1/dae/d2c/dd3/cd4 0
2026-03-10T06:23:10.499 INFO:tasks.workunit.client.0.vm04.stdout:1/720: creat d0/d8/d46/dcf/f109 x:0 0 0
2026-03-10T06:23:10.501 INFO:tasks.workunit.client.0.vm04.stdout:2/694: chown d1/dae/d11/l34 53349 1
2026-03-10T06:23:10.504 INFO:tasks.workunit.client.0.vm04.stdout:7/691: rename d4/c11 to d4/df/d12/d13/d25/cfd 0
2026-03-10T06:23:10.505 INFO:tasks.workunit.client.0.vm04.stdout:7/692: chown d4/df/d12/d21 0 1
2026-03-10T06:23:10.506 INFO:tasks.workunit.client.0.vm04.stdout:7/693: stat d4/df/d12/d13/d25/d30/d40/d50/f5b 0
2026-03-10T06:23:10.508 INFO:tasks.workunit.client.0.vm04.stdout:9/775: write d2/de0/f3c [605853,125190] 0
2026-03-10T06:23:10.511 INFO:tasks.workunit.client.0.vm04.stdout:2/695: unlink d1/db/d69/d74/d87/dcf/d8f/d48/l50 0
2026-03-10T06:23:10.514 INFO:tasks.workunit.client.0.vm04.stdout:1/721: fdatasync d0/d8/f43 0
2026-03-10T06:23:10.515 INFO:tasks.workunit.client.0.vm04.stdout:7/694: mknod d4/df/d12/d13/d25/d30/d40/d50/cfe 0
2026-03-10T06:23:10.515 INFO:tasks.workunit.client.0.vm04.stdout:1/722: readlink d0/d8/d46/l77 0
2026-03-10T06:23:10.515 INFO:tasks.workunit.client.0.vm04.stdout:1/723: chown d0/d3/ld 2 1
2026-03-10T06:23:10.516 INFO:tasks.workunit.client.0.vm04.stdout:7/695: write d4/df/d12/d13/f27 [2660914,91349] 0
2026-03-10T06:23:10.517 INFO:tasks.workunit.client.0.vm04.stdout:6/747: sync
2026-03-10T06:23:10.520 INFO:tasks.workunit.client.0.vm04.stdout:9/776: rmdir d2/d3/d18/de9/d5a 39
2026-03-10T06:23:10.521 INFO:tasks.workunit.client.0.vm04.stdout:4/742: write d2/f12 [2416703,107536] 0
2026-03-10T06:23:10.523 INFO:tasks.workunit.client.0.vm04.stdout:4/743: readlink d2/d32/d5c/d76/dd7/d31/d3f/d93/lb1 0
2026-03-10T06:23:10.529 INFO:tasks.workunit.client.0.vm04.stdout:8/738: dwrite df/d15/d2b/d8a/dab/fdf [0,4194304] 0
2026-03-10T06:23:10.532 INFO:tasks.workunit.client.0.vm04.stdout:2/696: symlink d1/db/d69/d74/ld5 0
2026-03-10T06:23:10.534 INFO:tasks.workunit.client.0.vm04.stdout:5/693: rename d4/d11/d7d/d38/d91/d55/d72/cc5 to d4/d11/d7d/d38/d91/d4c/cf2 0
2026-03-10T06:23:10.535 INFO:tasks.workunit.client.0.vm04.stdout:0/785: rename d0 to d0/d5/d25/dd/d107 22
2026-03-10T06:23:10.538 INFO:tasks.workunit.client.0.vm04.stdout:0/786: dread - d0/d5/d97/dc0/fc4 zero size
2026-03-10T06:23:10.538 INFO:tasks.workunit.client.0.vm04.stdout:0/787: chown d0/d1a/db8/df7 1 1
2026-03-10T06:23:10.539 INFO:tasks.workunit.client.0.vm04.stdout:4/744: mkdir d2/d32/d5c/d76/dd7/d31/d42/db9/def 0
2026-03-10T06:23:10.539 INFO:tasks.workunit.client.0.vm04.stdout:7/696: dread d4/df/d12/d13/d25/f2f [0,4194304] 0
2026-03-10T06:23:10.540 INFO:tasks.workunit.client.0.vm04.stdout:7/697: chown d4/df/d12/f4c 492453311 1
2026-03-10T06:23:10.544 INFO:tasks.workunit.client.0.vm04.stdout:1/724: creat d0/d8/f10a x:0 0 0
2026-03-10T06:23:10.551 INFO:tasks.workunit.client.0.vm04.stdout:9/777: dread d2/d3/d18/f8f [0,4194304] 0
2026-03-10T06:23:10.553 INFO:tasks.workunit.client.0.vm04.stdout:9/778: chown d2/lcd 538372234 1
2026-03-10T06:23:10.554 INFO:tasks.workunit.client.0.vm04.stdout:6/748: rename d2/d43/d2d/f42 to d2/d43/d2d/d30/d1f/ff5 0
2026-03-10T06:23:10.555 INFO:tasks.workunit.client.0.vm04.stdout:8/739: dread df/d15/f5d [0,4194304] 0
2026-03-10T06:23:10.569 INFO:tasks.workunit.client.0.vm04.stdout:5/694: dread d4/d11/f3f [0,4194304] 0
2026-03-10T06:23:10.570 INFO:tasks.workunit.client.0.vm04.stdout:3/710: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f75 [0,4194304] 0
2026-03-10T06:23:10.578 INFO:tasks.workunit.client.0.vm04.stdout:3/711: write d4/d6/d91/fe5 [524770,33112] 0
2026-03-10T06:23:10.579 INFO:tasks.workunit.client.0.vm04.stdout:2/697: mkdir d1/dae/dd6 0
2026-03-10T06:23:10.585 INFO:tasks.workunit.client.0.vm04.stdout:3/712: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f75 [0,4194304] 0
2026-03-10T06:23:10.587 INFO:tasks.workunit.client.0.vm04.stdout:7/698: write d4/df/d12/d13/d25/d28/d3a/d58/f77 [777791,64069] 0
2026-03-10T06:23:10.588 INFO:tasks.workunit.client.0.vm04.stdout:3/713: fsync d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f7d 0
2026-03-10T06:23:10.595 INFO:tasks.workunit.client.0.vm04.stdout:6/749: symlink d2/d3a/d5e/db5/lf6 0
2026-03-10T06:23:10.597 INFO:tasks.workunit.client.0.vm04.stdout:4/745: truncate d2/d32/d5c/d76/dd7/d56/fa7 332840 0
2026-03-10T06:23:10.597 INFO:tasks.workunit.client.0.vm04.stdout:4/746: readlink d2/d32/d5c/d76/dd7/d31/d42/db9/l59 0
2026-03-10T06:23:10.597 INFO:tasks.workunit.client.0.vm04.stdout:5/695: unlink d4/d11/f3f 0
2026-03-10T06:23:10.599 INFO:tasks.workunit.client.0.vm04.stdout:5/696: fsync d4/d11/d7d/f90 0
2026-03-10T06:23:10.600 INFO:tasks.workunit.client.0.vm04.stdout:0/788: link d0/d5/d97/dc0/dd8/dff/d9c/dbf/ffd d0/d1a/db8/df7/f108 0
2026-03-10T06:23:10.609 INFO:tasks.workunit.client.0.vm04.stdout:2/698: dread d1/dae/f5a [0,4194304] 0
2026-03-10T06:23:10.619 INFO:tasks.workunit.client.0.vm04.stdout:1/725: symlink d0/d3/d41/dc2/d101/l10b 0
2026-03-10T06:23:10.622 INFO:tasks.workunit.client.0.vm04.stdout:6/750: stat d2/d43/d2d/d30/l8d 0
2026-03-10T06:23:10.623 INFO:tasks.workunit.client.0.vm04.stdout:4/747: rmdir d2/d32/d5c/d76/dd7/d31/d3f/d93 39
2026-03-10T06:23:10.625 INFO:tasks.workunit.client.0.vm04.stdout:3/714: symlink d4/da/df/d11/d5a/d5b/ddf/le8 0
2026-03-10T06:23:10.625 INFO:tasks.workunit.client.0.vm04.stdout:6/751: write d2/d37/d6e/fdd [141751,23908] 0
2026-03-10T06:23:10.628 INFO:tasks.workunit.client.0.vm04.stdout:3/715: read - d4/da/df/d11/d50/dc8/fcb zero size
2026-03-10T06:23:10.629 INFO:tasks.workunit.client.0.vm04.stdout:0/789: sync
2026-03-10T06:23:10.634 INFO:tasks.workunit.client.0.vm04.stdout:0/790: chown d0/d5/d25/dd/d3a/fce 1 1
2026-03-10T06:23:10.634 INFO:tasks.workunit.client.0.vm04.stdout:1/726: dwrite d0/d3/f33 [0,4194304] 0
2026-03-10T06:23:10.635 INFO:tasks.workunit.client.0.vm04.stdout:1/727: chown d0/d3/fd4 2081075266 1
2026-03-10T06:23:10.636 INFO:tasks.workunit.client.0.vm04.stdout:8/740: rename df/d15/d29/l3b to df/d15/d29/da3/db8/dc1/d97/d67/led 0
2026-03-10T06:23:10.636 INFO:tasks.workunit.client.0.vm04.stdout:0/791: sync
2026-03-10T06:23:10.644 INFO:tasks.workunit.client.0.vm04.stdout:1/728: dwrite d0/d8/d46/d7a/fce [0,4194304] 0
2026-03-10T06:23:10.646 INFO:tasks.workunit.client.0.vm04.stdout:6/752: creat d2/d43/d2d/d30/d34/da8/ff7 x:0 0 0
2026-03-10T06:23:10.646 INFO:tasks.workunit.client.0.vm04.stdout:5/697: write d4/d6/fa [1845956,52901] 0
2026-03-10T06:23:10.647 INFO:tasks.workunit.client.0.vm04.stdout:9/779: dwrite d2/d3/d18/d34/fc7 [0,4194304] 0
2026-03-10T06:23:10.658 INFO:tasks.workunit.client.0.vm04.stdout:7/699: creat d4/df/dd8/d9c/db1/fff x:0 0 0
2026-03-10T06:23:10.659 INFO:tasks.workunit.client.0.vm04.stdout:0/792: symlink d0/d5/d97/dc0/dd8/dff/d59/l109 0
2026-03-10T06:23:10.675 INFO:tasks.workunit.client.0.vm04.stdout:4/748: creat d2/d32/d5c/d76/dd7/d2c/d6b/dd1/ff0 x:0 0 0
2026-03-10T06:23:10.678 INFO:tasks.workunit.client.0.vm04.stdout:9/780: read - d2/d3/d18/d34/fe2 zero size
2026-03-10T06:23:10.679 INFO:tasks.workunit.client.0.vm04.stdout:3/716: rename d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f75 to d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/fe9 0
2026-03-10T06:23:10.682 INFO:tasks.workunit.client.0.vm04.stdout:0/793: rmdir d0/d1a/db8 39
2026-03-10T06:23:10.684 INFO:tasks.workunit.client.0.vm04.stdout:0/794: read - d0/d5/d97/dc0/dd8/dff/d59/f9f zero size
2026-03-10T06:23:10.684 INFO:tasks.workunit.client.0.vm04.stdout:1/729: mkdir d0/d8/d46/d7a/d95/df3/d10c 0
2026-03-10T06:23:10.685 INFO:tasks.workunit.client.0.vm04.stdout:2/699: link d1/db/d72/daf/cbe d1/dae/d11/d14/d9f/cd7 0
2026-03-10T06:23:10.685 INFO:tasks.workunit.client.0.vm04.stdout:6/753: dwrite d2/d43/d2d/d30/d34/f4d [0,4194304] 0
2026-03-10T06:23:10.686 INFO:tasks.workunit.client.0.vm04.stdout:8/741: creat df/d15/fee x:0 0 0
2026-03-10T06:23:10.691 INFO:tasks.workunit.client.0.vm04.stdout:4/749: truncate d2/f14 7218378 0
2026-03-10T06:23:10.694 INFO:tasks.workunit.client.0.vm04.stdout:5/698: dread d4/d11/f32 [0,4194304] 0
2026-03-10T06:23:10.694 INFO:tasks.workunit.client.0.vm04.stdout:7/700: mkdir d4/df/d12/d13/d25/d28/d3a/d100 0
2026-03-10T06:23:10.707 INFO:tasks.workunit.client.0.vm04.stdout:8/742: truncate df/d20/d25/d30/d55/f8d 4882266 0
2026-03-10T06:23:10.708 INFO:tasks.workunit.client.0.vm04.stdout:0/795: fdatasync d0/d5/d25/dd/d5c/d73/d82/faf 0
2026-03-10T06:23:10.708 INFO:tasks.workunit.client.0.vm04.stdout:2/700: dread d1/fa5 [0,4194304] 0
2026-03-10T06:23:10.712 INFO:tasks.workunit.client.0.vm04.stdout:9/781: mkdir d2/d3/d18/de9/d116/d11d 0
2026-03-10T06:23:10.717 INFO:tasks.workunit.client.0.vm04.stdout:0/796: dwrite d0/d5/d25/dd/d5c/fb2 [4194304,4194304] 0
2026-03-10T06:23:10.717 INFO:tasks.workunit.client.0.vm04.stdout:5/699: truncate d4/d6/f23 3662271 0
2026-03-10T06:23:10.724 INFO:tasks.workunit.client.0.vm04.stdout:2/701: dwrite d1/dae/d11/d14/d4e/f5c [0,4194304] 0
2026-03-10T06:23:10.729 INFO:tasks.workunit.client.0.vm04.stdout:9/782: sync
2026-03-10T06:23:10.739 INFO:tasks.workunit.client.0.vm04.stdout:2/702: dread d1/dae/f5a [0,4194304] 0
2026-03-10T06:23:10.745 INFO:tasks.workunit.client.0.vm04.stdout:3/717: rename d4/d6/d38/l8a to d4/da/df/d11/d5a/d5b/ddf/lea 0
2026-03-10T06:23:10.746 INFO:tasks.workunit.client.0.vm04.stdout:0/797: mkdir d0/d5/d97/d10a 0
2026-03-10T06:23:10.746 INFO:tasks.workunit.client.0.vm04.stdout:0/798: stat d0/d5/d25/f5f 0
2026-03-10T06:23:10.747 INFO:tasks.workunit.client.0.vm04.stdout:7/701: write d4/df/f60 [4602657,94416] 0
2026-03-10T06:23:10.750 INFO:tasks.workunit.client.0.vm04.stdout:5/700: mkdir d4/d11/d7d/dae/df3 0
2026-03-10T06:23:10.756 INFO:tasks.workunit.client.0.vm04.stdout:8/743: unlink df/l23 0
2026-03-10T06:23:10.760 INFO:tasks.workunit.client.0.vm04.stdout:5/701: mknod d4/d6/d80/dd9/cf4 0
2026-03-10T06:23:10.761 INFO:tasks.workunit.client.0.vm04.stdout:6/754: link d2/d43/d2d/d30/ld5 d2/d43/d2d/d30/lf8 0
2026-03-10T06:23:10.765 INFO:tasks.workunit.client.0.vm04.stdout:2/703: creat d1/dae/d11/d14/d6a/fd8 x:0 0 0
2026-03-10T06:23:10.770 INFO:tasks.workunit.client.0.vm04.stdout:1/730: link d0/d3/f3b d0/d8/d46/d7a/d95/f10d 0
2026-03-10T06:23:10.779 INFO:tasks.workunit.client.0.vm04.stdout:3/718: mkdir d4/deb 0
2026-03-10T06:23:10.782 INFO:tasks.workunit.client.0.vm04.stdout:3/719: chown d4/da/df/d11/d5a/d5b/ddf/d21/c3c 317483 1
2026-03-10T06:23:10.784 INFO:tasks.workunit.client.0.vm04.stdout:4/750: link d2/d32/d5c/d76/dd7/d2c/d6b/c83 d2/d32/d5c/d76/dd7/d31/d3f/cf1 0
2026-03-10T06:23:10.785 INFO:tasks.workunit.client.0.vm04.stdout:4/751: chown d2/d32/d94 1 1
2026-03-10T06:23:10.789 INFO:tasks.workunit.client.0.vm04.stdout:0/799: symlink d0/d5/d97/dc0/dd8/dff/d59/d63/dfb/l10b 0
2026-03-10T06:23:10.795 INFO:tasks.workunit.client.0.vm04.stdout:9/783: write d2/d3/d18/de9/f83 [194320,29750] 0
2026-03-10T06:23:10.812 INFO:tasks.workunit.client.0.vm04.stdout:7/702: link d4/df/d12/f18 d4/df/d12/d13/db3/ded/f101 0
2026-03-10T06:23:10.858 INFO:tasks.workunit.client.0.vm04.stdout:5/702: write d4/d6/d80/d84/f9c [204211,42707] 0
2026-03-10T06:23:10.859 INFO:tasks.workunit.client.0.vm04.stdout:6/755: readlink d2/d8/l5b 0
2026-03-10T06:23:10.859 INFO:tasks.workunit.client.0.vm04.stdout:8/744: mknod df/d15/d2b/d81/de1/cef 0
2026-03-10T06:23:10.859 INFO:tasks.workunit.client.0.vm04.stdout:3/720: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/dec 0
2026-03-10T06:23:10.860 INFO:tasks.workunit.client.0.vm04.stdout:9/784: mknod d2/de0/c11e 0
2026-03-10T06:23:10.867 INFO:tasks.workunit.client.0.vm04.stdout:2/704: rename d1/db/f12 to d1/db/d69/d74/d87/dcf/d8f/fd9 0
2026-03-10T06:23:10.868 INFO:tasks.workunit.client.0.vm04.stdout:4/752: creat d2/d32/d5c/d76/dd7/d31/d3f/d93/ff2 x:0 0 0
2026-03-10T06:23:10.872 INFO:tasks.workunit.client.0.vm04.stdout:8/745: mkdir df/d15/d2b/d81/d9a/dbe/df0 0
2026-03-10T06:23:10.872 INFO:tasks.workunit.client.0.vm04.stdout:8/746: fdatasync df/d20/d25/d30/d55/fe9 0
2026-03-10T06:23:10.877 INFO:tasks.workunit.client.0.vm04.stdout:4/753: dwrite d2/d32/d5c/d76/dd7/d31/d3f/f52 [4194304,4194304] 0
2026-03-10T06:23:10.880 INFO:tasks.workunit.client.0.vm04.stdout:9/785: mknod d2/de0/d1d/d64/c11f 0
2026-03-10T06:23:10.905 INFO:tasks.workunit.client.0.vm04.stdout:4/754: sync
2026-03-10T06:23:10.905 INFO:tasks.workunit.client.0.vm04.stdout:0/800: rename d0/d5/d97/dc0/dd8/dff/f26 to d0/d5/d25/dd/d92/f10c 0
2026-03-10T06:23:10.920 INFO:tasks.workunit.client.0.vm04.stdout:2/705: mknod d1/db/d69/d74/d87/dcf/d8f/d48/d67/cda 0
2026-03-10T06:23:10.929 INFO:tasks.workunit.client.0.vm04.stdout:9/786: rmdir d2/d3 39
2026-03-10T06:23:10.949 INFO:tasks.workunit.client.0.vm04.stdout:1/731: dread d0/d3/f98 [0,4194304] 0
2026-03-10T06:23:10.969 INFO:tasks.workunit.client.0.vm04.stdout:5/703: symlink d4/d6/d80/dd9/df0/lf5 0
2026-03-10T06:23:10.972 INFO:tasks.workunit.client.0.vm04.stdout:6/756: dwrite d2/d43/d9b/fea [0,4194304] 0
2026-03-10T06:23:10.974 INFO:tasks.workunit.client.0.vm04.stdout:3/721: dwrite d4/da/df/fb6 [0,4194304] 0
2026-03-10T06:23:10.977 INFO:tasks.workunit.client.0.vm04.stdout:4/755: rename d2/d32/d5c/d4f/cc4 to d2/d32/d94/d99/cf3 0
2026-03-10T06:23:10.980 INFO:tasks.workunit.client.0.vm04.stdout:0/801: mknod d0/d1a/d20/df5/d47/d8a/de3/c10d 0
2026-03-10T06:23:11.014 INFO:tasks.workunit.client.0.vm04.stdout:2/706: dread d1/db/d72/d94/f97 [0,4194304] 0
2026-03-10T06:23:11.025 INFO:tasks.workunit.client.0.vm04.stdout:9/787: dwrite d2/d3/d18/d39/d46/fac [0,4194304] 0
2026-03-10T06:23:11.037 INFO:tasks.workunit.client.0.vm04.stdout:1/732: dread d0/d3/d41/d4b/f6b [0,4194304] 0
2026-03-10T06:23:11.039 INFO:tasks.workunit.client.0.vm04.stdout:5/704: truncate d4/d3b/f71 1607072 0
2026-03-10T06:23:11.044 INFO:tasks.workunit.client.0.vm04.stdout:0/802: creat d0/d5/d97/dc0/dd8/dff/d59/f10e x:0 0 0
2026-03-10T06:23:11.047 INFO:tasks.workunit.client.0.vm04.stdout:2/707: rename d1/db/d72 to d1/dae/d11/d14/d9f/ddb 0
2026-03-10T06:23:11.055 INFO:tasks.workunit.client.0.vm04.stdout:6/757: dwrite d2/d43/d2d/d30/d34/d76/d8a/fab [0,4194304] 0
2026-03-10T06:23:11.055 INFO:tasks.workunit.client.0.vm04.stdout:8/747: creat df/d20/ff1 x:0 0 0
2026-03-10T06:23:11.076 INFO:tasks.workunit.client.0.vm04.stdout:7/703: getdents d4/df/d12/d13/d25/dcb 0
2026-03-10T06:23:11.077 INFO:tasks.workunit.client.0.vm04.stdout:1/733: readlink d0/d3/d41/d4b/d5b/le0 0
2026-03-10T06:23:11.079 INFO:tasks.workunit.client.0.vm04.stdout:3/722: creat d4/d6/d38/dcc/fed x:0 0 0
2026-03-10T06:23:11.097 INFO:tasks.workunit.client.0.vm04.stdout:9/788: write d2/d8/d3a/fd8 [436548,112656] 0
2026-03-10T06:23:11.097 INFO:tasks.workunit.client.0.vm04.stdout:7/704: mkdir d4/df/dd8/d102 0
2026-03-10T06:23:11.098 INFO:tasks.workunit.client.0.vm04.stdout:9/789: chown d2/de0/l3d 24403 1
2026-03-10T06:23:11.100 INFO:tasks.workunit.client.0.vm04.stdout:9/790: write d2/d8/d22/daa/ff9 [1435826,20616] 0
2026-03-10T06:23:11.101 INFO:tasks.workunit.client.0.vm04.stdout:9/791: read d2/d3/d18/d39/d11/da5/df5/ffc [4052357,54969] 0
2026-03-10T06:23:11.107 INFO:tasks.workunit.client.0.vm04.stdout:3/723: rmdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39 39
2026-03-10T06:23:11.107 INFO:tasks.workunit.client.0.vm04.stdout:4/756: link d2/d32/d5c/d76/dd7/d56/c58 d2/dde/cf4 0
2026-03-10T06:23:11.109 INFO:tasks.workunit.client.0.vm04.stdout:0/803: symlink d0/d5/d97/d10a/l10f 0
2026-03-10T06:23:11.110 INFO:tasks.workunit.client.0.vm04.stdout:6/758: mkdir d2/d37/d83/dc1/df9 0
2026-03-10T06:23:11.112 INFO:tasks.workunit.client.0.vm04.stdout:0/804: dread d0/d1a/d20/df5/d47/d8a/fbe [0,4194304] 0
2026-03-10T06:23:11.112 INFO:tasks.workunit.client.0.vm04.stdout:8/748: write df/d20/d25/d30/d65/f82 [329916,17577] 0
2026-03-10T06:23:11.114 INFO:tasks.workunit.client.0.vm04.stdout:0/805: write d0/d1a/f101 [84699,17081] 0
2026-03-10T06:23:11.116 INFO:tasks.workunit.client.0.vm04.stdout:7/705: mkdir d4/df/d12/d34/d103 0
2026-03-10T06:23:11.116 INFO:tasks.workunit.client.0.vm04.stdout:6/759: dwrite d2/d37/d6e/fa9 [0,4194304] 0
2026-03-10T06:23:11.122 INFO:tasks.workunit.client.0.vm04.stdout:7/706: dwrite d4/df/d12/d13/d25/d28/d3a/d58/f77 [0,4194304] 0
2026-03-10T06:23:11.130 INFO:tasks.workunit.client.0.vm04.stdout:3/724: sync
2026-03-10T06:23:11.137 INFO:tasks.workunit.client.0.vm04.stdout:4/757: dread - d2/d46/fcb zero size
2026-03-10T06:23:11.168 INFO:tasks.workunit.client.0.vm04.stdout:7/707: chown d4/df/d12/d13/db3/lfc 1906373304 1
2026-03-10T06:23:11.171 INFO:tasks.workunit.client.0.vm04.stdout:3/725: mkdir d4/d6/d54/dee 0
2026-03-10T06:23:11.181 INFO:tasks.workunit.client.0.vm04.stdout:5/705: rename d4/d11/d7d/f44 to d4/d6/ff6 0
2026-03-10T06:23:11.185 INFO:tasks.workunit.client.0.vm04.stdout:1/734: write d0/d8/d46/d7a/d95/f10d [3240256,57233] 0
2026-03-10T06:23:11.186 INFO:tasks.workunit.client.0.vm04.stdout:1/735: chown d0/d3/f24 2 1
2026-03-10T06:23:11.192 INFO:tasks.workunit.client.0.vm04.stdout:8/749: dwrite df/d15/d29/da3/db8/dc1/d97/d67/fa4 [0,4194304] 0
2026-03-10T06:23:11.193 INFO:tasks.workunit.client.0.vm04.stdout:8/750: readlink df/d20/d25/d87/leb 0
2026-03-10T06:23:11.202 INFO:tasks.workunit.client.0.vm04.stdout:4/758: creat d2/d32/d5c/d76/dd7/d2c/d9a/ff5 x:0 0 0
2026-03-10T06:23:11.206 INFO:tasks.workunit.client.0.vm04.stdout:7/708: mkdir d4/df/dd8/d9c/db1/dc4/d104 0
2026-03-10T06:23:11.207 INFO:tasks.workunit.client.0.vm04.stdout:7/709: dread - d4/df/d12/d13/ff2 zero size
2026-03-10T06:23:11.208 INFO:tasks.workunit.client.0.vm04.stdout:3/726: dread - d4/fa7 zero size
2026-03-10T06:23:11.211 INFO:tasks.workunit.client.0.vm04.stdout:2/708: rename d1/dae/d2c/dd3 to d1/db/d69/d74/d87/dcf/d8f/ddc 0
2026-03-10T06:23:11.217 INFO:tasks.workunit.client.0.vm04.stdout:1/736: chown d0/d8/f21 121 1
2026-03-10T06:23:11.217 INFO:tasks.workunit.client.0.vm04.stdout:1/737: chown d0/d8/d46/d7a/d95/fc8 107778048 1
2026-03-10T06:23:11.228 INFO:tasks.workunit.client.0.vm04.stdout:8/751: fdatasync df/d15/f24 0
2026-03-10T06:23:11.236 INFO:tasks.workunit.client.0.vm04.stdout:9/792: dwrite d2/f1e [0,4194304] 0
2026-03-10T06:23:11.246 INFO:tasks.workunit.client.0.vm04.stdout:6/760: truncate d2/d3a/d5e/fa4 4023128 0
2026-03-10T06:23:11.251 INFO:tasks.workunit.client.0.vm04.stdout:1/738: rmdir d0/d8/d46/dcf 39
2026-03-10T06:23:11.254 INFO:tasks.workunit.client.0.vm04.stdout:5/706: dwrite d4/d6/d37/fcc [0,4194304] 0
2026-03-10T06:23:11.256 INFO:tasks.workunit.client.0.vm04.stdout:0/806: getdents d0/d5/d97/dc0 0
2026-03-10T06:23:11.277 INFO:tasks.workunit.client.0.vm04.stdout:7/710: mkdir d4/d105 0
2026-03-10T06:23:11.281 INFO:tasks.workunit.client.0.vm04.stdout:3/727: unlink d4/da/df/d11/d5a/d5b/ddf/d21/d32/cdd 0
2026-03-10T06:23:11.281 INFO:tasks.workunit.client.0.vm04.stdout:3/728: chown d4/da/cd7 0 1
2026-03-10T06:23:11.281 INFO:tasks.workunit.client.0.vm04.stdout:3/729: read - d4/d6/d99/fe4 zero size
2026-03-10T06:23:11.288 INFO:tasks.workunit.client.0.vm04.stdout:8/752: dwrite df/d15/d2b/f33 [4194304,4194304] 0
2026-03-10T06:23:11.290 INFO:tasks.workunit.client.0.vm04.stdout:6/761: mkdir d2/d43/d2d/d30/d1f/d3c/dfa 0
2026-03-10T06:23:11.291 INFO:tasks.workunit.client.0.vm04.stdout:6/762: chown d2/d43/d2d/d30/d1f/fbc 53 1
2026-03-10T06:23:11.292 INFO:tasks.workunit.client.0.vm04.stdout:6/763: read - d2/d43/d2d/d30/d1f/fbc zero size
2026-03-10T06:23:11.293 INFO:tasks.workunit.client.0.vm04.stdout:2/709: mknod d1/dae/d11/d14/d9f/ddb/d94/dbb/cdd 0
2026-03-10T06:23:11.300 INFO:tasks.workunit.client.0.vm04.stdout:5/707: write d4/d11/d7d/d38/d91/d55/d72/fdd [3329021,1087] 0
2026-03-10T06:23:11.300 INFO:tasks.workunit.client.0.vm04.stdout:5/708: chown d4/d6/d81/fc3 249111 1
2026-03-10T06:23:11.308 INFO:tasks.workunit.client.0.vm04.stdout:2/710: dread d1/dae/d11/f7e [0,4194304] 0
2026-03-10T06:23:11.316 INFO:tasks.workunit.client.0.vm04.stdout:2/711: dread d1/dae/d2c/f4a [0,4194304] 0
2026-03-10T06:23:11.346 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:11 vm04.local ceph-mon[51058]: pgmap v34: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 47 MiB/s rd, 126 MiB/s wr, 291 op/s
2026-03-10T06:23:11.346 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:11 vm04.local ceph-mon[51058]: from='client.14704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:23:11.346 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:11 vm04.local ceph-mon[51058]: Upgrade: Updating alertmanager.vm04
2026-03-10T06:23:11.346 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:11 vm04.local ceph-mon[51058]: Deploying daemon alertmanager.vm04 on vm04
2026-03-10T06:23:11.351 INFO:tasks.workunit.client.0.vm04.stdout:4/759: rename d2/d32/d5c/d98/lc3 to d2/d32/d5c/d76/dd7/d31/lf6 0
2026-03-10T06:23:11.352 INFO:tasks.workunit.client.0.vm04.stdout:4/760: readlink d2/d32/l7b 0
2026-03-10T06:23:11.364 INFO:tasks.workunit.client.0.vm04.stdout:2/712: creat d1/db/d69/d74/d87/dcf/d8f/d35/fde x:0 0 0
2026-03-10T06:23:11.369 INFO:tasks.workunit.client.0.vm04.stdout:7/711: mkdir d4/df/d12/d13/d25/d28/d3a/d100/d106 0
2026-03-10T06:23:11.376 INFO:tasks.workunit.client.0.vm04.stdout:9/793: rename d2/d3/d18/d34/c3f to d2/de0/da3/c120 0
2026-03-10T06:23:11.380 INFO:tasks.workunit.client.0.vm04.stdout:5/709: symlink d4/d11/d7d/d38/d91/lf7 0
2026-03-10T06:23:11.381 INFO:tasks.workunit.client.0.vm04.stdout:5/710: chown d4/d6/d80/dd9/df0 1297661667 1
2026-03-10T06:23:11.384 INFO:tasks.workunit.client.0.vm04.stdout:0/807: creat d0/d1a/d20/df5/f110 x:0 0 0
2026-03-10T06:23:11.388 INFO:tasks.workunit.client.0.vm04.stdout:2/713: dread d1/f57 [0,4194304] 0
2026-03-10T06:23:11.389 INFO:tasks.workunit.client.0.vm04.stdout:8/753: rmdir df/d15/d2b/dd8 0
2026-03-10T06:23:11.394 INFO:tasks.workunit.client.0.vm04.stdout:6/764: mknod d2/d43/d2d/d30/d34/dae/cfb 0
2026-03-10T06:23:11.399 INFO:tasks.workunit.client.0.vm04.stdout:5/711: chown d4/d6/d50/c5c 72513172 1
2026-03-10T06:23:11.400 INFO:tasks.workunit.client.0.vm04.stdout:5/712: readlink d4/d6/d80/lbc 0
2026-03-10T06:23:11.401 INFO:tasks.workunit.client.0.vm04.stdout:0/808: mkdir d0/d1a/d20/df5/d79/d111 0
2026-03-10T06:23:11.401 INFO:tasks.workunit.client.0.vm04.stdout:0/809: readlink d0/d5/d97/dc0/dd8/dff/d9c/dbf/leb 0
2026-03-10T06:23:11.403 INFO:tasks.workunit.client.0.vm04.stdout:8/754: creat df/d20/d25/d87/ff2 x:0 0 0
2026-03-10T06:23:11.405 INFO:tasks.workunit.client.0.vm04.stdout:6/765: rmdir d2/d8/d78 39
2026-03-10T06:23:11.407 INFO:tasks.workunit.client.0.vm04.stdout:6/766: read d2/d43/d2d/d30/d1f/d3c/f65 [715373,74518] 0
2026-03-10T06:23:11.407 INFO:tasks.workunit.client.0.vm04.stdout:4/761: creat d2/d32/d5c/d76/dd7/d2c/ff7 x:0 0 0
2026-03-10T06:23:11.409 INFO:tasks.workunit.client.0.vm04.stdout:5/713: mknod d4/d6/d80/cf8 0
2026-03-10T06:23:11.411 INFO:tasks.workunit.client.0.vm04.stdout:6/767: dwrite d2/d43/d9b/fea [0,4194304] 0
2026-03-10T06:23:11.412 INFO:tasks.workunit.client.0.vm04.stdout:6/768: dread - d2/fe0 zero size
2026-03-10T06:23:11.420 INFO:tasks.workunit.client.0.vm04.stdout:5/714: dread d4/d11/d7d/dae/fb2 [0,4194304] 0
2026-03-10T06:23:11.420 INFO:tasks.workunit.client.0.vm04.stdout:5/715: readlink d4/d11/d7d/d52/l58 0
2026-03-10T06:23:11.422 INFO:tasks.workunit.client.0.vm04.stdout:8/755: creat df/d15/d2b/ff3 x:0 0 0
2026-03-10T06:23:11.425 INFO:tasks.workunit.client.0.vm04.stdout:7/712: rename d4/df/d12/d34/f80 to d4/df/f107 0
2026-03-10T06:23:11.429 INFO:tasks.workunit.client.0.vm04.stdout:1/739: dwrite d0/d8/fab [0,4194304] 0
2026-03-10T06:23:11.431 INFO:tasks.workunit.client.0.vm04.stdout:4/762: read - d2/d32/d5c/d76/dd7/da3/fa4 zero size
2026-03-10T06:23:11.432 INFO:tasks.workunit.client.0.vm04.stdout:5/716: dread d4/d6/d80/fd1 [0,4194304] 0
2026-03-10T06:23:11.437 INFO:tasks.workunit.client.0.vm04.stdout:3/730: write d4/da/df/d11/d5a/d5b/ddf/d21/d2c/f7c [473903,114329] 0
2026-03-10T06:23:11.440 INFO:tasks.workunit.client.0.vm04.stdout:8/756: creat df/d20/d25/d73/ff4 x:0 0 0
2026-03-10T06:23:11.447 INFO:tasks.workunit.client.0.vm04.stdout:7/713: mkdir d4/df/d12/d13/d25/d30/d40/d108 0
2026-03-10T06:23:11.447 INFO:tasks.workunit.client.0.vm04.stdout:7/714: dread - d4/df/d12/d13/d25/d8f/fe2 zero size
2026-03-10T06:23:11.448 INFO:tasks.workunit.client.0.vm04.stdout:7/715: fdatasync d4/df/d12/f7f 0
2026-03-10T06:23:11.463 INFO:tasks.workunit.client.0.vm04.stdout:1/740: dwrite d0/d3/d41/f75 [0,4194304] 0
2026-03-10T06:23:11.471 INFO:tasks.workunit.client.0.vm04.stdout:9/794: write d2/d3/d18/d39/d46/fbc [889118,54047] 0
2026-03-10T06:23:11.478 INFO:tasks.workunit.client.0.vm04.stdout:5/717: creat d4/d11/d7d/d52/ff9 x:0 0 0
2026-03-10T06:23:11.479 INFO:tasks.workunit.client.0.vm04.stdout:8/757: stat df/d20/d25/f39 0
2026-03-10T06:23:11.480 INFO:tasks.workunit.client.0.vm04.stdout:8/758: write df/d15/f1e [1745651,118640] 0
2026-03-10T06:23:11.482 INFO:tasks.workunit.client.0.vm04.stdout:2/714: rename d1/dae/d11/d14/d9f/ddb/d94/cb4 to d1/db/d69/d74/d87/dcf/d8f/d35/cdf 0
2026-03-10T06:23:11.484 INFO:tasks.workunit.client.0.vm04.stdout:3/731: sync
2026-03-10T06:23:11.493 INFO:tasks.workunit.client.0.vm04.stdout:9/795: rmdir d2/de5 39
2026-03-10T06:23:11.493 INFO:tasks.workunit.client.0.vm04.stdout:5/718: rmdir d4/d11/d7d/d38/d91/d4c/d98 39
2026-03-10T06:23:11.495 INFO:tasks.workunit.client.0.vm04.stdout:8/759: symlink df/d15/d29/da3/db8/lf5 0
2026-03-10T06:23:11.497 INFO:tasks.workunit.client.0.vm04.stdout:6/769: rename d2/d43/d2d/d30/d1f/d3c/fe4 to d2/d8/d78/ffc 0
2026-03-10T06:23:11.505 INFO:tasks.workunit.client.0.vm04.stdout:9/796: dwrite d2/d3/d18/fc1 [0,4194304] 0
2026-03-10T06:23:11.509 INFO:tasks.workunit.client.0.vm04.stdout:1/741: creat d0/d8/d46/f10e x:0 0 0
2026-03-10T06:23:11.510 INFO:tasks.workunit.client.0.vm04.stdout:4/763: getdents d2/d32/d5c/d98 0
2026-03-10T06:23:11.511 INFO:tasks.workunit.client.0.vm04.stdout:9/797: read - d2/d8/d53/d6e/d89/fbb zero size
2026-03-10T06:23:11.515 INFO:tasks.workunit.client.0.vm04.stdout:0/810: dwrite d0/d5/d25/f3c [0,4194304] 0
2026-03-10T06:23:11.526 INFO:tasks.workunit.client.0.vm04.stdout:7/716: write d4/df/d12/d13/d25/d28/fd5 [318759,93245] 0
2026-03-10T06:23:11.527 INFO:tasks.workunit.client.0.vm04.stdout:7/717: write d4/df/d12/d13/d25/d28/fd5 [868985,45552] 0
2026-03-10T06:23:11.538 INFO:tasks.workunit.client.0.vm04.stdout:2/715: dwrite d1/db/d69/d74/d87/dcf/fa1 [0,4194304] 0
2026-03-10T06:23:11.546 INFO:tasks.workunit.client.0.vm04.stdout:6/770: mkdir d2/d97/dfd 0
2026-03-10T06:23:11.546 INFO:tasks.workunit.client.0.vm04.stdout:8/760: mkdir df/d20/df6 0
2026-03-10T06:23:11.550 INFO:tasks.workunit.client.0.vm04.stdout:3/732: rmdir d4/da/df/d11/d50/dc8 39
2026-03-10T06:23:11.551 INFO:tasks.workunit.client.0.vm04.stdout:9/798: rename d2/d3/d18/ddd/c85 to d2/de0/da3/c121 0
2026-03-10T06:23:11.570 INFO:tasks.workunit.client.0.vm04.stdout:7/718: symlink d4/df/d12/dd4/l109 0
2026-03-10T06:23:11.570 INFO:tasks.workunit.client.0.vm04.stdout:2/716: creat d1/dae/d2c/d37/fe0 x:0 0 0
2026-03-10T06:23:11.571 INFO:tasks.workunit.client.0.vm04.stdout:8/761: fsync df/d20/d25/d30/d55/f95 0
2026-03-10T06:23:11.574 INFO:tasks.workunit.client.0.vm04.stdout:3/733: unlink d4/d6/d91/da1/fc4 0
2026-03-10T06:23:11.574 INFO:tasks.workunit.client.0.vm04.stdout:9/799: rename d2/d8/d53/d6e/d8d to d2/de0/d1d/d64/d122 0
2026-03-10T06:23:11.576 INFO:tasks.workunit.client.0.vm04.stdout:0/811: fdatasync d0/f14 0
2026-03-10T06:23:11.576 INFO:tasks.workunit.client.0.vm04.stdout:3/734: write d4/da/df/d11/fd2 [779060,5039] 0
2026-03-10T06:23:11.576 INFO:tasks.workunit.client.0.vm04.stdout:5/719: getdents d4/d11 0
2026-03-10T06:23:11.577 INFO:tasks.workunit.client.0.vm04.stdout:1/742: link d0/d8/d46/d7a/d95/cb9 d0/d3/d41/d99/d103/c10f 0
2026-03-10T06:23:11.579 INFO:tasks.workunit.client.0.vm04.stdout:2/717: write d1/db/d69/d74/d87/dcf/d8f/f25 [6273883,114472] 0
2026-03-10T06:23:11.583 INFO:tasks.workunit.client.0.vm04.stdout:3/735: mkdir d4/d6/d92/def 0
2026-03-10T06:23:11.583 INFO:tasks.workunit.client.0.vm04.stdout:8/762: creat df/d15/d29/d89/ff7 x:0 0 0
2026-03-10T06:23:11.590 INFO:tasks.workunit.client.0.vm04.stdout:2/718: mknod d1/dae/d2c/d37/ce1 0
2026-03-10T06:23:11.596 INFO:tasks.workunit.client.0.vm04.stdout:0/812: mkdir d0/d1a/d20/df5/d47/ddd/d103/d112 0
2026-03-10T06:23:11.596 INFO:tasks.workunit.client.0.vm04.stdout:7/719: getdents d4/df/d12/d34 0
2026-03-10T06:23:11.598 INFO:tasks.workunit.client.0.vm04.stdout:8/763: dwrite df/d15/d2b/f60 [0,4194304] 0
2026-03-10T06:23:11.601 INFO:tasks.workunit.client.0.vm04.stdout:0/813: chown d0/d5/d25/dd/d92/ffe 10327211 1
2026-03-10T06:23:11.603 INFO:tasks.workunit.client.0.vm04.stdout:2/719: sync
2026-03-10T06:23:11.607 INFO:tasks.workunit.client.0.vm04.stdout:1/743: unlink d0/d3/d41/d99/d103/c10f 0
2026-03-10T06:23:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:11 vm06.local ceph-mon[58974]: pgmap v34: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 47 MiB/s rd, 126 MiB/s wr, 291 op/s
2026-03-10T06:23:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:11 vm06.local ceph-mon[58974]: from='client.14704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:23:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:11 vm06.local ceph-mon[58974]: Upgrade: Updating alertmanager.vm04
2026-03-10T06:23:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:11 vm06.local ceph-mon[58974]: Deploying daemon alertmanager.vm04 on vm04
2026-03-10T06:23:11.624 INFO:tasks.workunit.client.0.vm04.stdout:4/764: dwrite d2/d32/d5c/d76/dd7/d31/fbf [0,4194304] 0
2026-03-10T06:23:11.634 INFO:tasks.workunit.client.0.vm04.stdout:7/720: symlink d4/df/d12/d13/l10a 0
2026-03-10T06:23:11.636 INFO:tasks.workunit.client.0.vm04.stdout:9/800: dread d2/d8/d22/daa/f7c [0,4194304] 0
2026-03-10T06:23:11.642 INFO:tasks.workunit.client.0.vm04.stdout:6/771: write d2/d43/f69 [799105,104690] 0
2026-03-10T06:23:11.643 INFO:tasks.workunit.client.0.vm04.stdout:9/801: dwrite d2/d8/d53/d6e/d89/ff3 [0,4194304] 0
2026-03-10T06:23:11.644 INFO:tasks.workunit.client.0.vm04.stdout:9/802: readlink d2/d3/d18/l74 0
2026-03-10T06:23:11.647 INFO:tasks.workunit.client.0.vm04.stdout:9/803: chown d2/d3/d18/d39/d11/c9e 9649 1
2026-03-10T06:23:11.651 INFO:tasks.workunit.client.0.vm04.stdout:9/804: write d2/de0/d1d/d64/d73/f109 [731280,26171] 0
2026-03-10T06:23:11.653 INFO:tasks.workunit.client.0.vm04.stdout:7/721: dread d4/df/d12/d13/d25/d30/d40/d50/f5b [0,4194304] 0
2026-03-10T06:23:11.669 INFO:tasks.workunit.client.0.vm04.stdout:0/814: dread d0/d5/d25/dd/d5c/d73/fa5 [0,4194304] 0
2026-03-10T06:23:11.672 INFO:tasks.workunit.client.0.vm04.stdout:5/720: dwrite d4/d11/d7d/d38/d91/d4c/d98/dc0/f70 [0,4194304] 0
2026-03-10T06:23:11.672 INFO:tasks.workunit.client.0.vm04.stdout:8/764: chown df/d20/c75 724 1
2026-03-10T06:23:11.673 INFO:tasks.workunit.client.0.vm04.stdout:8/765: write df/d15/d2b/d81/f9d [3965248,4897] 0
2026-03-10T06:23:11.693 INFO:tasks.workunit.client.0.vm04.stdout:1/744: rename d0/d3/d41/f47 to d0/d8/d46/de4/dec/f110 0
2026-03-10T06:23:11.711 INFO:tasks.workunit.client.0.vm04.stdout:4/765: symlink d2/d32/d94/d99/lf8 0
2026-03-10T06:23:11.717 INFO:tasks.workunit.client.0.vm04.stdout:9/805: creat d2/de0/d1d/d64/f123 x:0 0 0
2026-03-10T06:23:11.718 INFO:tasks.workunit.client.0.vm04.stdout:9/806: readlink d2/d3/d18/de9/de7/l118 0
2026-03-10T06:23:11.719 INFO:tasks.workunit.client.0.vm04.stdout:9/807: read d2/d8/d3a/dcb/fe3 [2118424,25325] 0
2026-03-10T06:23:11.725 INFO:tasks.workunit.client.0.vm04.stdout:5/721: dread d4/f35 [0,4194304] 0
2026-03-10T06:23:11.727 INFO:tasks.workunit.client.0.vm04.stdout:0/815: write d0/d5/d25/dd/d3a/d56/f88 [2915976,128261] 0
2026-03-10T06:23:11.733 INFO:tasks.workunit.client.0.vm04.stdout:5/722: sync
2026-03-10T06:23:11.735 INFO:tasks.workunit.client.0.vm04.stdout:3/736: rename d4/da/df/d11/d5a/d5b/ddf/d21/fe6 to d4/da/df/d11/d5a/d5b/ddf/d21/ff0 0
2026-03-10T06:23:11.741 INFO:tasks.workunit.client.0.vm04.stdout:7/722: truncate d4/df/d12/f18 557081 0
2026-03-10T06:23:11.743 INFO:tasks.workunit.client.0.vm04.stdout:9/808: mknod d2/d3/d18/d39/d46/c124 0
2026-03-10T06:23:11.747 INFO:tasks.workunit.client.0.vm04.stdout:8/766: mkdir df/d15/d29/df8 0
2026-03-10T06:23:11.752 INFO:tasks.workunit.client.0.vm04.stdout:0/816: read d0/d5/fc5 [164223,79075] 0
2026-03-10T06:23:11.754 INFO:tasks.workunit.client.0.vm04.stdout:2/720: creat d1/dae/fe2 x:0 0 0
2026-03-10T06:23:11.755 INFO:tasks.workunit.client.0.vm04.stdout:1/745: creat d0/d8/d46/d7a/d95/dc5/dcc/f111 x:0 0 0
2026-03-10T06:23:11.762 INFO:tasks.workunit.client.0.vm04.stdout:5/723: dwrite d4/d11/d7d/fa6 [0,4194304] 0
2026-03-10T06:23:11.763 INFO:tasks.workunit.client.0.vm04.stdout:5/724: stat d4/d6/d80/cd2 0
2026-03-10T06:23:11.767 INFO:tasks.workunit.client.0.vm04.stdout:4/766: mknod d2/d32/d5c/cf9 0
2026-03-10T06:23:11.773 INFO:tasks.workunit.client.0.vm04.stdout:6/772: creat d2/d3a/d5e/ffe x:0 0 0
2026-03-10T06:23:11.775 INFO:tasks.workunit.client.0.vm04.stdout:8/767: creat df/d20/d25/d30/dc5/ff9 x:0 0 0
2026-03-10T06:23:11.784 INFO:tasks.workunit.client.0.vm04.stdout:3/737: mknod d4/da/df/d11/cf1 0
2026-03-10T06:23:11.790 INFO:tasks.workunit.client.0.vm04.stdout:2/721: write d1/dae/d2c/d37/d40/f64 [1145661,89396] 0
2026-03-10T06:23:11.791 INFO:tasks.workunit.client.0.vm04.stdout:0/817: dwrite d0/d1a/d20/df5/d47/d8a/fbe [0,4194304] 0
2026-03-10T06:23:11.798 INFO:tasks.workunit.client.0.vm04.stdout:5/725: dwrite d4/d6/f47 [0,4194304] 0 2026-03-10T06:23:11.802 INFO:tasks.workunit.client.0.vm04.stdout:7/723: mkdir d4/df/dd8/d102/d10b 0 2026-03-10T06:23:11.814 INFO:tasks.workunit.client.0.vm04.stdout:1/746: mkdir d0/d112 0 2026-03-10T06:23:11.815 INFO:tasks.workunit.client.0.vm04.stdout:3/738: symlink d4/d6/d91/da1/lf2 0 2026-03-10T06:23:11.815 INFO:tasks.workunit.client.0.vm04.stdout:2/722: rename d1/dae/d11/d14/d6a to d1/db/d69/d74/d87/dcf/d8f/d48/de3 0 2026-03-10T06:23:11.816 INFO:tasks.workunit.client.0.vm04.stdout:1/747: stat d0/d3/d41/d4b/d5b/fb6 0 2026-03-10T06:23:11.820 INFO:tasks.workunit.client.0.vm04.stdout:4/767: mknod d2/d32/d5c/cfa 0 2026-03-10T06:23:11.825 INFO:tasks.workunit.client.0.vm04.stdout:7/724: creat d4/df/dd8/f10c x:0 0 0 2026-03-10T06:23:11.825 INFO:tasks.workunit.client.0.vm04.stdout:7/725: read d4/df/d12/d13/d25/d30/d40/d50/f5b [685862,99985] 0 2026-03-10T06:23:11.825 INFO:tasks.workunit.client.0.vm04.stdout:7/726: fdatasync d4/df/f60 0 2026-03-10T06:23:11.825 INFO:tasks.workunit.client.0.vm04.stdout:5/726: truncate d4/d11/d7d/dab/fb8 529654 0 2026-03-10T06:23:11.826 INFO:tasks.workunit.client.0.vm04.stdout:6/773: link d2/d43/d86/l96 d2/d43/d2d/d30/d1f/d3c/lff 0 2026-03-10T06:23:11.830 INFO:tasks.workunit.client.0.vm04.stdout:7/727: creat d4/df/d12/d13/f10d x:0 0 0 2026-03-10T06:23:11.831 INFO:tasks.workunit.client.0.vm04.stdout:3/739: sync 2026-03-10T06:23:11.841 INFO:tasks.workunit.client.0.vm04.stdout:9/809: truncate d2/d8/d53/d6e/d89/ff3 1139139 0 2026-03-10T06:23:11.843 INFO:tasks.workunit.client.0.vm04.stdout:0/818: write d0/d5/d25/dd/d3a/d56/fa7 [43045,1135] 0 2026-03-10T06:23:11.844 INFO:tasks.workunit.client.0.vm04.stdout:8/768: dwrite df/d20/d25/f35 [0,4194304] 0 2026-03-10T06:23:11.847 INFO:tasks.workunit.client.0.vm04.stdout:8/769: sync 2026-03-10T06:23:11.848 INFO:tasks.workunit.client.0.vm04.stdout:8/770: write df/d15/d2b/d8a/dab/fdf [2459981,51518] 0 
2026-03-10T06:23:11.853 INFO:tasks.workunit.client.0.vm04.stdout:2/723: write d1/db/d69/d74/d87/dcf/d8f/fd9 [4293594,106694] 0 2026-03-10T06:23:11.855 INFO:tasks.workunit.client.0.vm04.stdout:5/727: write d4/d11/d7d/f31 [1868172,41294] 0 2026-03-10T06:23:11.856 INFO:tasks.workunit.client.0.vm04.stdout:4/768: dwrite d2/d32/dad/fae [0,4194304] 0 2026-03-10T06:23:11.874 INFO:tasks.workunit.client.0.vm04.stdout:6/774: chown d2/d43/cb8 324 1 2026-03-10T06:23:11.888 INFO:tasks.workunit.client.0.vm04.stdout:3/740: rename d4/da/df/d11/d5a/d5b/ddf to d4/da/df/d11/d5a/d5b/ddf/df3 22 2026-03-10T06:23:11.890 INFO:tasks.workunit.client.0.vm04.stdout:9/810: write d2/de0/d1d/d64/d122/f119 [1858172,122962] 0 2026-03-10T06:23:11.892 INFO:tasks.workunit.client.0.vm04.stdout:3/741: dwrite d4/f42 [0,4194304] 0 2026-03-10T06:23:11.901 INFO:tasks.workunit.client.0.vm04.stdout:2/724: dread d1/dae/d11/d14/d4e/fa6 [0,4194304] 0 2026-03-10T06:23:11.901 INFO:tasks.workunit.client.0.vm04.stdout:2/725: chown d1/db/d9b 1450 1 2026-03-10T06:23:11.902 INFO:tasks.workunit.client.0.vm04.stdout:2/726: chown d1/dae/d11/d14/c1f 18 1 2026-03-10T06:23:11.902 INFO:tasks.workunit.client.0.vm04.stdout:2/727: read d1/dae/d11/d14/d9f/ddb/d94/f97 [171194,35] 0 2026-03-10T06:23:11.916 INFO:tasks.workunit.client.0.vm04.stdout:0/819: dwrite d0/d1a/d20/d38/fdc [0,4194304] 0 2026-03-10T06:23:11.961 INFO:tasks.workunit.client.0.vm04.stdout:1/748: link d0/d3/fb2 d0/d8/d46/db3/dd2/d100/f113 0 2026-03-10T06:23:11.963 INFO:tasks.workunit.client.0.vm04.stdout:4/769: dwrite d2/d8/f9f [0,4194304] 0 2026-03-10T06:23:11.975 INFO:tasks.workunit.client.0.vm04.stdout:9/811: fdatasync d2/de0/d1d/f6a 0 2026-03-10T06:23:11.975 INFO:tasks.workunit.client.0.vm04.stdout:9/812: chown d2/d3/c33 4206 1 2026-03-10T06:23:11.976 INFO:tasks.workunit.client.0.vm04.stdout:9/813: chown d2/d8/d22/d4f 10639615 1 2026-03-10T06:23:11.981 INFO:tasks.workunit.client.0.vm04.stdout:6/775: dwrite d2/d43/f35 [4194304,4194304] 0 2026-03-10T06:23:11.984 
INFO:tasks.workunit.client.0.vm04.stdout:2/728: creat d1/db/d69/d74/d87/dcf/d8f/d35/d54/fe4 x:0 0 0 2026-03-10T06:23:11.987 INFO:tasks.workunit.client.0.vm04.stdout:2/729: dwrite d1/dae/d2c/f33 [0,4194304] 0 2026-03-10T06:23:12.007 INFO:tasks.workunit.client.0.vm04.stdout:0/820: dread d0/d5/d25/dd/f43 [0,4194304] 0 2026-03-10T06:23:12.011 INFO:tasks.workunit.client.0.vm04.stdout:8/771: mknod df/d20/d25/d30/d55/de7/cfa 0 2026-03-10T06:23:12.012 INFO:tasks.workunit.client.0.vm04.stdout:5/728: mknod d4/d11/d7d/d38/cfa 0 2026-03-10T06:23:12.012 INFO:tasks.workunit.client.0.vm04.stdout:1/749: unlink d0/d3/d41/d99/d103/fe5 0 2026-03-10T06:23:12.012 INFO:tasks.workunit.client.0.vm04.stdout:4/770: rmdir d2/d32/d5c/d76/dd7/d31/d42 39 2026-03-10T06:23:12.013 INFO:tasks.workunit.client.0.vm04.stdout:5/729: chown d4/d6/d37/fcc 15031 1 2026-03-10T06:23:12.014 INFO:tasks.workunit.client.0.vm04.stdout:5/730: dread - d4/d11/d7d/d38/d91/d55/f68 zero size 2026-03-10T06:23:12.017 INFO:tasks.workunit.client.0.vm04.stdout:0/821: dread d0/d1a/d20/dc2/fee [0,4194304] 0 2026-03-10T06:23:12.017 INFO:tasks.workunit.client.0.vm04.stdout:0/822: chown d0/d1a/d4d/cc3 17 1 2026-03-10T06:23:12.020 INFO:tasks.workunit.client.0.vm04.stdout:7/728: truncate d4/df/d12/d13/d25/d28/f7d 2078642 0 2026-03-10T06:23:12.030 INFO:tasks.workunit.client.0.vm04.stdout:2/730: mkdir d1/dae/d11/d14/d9f/ddb/d94/de5 0 2026-03-10T06:23:12.036 INFO:tasks.workunit.client.0.vm04.stdout:5/731: symlink d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/lfb 0 2026-03-10T06:23:12.040 INFO:tasks.workunit.client.0.vm04.stdout:5/732: dread d4/d6/f93 [0,4194304] 0 2026-03-10T06:23:12.047 INFO:tasks.workunit.client.0.vm04.stdout:8/772: dread df/d20/d25/f2a [0,4194304] 0 2026-03-10T06:23:12.047 INFO:tasks.workunit.client.0.vm04.stdout:8/773: chown df/d20/f84 478566 1 2026-03-10T06:23:12.051 INFO:tasks.workunit.client.0.vm04.stdout:9/814: rename d2/d8/d22/ffe to d2/d3/d18/d39/d46/d55/dc3/f125 0 2026-03-10T06:23:12.057 
INFO:tasks.workunit.client.0.vm04.stdout:6/776: creat d2/d37/d6e/de6/f100 x:0 0 0 2026-03-10T06:23:12.058 INFO:tasks.workunit.client.0.vm04.stdout:7/729: write d4/df/d12/d13/d25/d28/d3a/d58/f97 [202292,15977] 0 2026-03-10T06:23:12.059 INFO:tasks.workunit.client.0.vm04.stdout:7/730: stat d4/df/d12/d34/d63/lb9 0 2026-03-10T06:23:12.060 INFO:tasks.workunit.client.0.vm04.stdout:2/731: chown d1/dae/d2c/d37/d40/l61 379965 1 2026-03-10T06:23:12.062 INFO:tasks.workunit.client.0.vm04.stdout:2/732: read d1/dae/d11/f16 [6251762,115127] 0 2026-03-10T06:23:12.063 INFO:tasks.workunit.client.0.vm04.stdout:2/733: truncate d1/db/d69/d74/d87/dcf/fc5 577537 0 2026-03-10T06:23:12.074 INFO:tasks.workunit.client.0.vm04.stdout:0/823: creat d0/d1a/d20/df5/d79/d111/f113 x:0 0 0 2026-03-10T06:23:12.075 INFO:tasks.workunit.client.0.vm04.stdout:1/750: dwrite d0/d8/fe1 [0,4194304] 0 2026-03-10T06:23:12.095 INFO:tasks.workunit.client.0.vm04.stdout:4/771: rename d2/d32/d5c/d76/dd7/da3/lec to d2/d32/lfb 0 2026-03-10T06:23:12.098 INFO:tasks.workunit.client.0.vm04.stdout:5/733: write d4/d3b/f6d [3158435,11572] 0 2026-03-10T06:23:12.099 INFO:tasks.workunit.client.0.vm04.stdout:5/734: write d4/d11/f1f [3214483,85118] 0 2026-03-10T06:23:12.101 INFO:tasks.workunit.client.0.vm04.stdout:3/742: getdents d4/d6/d38/dcc 0 2026-03-10T06:23:12.104 INFO:tasks.workunit.client.0.vm04.stdout:6/777: creat d2/d3a/d5e/db5/f101 x:0 0 0 2026-03-10T06:23:12.105 INFO:tasks.workunit.client.0.vm04.stdout:9/815: dread d2/d3/d18/de9/fbe [0,4194304] 0 2026-03-10T06:23:12.114 INFO:tasks.workunit.client.0.vm04.stdout:8/774: symlink df/d20/d25/d30/lfb 0 2026-03-10T06:23:12.115 INFO:tasks.workunit.client.0.vm04.stdout:2/734: rename d1/db/d69/d74/d87/dcf/d8f/ddc/cb5 to d1/dae/d11/d14/d9f/ddb/daf/db0/ce6 0 2026-03-10T06:23:12.123 INFO:tasks.workunit.client.0.vm04.stdout:4/772: dread d2/d32/d5c/f6d [0,4194304] 0 2026-03-10T06:23:12.124 INFO:tasks.workunit.client.0.vm04.stdout:9/816: sync 2026-03-10T06:23:12.130 
INFO:tasks.workunit.client.0.vm04.stdout:0/824: write d0/d5/d25/dd/d5c/f9a [1246731,90768] 0 2026-03-10T06:23:12.131 INFO:tasks.workunit.client.0.vm04.stdout:0/825: readlink d0/d1a/d20/df5/d79/lfa 0 2026-03-10T06:23:12.132 INFO:tasks.workunit.client.0.vm04.stdout:0/826: chown d0/d1a/d20/df5/d47/d8a/d8d/fad 134 1 2026-03-10T06:23:12.133 INFO:tasks.workunit.client.0.vm04.stdout:1/751: dwrite d0/d8/d46/d7a/d95/fc8 [0,4194304] 0 2026-03-10T06:23:12.137 INFO:tasks.workunit.client.0.vm04.stdout:7/731: symlink d4/d105/l10e 0 2026-03-10T06:23:12.138 INFO:tasks.workunit.client.0.vm04.stdout:3/743: truncate d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/fd6 230604 0 2026-03-10T06:23:12.145 INFO:tasks.workunit.client.0.vm04.stdout:8/775: symlink df/d20/d25/d30/d55/lfc 0 2026-03-10T06:23:12.146 INFO:tasks.workunit.client.0.vm04.stdout:2/735: write d1/fa5 [4112507,51802] 0 2026-03-10T06:23:12.147 INFO:tasks.workunit.client.0.vm04.stdout:2/736: chown d1/dae/d11/d14/d4e/fa6 1209730 1 2026-03-10T06:23:12.151 INFO:tasks.workunit.client.0.vm04.stdout:2/737: write d1/dae/d11/d14/d9f/ddb/f7a [116667,122203] 0 2026-03-10T06:23:12.151 INFO:tasks.workunit.client.0.vm04.stdout:2/738: chown d1/dae/c73 18202 1 2026-03-10T06:23:12.151 INFO:tasks.workunit.client.0.vm04.stdout:2/739: chown d1/dae/d2c/d37/d59/f8b 0 1 2026-03-10T06:23:12.151 INFO:tasks.workunit.client.0.vm04.stdout:2/740: truncate d1/dae/d11/d14/d9f/ddb/f7a 4353079 0 2026-03-10T06:23:12.153 INFO:tasks.workunit.client.0.vm04.stdout:4/773: rename d2/d32/d5c/f41 to d2/d32/d5c/d76/dd7/d31/d3f/da1/ffc 0 2026-03-10T06:23:12.157 INFO:tasks.workunit.client.0.vm04.stdout:9/817: creat d2/d3/d18/d39/d46/d55/dc3/f126 x:0 0 0 2026-03-10T06:23:12.161 INFO:tasks.workunit.client.0.vm04.stdout:9/818: dwrite d2/d3/d18/d39/d46/fbc [0,4194304] 0 2026-03-10T06:23:12.164 INFO:tasks.workunit.client.0.vm04.stdout:0/827: fdatasync d0/d5/d25/dd/f43 0 2026-03-10T06:23:12.168 INFO:tasks.workunit.client.0.vm04.stdout:9/819: dwrite d2/d8/d53/d6e/d89/fba 
[0,4194304] 0 2026-03-10T06:23:12.185 INFO:tasks.workunit.client.0.vm04.stdout:2/741: mknod d1/db/d69/d74/d87/dcf/d8f/d48/de3/ce7 0 2026-03-10T06:23:12.191 INFO:tasks.workunit.client.0.vm04.stdout:4/774: dread d2/d32/d5c/d76/dd7/d31/d3f/d93/f9c [0,4194304] 0 2026-03-10T06:23:12.201 INFO:tasks.workunit.client.0.vm04.stdout:5/735: link d4/d11/d7d/d52/f9a d4/d11/d7d/d38/d91/d4c/d98/ffc 0 2026-03-10T06:23:12.208 INFO:tasks.workunit.client.0.vm04.stdout:6/778: link d2/d43/d2d/d30/f2b d2/d3a/d9c/f102 0 2026-03-10T06:23:12.213 INFO:tasks.workunit.client.0.vm04.stdout:3/744: write d4/da/df/d11/d5a/d5b/fa3 [358034,60547] 0 2026-03-10T06:23:12.214 INFO:tasks.workunit.client.0.vm04.stdout:3/745: write d4/fe3 [958719,1554] 0 2026-03-10T06:23:12.215 INFO:tasks.workunit.client.0.vm04.stdout:7/732: dwrite d4/df/d12/d13/d25/d28/d3a/d58/f5a [0,4194304] 0 2026-03-10T06:23:12.216 INFO:tasks.workunit.client.0.vm04.stdout:3/746: fsync d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fb0 0 2026-03-10T06:23:12.230 INFO:tasks.workunit.client.0.vm04.stdout:0/828: dwrite d0/d5/d97/dc0/fc8 [0,4194304] 0 2026-03-10T06:23:12.236 INFO:tasks.workunit.client.0.vm04.stdout:1/752: write d0/f23 [1047664,15048] 0 2026-03-10T06:23:12.241 INFO:tasks.workunit.client.0.vm04.stdout:4/775: rmdir d2/d32/d5c 39 2026-03-10T06:23:12.241 INFO:tasks.workunit.client.0.vm04.stdout:9/820: truncate d2/d23/f93 1587464 0 2026-03-10T06:23:12.241 INFO:tasks.workunit.client.0.vm04.stdout:9/821: dread d2/d3/d18/d39/d46/fbc [0,4194304] 0 2026-03-10T06:23:12.244 INFO:tasks.workunit.client.0.vm04.stdout:9/822: dread d2/d3/d18/de9/fbe [0,4194304] 0 2026-03-10T06:23:12.247 INFO:tasks.workunit.client.0.vm04.stdout:4/776: dread d2/d8/f35 [0,4194304] 0 2026-03-10T06:23:12.252 INFO:tasks.workunit.client.0.vm04.stdout:7/733: creat d4/df/d12/d13/f10f x:0 0 0 2026-03-10T06:23:12.258 INFO:tasks.workunit.client.0.vm04.stdout:7/734: chown d4/df/d12/d34/dbd/lc8 5618903 1 2026-03-10T06:23:12.258 INFO:tasks.workunit.client.0.vm04.stdout:8/776: 
rename df/d15/d2b/f4d to df/ffd 0 2026-03-10T06:23:12.258 INFO:tasks.workunit.client.0.vm04.stdout:1/753: creat d0/d3/d80/f114 x:0 0 0 2026-03-10T06:23:12.260 INFO:tasks.workunit.client.0.vm04.stdout:5/736: creat d4/d11/d7d/d38/d91/d4c/d9d/ffd x:0 0 0 2026-03-10T06:23:12.261 INFO:tasks.workunit.client.0.vm04.stdout:6/779: truncate d2/d43/d2d/d30/f32 29311 0 2026-03-10T06:23:12.262 INFO:tasks.workunit.client.0.vm04.stdout:9/823: mknod d2/de0/da3/c127 0 2026-03-10T06:23:12.275 INFO:tasks.workunit.client.0.vm04.stdout:0/829: fsync d0/d1a/d20/f85 0 2026-03-10T06:23:12.279 INFO:tasks.workunit.client.0.vm04.stdout:3/747: rename d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/dec to d4/da/df/d11/d5a/d5b/ddf/dbd/df4 0 2026-03-10T06:23:12.281 INFO:tasks.workunit.client.0.vm04.stdout:1/754: mknod d0/d3/d41/d4b/d5b/c115 0 2026-03-10T06:23:12.283 INFO:tasks.workunit.client.0.vm04.stdout:7/735: dread d4/f5 [0,4194304] 0 2026-03-10T06:23:12.286 INFO:tasks.workunit.client.0.vm04.stdout:2/742: truncate d1/fc2 3183074 0 2026-03-10T06:23:12.288 INFO:tasks.workunit.client.0.vm04.stdout:2/743: truncate d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/faa 537009 0 2026-03-10T06:23:12.296 INFO:tasks.workunit.client.0.vm04.stdout:8/777: read df/d15/f43 [4941594,61259] 0 2026-03-10T06:23:12.315 INFO:tasks.workunit.client.0.vm04.stdout:9/824: creat d2/d3/d18/d34/f128 x:0 0 0 2026-03-10T06:23:12.315 INFO:tasks.workunit.client.0.vm04.stdout:4/777: creat d2/d32/d5c/d76/dd7/d56/ffd x:0 0 0 2026-03-10T06:23:12.358 INFO:tasks.workunit.client.0.vm04.stdout:2/744: mkdir d1/dae/d11/d14/d9f/ddb/d94/dbb/de8 0 2026-03-10T06:23:12.363 INFO:tasks.workunit.client.0.vm04.stdout:6/780: mknod d2/d37/d83/dee/c103 0 2026-03-10T06:23:12.366 INFO:tasks.workunit.client.0.vm04.stdout:4/778: mkdir d2/d32/dad/dfe 0 2026-03-10T06:23:12.367 INFO:tasks.workunit.client.0.vm04.stdout:4/779: chown d2/d32 20 1 2026-03-10T06:23:12.382 INFO:tasks.workunit.client.0.vm04.stdout:1/755: mknod d0/d8/d46/d7a/d95/dc5/c116 0 2026-03-10T06:23:12.385 
INFO:tasks.workunit.client.0.vm04.stdout:4/780: dread d2/d46/fa8 [0,4194304] 0 2026-03-10T06:23:12.388 INFO:tasks.workunit.client.0.vm04.stdout:7/736: truncate d4/df/f84 98109 0 2026-03-10T06:23:12.399 INFO:tasks.workunit.client.0.vm04.stdout:3/748: link d4/d6/d91/fe5 d4/da/df/d11/ff5 0 2026-03-10T06:23:12.402 INFO:tasks.workunit.client.0.vm04.stdout:2/745: mkdir d1/dae/d11/d14/d9f/ddb/d94/de5/de9 0 2026-03-10T06:23:12.402 INFO:tasks.workunit.client.0.vm04.stdout:4/781: creat d2/d32/d5c/d76/dd7/d31/d3f/d93/fff x:0 0 0 2026-03-10T06:23:12.404 INFO:tasks.workunit.client.0.vm04.stdout:2/746: truncate d1/db/f36 8951349 0 2026-03-10T06:23:12.405 INFO:tasks.workunit.client.0.vm04.stdout:6/781: mknod d2/d43/d2d/d30/c104 0 2026-03-10T06:23:12.406 INFO:tasks.workunit.client.0.vm04.stdout:6/782: write d2/d43/f35 [3878953,55224] 0 2026-03-10T06:23:12.416 INFO:tasks.workunit.client.0.vm04.stdout:0/830: link d0/d1a/d20/df5/d47/d8a/d8d/fad d0/d1a/d20/f114 0 2026-03-10T06:23:12.418 INFO:tasks.workunit.client.0.vm04.stdout:3/749: read f1 [2077637,92453] 0 2026-03-10T06:23:12.418 INFO:tasks.workunit.client.0.vm04.stdout:1/756: fsync d0/d8/d46/fd8 0 2026-03-10T06:23:12.423 INFO:tasks.workunit.client.0.vm04.stdout:8/778: rename df/d20/d25/d30/d65/l91 to df/d20/d25/d30/lfe 0 2026-03-10T06:23:12.429 INFO:tasks.workunit.client.0.vm04.stdout:9/825: getdents d2/de5 0 2026-03-10T06:23:12.430 INFO:tasks.workunit.client.0.vm04.stdout:6/783: creat d2/d37/d6e/f105 x:0 0 0 2026-03-10T06:23:12.436 INFO:tasks.workunit.client.0.vm04.stdout:3/750: creat d4/da/df/d11/d5a/d5b/ddf/dbd/ff6 x:0 0 0 2026-03-10T06:23:12.440 INFO:tasks.workunit.client.0.vm04.stdout:3/751: dwrite d4/d6/d99/fe4 [0,4194304] 0 2026-03-10T06:23:12.444 INFO:tasks.workunit.client.0.vm04.stdout:7/737: rename d4/df/dd8 to d4/df/d12/d13/db3/d110 0 2026-03-10T06:23:12.457 INFO:tasks.workunit.client.0.vm04.stdout:8/779: read df/f17 [4078409,37329] 0 2026-03-10T06:23:12.459 INFO:tasks.workunit.client.0.vm04.stdout:4/782: unlink 
d2/d32/d5c/d76/dd7/d31/d42/db9/cd3 0 2026-03-10T06:23:12.465 INFO:tasks.workunit.client.0.vm04.stdout:9/826: symlink d2/de0/d1d/l129 0 2026-03-10T06:23:12.465 INFO:tasks.workunit.client.0.vm04.stdout:5/737: write d4/d11/d7d/d38/d91/d55/f9e [1583542,130348] 0 2026-03-10T06:23:12.465 INFO:tasks.workunit.client.0.vm04.stdout:5/738: write d4/f13 [6232680,125887] 0 2026-03-10T06:23:12.465 INFO:tasks.workunit.client.0.vm04.stdout:5/739: chown d4/d11/d7d/d38/d91/d4c/ca4 712043 1 2026-03-10T06:23:12.465 INFO:tasks.workunit.client.0.vm04.stdout:6/784: fsync d2/d43/d2d/d30/f91 0 2026-03-10T06:23:12.469 INFO:tasks.workunit.client.0.vm04.stdout:6/785: dwrite d2/d43/d2d/d30/dc0/fcd [0,4194304] 0 2026-03-10T06:23:12.471 INFO:tasks.workunit.client.0.vm04.stdout:6/786: chown d2/d43/d2d/d30/dc0 1 1 2026-03-10T06:23:12.482 INFO:tasks.workunit.client.0.vm04.stdout:7/738: fdatasync d4/df/d12/d13/f1e 0 2026-03-10T06:23:12.487 INFO:tasks.workunit.client.0.vm04.stdout:9/827: creat d2/de5/f12a x:0 0 0 2026-03-10T06:23:12.494 INFO:tasks.workunit.client.0.vm04.stdout:2/747: dwrite d1/dae/d2c/d37/fb7 [0,4194304] 0 2026-03-10T06:23:12.501 INFO:tasks.workunit.client.0.vm04.stdout:3/752: fdatasync d4/dba/fbf 0 2026-03-10T06:23:12.510 INFO:tasks.workunit.client.0.vm04.stdout:8/780: link df/d20/d25/d30/d65/d8f/fc9 df/d15/d2b/d81/d9a/dbe/df0/fff 0 2026-03-10T06:23:12.515 INFO:tasks.workunit.client.0.vm04.stdout:6/787: write d2/d3a/d9c/fba [364114,67458] 0 2026-03-10T06:23:12.515 INFO:tasks.workunit.client.0.vm04.stdout:4/783: getdents d2/d32/dad/dfe 0 2026-03-10T06:23:12.517 INFO:tasks.workunit.client.0.vm04.stdout:5/740: creat d4/d11/d7d/d38/d91/dda/ffe x:0 0 0 2026-03-10T06:23:12.518 INFO:tasks.workunit.client.0.vm04.stdout:5/741: dread - d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/fd7 zero size 2026-03-10T06:23:12.519 INFO:tasks.workunit.client.0.vm04.stdout:7/739: write d4/df/d12/dd4/f7c [2226651,129271] 0 2026-03-10T06:23:12.525 INFO:tasks.workunit.client.0.vm04.stdout:0/831: truncate 
d0/d1a/d20/df5/d47/d8a/d8d/fad 3191603 0 2026-03-10T06:23:12.530 INFO:tasks.workunit.client.0.vm04.stdout:3/753: chown d4/da/df/d11/d50/dc8/fcb 178539371 1 2026-03-10T06:23:12.530 INFO:tasks.workunit.client.0.vm04.stdout:3/754: chown d4/d6/d54/dee 1 1 2026-03-10T06:23:12.533 INFO:tasks.workunit.client.0.vm04.stdout:3/755: chown d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fb0 94103245 1 2026-03-10T06:23:12.537 INFO:tasks.workunit.client.0.vm04.stdout:2/748: dwrite d1/dae/d11/d14/d9f/ddb/d94/f97 [0,4194304] 0 2026-03-10T06:23:12.548 INFO:tasks.workunit.client.0.vm04.stdout:4/784: rmdir d2 39 2026-03-10T06:23:12.549 INFO:tasks.workunit.client.0.vm04.stdout:2/749: dread d1/dae/d11/d14/d9f/ddb/d94/f97 [0,4194304] 0 2026-03-10T06:23:12.553 INFO:tasks.workunit.client.0.vm04.stdout:5/742: write d4/d11/d7d/d38/d91/d4c/fa3 [2704731,104857] 0 2026-03-10T06:23:12.561 INFO:tasks.workunit.client.0.vm04.stdout:1/757: rename d0/d3/d41/ld1 to d0/l117 0 2026-03-10T06:23:12.565 INFO:tasks.workunit.client.0.vm04.stdout:9/828: truncate d2/d3/d18/d39/d46/fac 1858587 0 2026-03-10T06:23:12.566 INFO:tasks.workunit.client.0.vm04.stdout:0/832: write d0/d5/d97/dc0/dd8/dff/d9c/dbf/ffd [705675,55764] 0 2026-03-10T06:23:12.567 INFO:tasks.workunit.client.0.vm04.stdout:0/833: chown d0/d5/c36 60 1 2026-03-10T06:23:12.572 INFO:tasks.workunit.client.0.vm04.stdout:6/788: creat d2/d43/d2d/d30/d1f/d3c/dfa/f106 x:0 0 0 2026-03-10T06:23:12.582 INFO:tasks.workunit.client.0.vm04.stdout:7/740: getdents d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/df1 0 2026-03-10T06:23:12.583 INFO:tasks.workunit.client.0.vm04.stdout:5/743: dread d4/d11/d7d/dab/fb8 [0,4194304] 0 2026-03-10T06:23:12.585 INFO:tasks.workunit.client.0.vm04.stdout:7/741: chown d4/df/d12/dd4/fe1 198 1 2026-03-10T06:23:12.585 INFO:tasks.workunit.client.0.vm04.stdout:1/758: creat d0/d3/d41/d99/def/f118 x:0 0 0 2026-03-10T06:23:12.586 INFO:tasks.workunit.client.0.vm04.stdout:7/742: chown d4/df/d12/d13/db3/ded 0 1 2026-03-10T06:23:12.586 
INFO:tasks.workunit.client.0.vm04.stdout:1/759: write d0/f23 [1767197,49519] 0 2026-03-10T06:23:12.587 INFO:tasks.workunit.client.0.vm04.stdout:7/743: write d4/df/d12/d13/f10d [1007497,109150] 0 2026-03-10T06:23:12.596 INFO:tasks.workunit.client.0.vm04.stdout:3/756: symlink d4/da/df/lf7 0 2026-03-10T06:23:12.597 INFO:tasks.workunit.client.0.vm04.stdout:9/829: mkdir d2/d8/d53/d6e/d12b 0 2026-03-10T06:23:12.608 INFO:tasks.workunit.client.0.vm04.stdout:1/760: dwrite d0/d3/d41/d99/def/ff8 [0,4194304] 0 2026-03-10T06:23:12.609 INFO:tasks.workunit.client.0.vm04.stdout:8/781: rename df/d20/c74 to df/d15/d2b/d81/d9a/dbe/df0/c100 0 2026-03-10T06:23:12.614 INFO:tasks.workunit.client.0.vm04.stdout:8/782: write df/d15/d2b/d81/fc4 [410602,101274] 0 2026-03-10T06:23:12.615 INFO:tasks.workunit.client.0.vm04.stdout:0/834: dwrite d0/d1a/d20/d38/f78 [0,4194304] 0 2026-03-10T06:23:12.626 INFO:tasks.workunit.client.0.vm04.stdout:7/744: mknod d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/c111 0 2026-03-10T06:23:12.631 INFO:tasks.workunit.client.0.vm04.stdout:6/789: symlink d2/d3a/d5e/ddf/l107 0 2026-03-10T06:23:12.636 INFO:tasks.workunit.client.0.vm04.stdout:4/785: fdatasync d2/d32/d5c/d76/dd7/f20 0 2026-03-10T06:23:12.643 INFO:tasks.workunit.client.0.vm04.stdout:6/790: readlink d2/d8/d78/lbb 0 2026-03-10T06:23:12.643 INFO:tasks.workunit.client.0.vm04.stdout:5/744: rename d4/c46 to d4/d11/d7d/d38/d91/d4c/def/ddc/cff 0 2026-03-10T06:23:12.643 INFO:tasks.workunit.client.0.vm04.stdout:6/791: chown d2/d43/fe7 1969033 1 2026-03-10T06:23:12.647 INFO:tasks.workunit.client.0.vm04.stdout:0/835: creat d0/d5/d25/dd/d3a/d81/f115 x:0 0 0 2026-03-10T06:23:12.648 INFO:tasks.workunit.client.0.vm04.stdout:3/757: creat d4/d6/d54/dee/ff8 x:0 0 0 2026-03-10T06:23:12.648 INFO:tasks.workunit.client.0.vm04.stdout:1/761: mkdir d0/d8/d46/db3/dd2/d100/d119 0 2026-03-10T06:23:12.652 INFO:tasks.workunit.client.0.vm04.stdout:8/783: mknod df/c101 0 2026-03-10T06:23:12.656 INFO:tasks.workunit.client.0.vm04.stdout:2/750: 
write d1/dae/d2c/f58 [600452,114641] 0 2026-03-10T06:23:12.666 INFO:tasks.workunit.client.0.vm04.stdout:5/745: dwrite d4/d11/d7d/d52/f96 [0,4194304] 0 2026-03-10T06:23:12.674 INFO:tasks.workunit.client.0.vm04.stdout:1/762: symlink d0/d8/d46/dcf/l11a 0 2026-03-10T06:23:12.674 INFO:tasks.workunit.client.0.vm04.stdout:9/830: getdents d2/d3/d18/de9/dd4 0 2026-03-10T06:23:12.674 INFO:tasks.workunit.client.0.vm04.stdout:3/758: mkdir d4/da/df/df9 0 2026-03-10T06:23:12.681 INFO:tasks.workunit.client.0.vm04.stdout:0/836: link d0/d1a/d20/l72 d0/d5/d97/dc0/dd8/dff/d59/l116 0 2026-03-10T06:23:12.681 INFO:tasks.workunit.client.0.vm04.stdout:9/831: unlink d2/d8/d22/d4f/l82 0 2026-03-10T06:23:12.684 INFO:tasks.workunit.client.0.vm04.stdout:5/746: truncate d4/d11/d7d/d38/d91/d4c/d98/ffc 371455 0 2026-03-10T06:23:12.685 INFO:tasks.workunit.client.0.vm04.stdout:0/837: symlink d0/d1a/d20/df5/d79/d111/l117 0 2026-03-10T06:23:12.685 INFO:tasks.workunit.client.0.vm04.stdout:1/763: creat d0/d3/f11b x:0 0 0 2026-03-10T06:23:12.686 INFO:tasks.workunit.client.0.vm04.stdout:9/832: mkdir d2/d3/d18/de9/d116/d11d/d12c 0 2026-03-10T06:23:12.689 INFO:tasks.workunit.client.0.vm04.stdout:5/747: mknod d4/d11/d7d/d38/d91/d4c/d98/dc0/c100 0 2026-03-10T06:23:12.689 INFO:tasks.workunit.client.0.vm04.stdout:9/833: truncate d2/de0/d1d/f6a 3514129 0 2026-03-10T06:23:12.691 INFO:tasks.workunit.client.0.vm04.stdout:9/834: stat d2/d3/d18/ddd/fb1 0 2026-03-10T06:23:12.698 INFO:tasks.workunit.client.0.vm04.stdout:7/745: sync 2026-03-10T06:23:12.700 INFO:tasks.workunit.client.0.vm04.stdout:5/748: read d4/d11/d7d/d38/d91/d55/d72/fdd [1538287,2410] 0 2026-03-10T06:23:12.702 INFO:tasks.workunit.client.0.vm04.stdout:2/751: dread d1/db/d69/d74/d87/dcf/d8f/ddc/f8e [0,4194304] 0 2026-03-10T06:23:12.702 INFO:tasks.workunit.client.0.vm04.stdout:5/749: write d4/f26 [3072125,43982] 0 2026-03-10T06:23:12.702 INFO:tasks.workunit.client.0.vm04.stdout:0/838: dread d0/d5/fc5 [0,4194304] 0 2026-03-10T06:23:12.706 
INFO:tasks.workunit.client.0.vm04.stdout:4/786: dwrite d2/d32/d94/d99/fd6 [0,4194304] 0 2026-03-10T06:23:12.710 INFO:tasks.workunit.client.0.vm04.stdout:6/792: write d2/d43/d2d/d30/d1f/db6/fc5 [503126,38147] 0 2026-03-10T06:23:12.710 INFO:tasks.workunit.client.0.vm04.stdout:8/784: write df/d15/d2b/d81/d9a/fd2 [338119,47242] 0 2026-03-10T06:23:12.711 INFO:tasks.workunit.client.0.vm04.stdout:2/752: write d1/dae/d2c/d37/fe0 [441800,79902] 0 2026-03-10T06:23:12.717 INFO:tasks.workunit.client.0.vm04.stdout:2/753: chown d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f65 484773899 1 2026-03-10T06:23:12.719 INFO:tasks.workunit.client.0.vm04.stdout:5/750: mknod d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/c101 0 2026-03-10T06:23:12.728 INFO:tasks.workunit.client.0.vm04.stdout:2/754: write d1/db/f36 [4856006,75461] 0 2026-03-10T06:23:12.731 INFO:tasks.workunit.client.0.vm04.stdout:9/835: write d2/d3/d18/d34/f97 [3881310,27809] 0 2026-03-10T06:23:12.733 INFO:tasks.workunit.client.0.vm04.stdout:7/746: write d4/df/d12/d13/fac [477460,116602] 0 2026-03-10T06:23:12.736 INFO:tasks.workunit.client.0.vm04.stdout:0/839: dwrite d0/d1a/f101 [0,4194304] 0 2026-03-10T06:23:12.738 INFO:tasks.workunit.client.0.vm04.stdout:0/840: fsync d0/d1a/d20/d38/fdc 0 2026-03-10T06:23:12.739 INFO:tasks.workunit.client.0.vm04.stdout:3/759: dwrite d4/d6/d91/fad [0,4194304] 0 2026-03-10T06:23:12.746 INFO:tasks.workunit.client.0.vm04.stdout:8/785: truncate df/d20/f22 432679 0 2026-03-10T06:23:12.747 INFO:tasks.workunit.client.0.vm04.stdout:1/764: dread d0/d3/d41/d4b/d5b/f5c [0,4194304] 0 2026-03-10T06:23:12.750 INFO:tasks.workunit.client.0.vm04.stdout:7/747: dwrite d4/df/d12/d13/fac [0,4194304] 0 2026-03-10T06:23:12.754 INFO:tasks.workunit.client.0.vm04.stdout:4/787: dwrite d2/d46/fa5 [0,4194304] 0 2026-03-10T06:23:12.757 INFO:tasks.workunit.client.0.vm04.stdout:2/755: fsync d1/dae/d11/d14/d4e/f9d 0 2026-03-10T06:23:12.758 INFO:tasks.workunit.client.0.vm04.stdout:5/751: truncate d4/d11/d7d/d38/d91/d55/f68 360879 0 
2026-03-10T06:23:12.758 INFO:tasks.workunit.client.0.vm04.stdout:0/841: creat d0/d1a/d20/df5/d79/d111/f118 x:0 0 0 2026-03-10T06:23:12.758 INFO:tasks.workunit.client.0.vm04.stdout:1/765: readlink d0/d8/l2c 0 2026-03-10T06:23:12.763 INFO:tasks.workunit.client.0.vm04.stdout:1/766: truncate d0/d3/d41/d99/def/f118 660677 0 2026-03-10T06:23:12.763 INFO:tasks.workunit.client.0.vm04.stdout:2/756: dread d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f93 [0,4194304] 0 2026-03-10T06:23:12.773 INFO:tasks.workunit.client.0.vm04.stdout:6/793: dwrite d2/d3a/d5e/db5/fbe [0,4194304] 0 2026-03-10T06:23:12.773 INFO:tasks.workunit.client.0.vm04.stdout:6/794: readlink d2/l2a 0 2026-03-10T06:23:12.774 INFO:tasks.workunit.client.0.vm04.stdout:4/788: chown d2/d32/d5c/d76/dd7/d2c/d6b/dd1/ff0 23 1 2026-03-10T06:23:12.779 INFO:tasks.workunit.client.0.vm04.stdout:9/836: symlink d2/d3/d18/d39/d46/l12d 0 2026-03-10T06:23:12.779 INFO:tasks.workunit.client.0.vm04.stdout:8/786: rename df/d15/d2b/d8a to df/d15/d29/df8/d102 0 2026-03-10T06:23:12.791 INFO:tasks.workunit.client.0.vm04.stdout:4/789: dread d2/d46/f3d [0,4194304] 0 2026-03-10T06:23:12.797 INFO:tasks.workunit.client.0.vm04.stdout:3/760: creat d4/da/df/d11/d50/ffa x:0 0 0 2026-03-10T06:23:12.797 INFO:tasks.workunit.client.0.vm04.stdout:6/795: dread d2/d43/d2d/d30/d1f/d3c/f6a [0,4194304] 0 2026-03-10T06:23:12.797 INFO:tasks.workunit.client.0.vm04.stdout:3/761: stat d4/d6/d38/dcc/fed 0 2026-03-10T06:23:12.799 INFO:tasks.workunit.client.0.vm04.stdout:6/796: readlink d2/d43/d2d/d30/d1f/d3c/d75/l4e 0 2026-03-10T06:23:12.799 INFO:tasks.workunit.client.0.vm04.stdout:6/797: chown d2/d43/d86/fc4 186419 1 2026-03-10T06:23:12.803 INFO:tasks.workunit.client.0.vm04.stdout:2/757: unlink d1/dae/d2c/d37/d40/l6c 0 2026-03-10T06:23:12.807 INFO:tasks.workunit.client.0.vm04.stdout:0/842: rename d0/d1a/d20/df5/d47/l64 to d0/d1a/d20/dc2/l119 0 2026-03-10T06:23:12.812 INFO:tasks.workunit.client.0.vm04.stdout:8/787: mkdir df/d20/d25/d30/d55/de7/d103 0 
2026-03-10T06:23:12.819 INFO:tasks.workunit.client.0.vm04.stdout:9/837: dread - d2/d3/d18/d39/d11/da5/fd6 zero size 2026-03-10T06:23:12.825 INFO:tasks.workunit.client.0.vm04.stdout:5/752: creat d4/d11/d7d/d38/d91/d4c/d98/dc0/dde/f102 x:0 0 0 2026-03-10T06:23:12.835 INFO:tasks.workunit.client.0.vm04.stdout:4/790: rename d2/d32/d5c/d76/dd7/d31/d3f/da1 to d2/d32/d5c/d76/dd7/d31/d3f/dc8/d100 0 2026-03-10T06:23:12.842 INFO:tasks.workunit.client.0.vm04.stdout:0/843: readlink d0/d1a/d4d/l65 0 2026-03-10T06:23:12.844 INFO:tasks.workunit.client.0.vm04.stdout:4/791: dwrite d2/d32/d5c/d76/dd7/d31/d3f/d93/fb4 [0,4194304] 0 2026-03-10T06:23:12.863 INFO:tasks.workunit.client.0.vm04.stdout:8/788: mknod df/d15/d29/da3/db8/c104 0 2026-03-10T06:23:12.865 INFO:tasks.workunit.client.0.vm04.stdout:7/748: link d4/cd d4/df/d12/d13/db3/c112 0 2026-03-10T06:23:12.874 INFO:tasks.workunit.client.0.vm04.stdout:5/753: creat d4/d11/d7d/d52/f103 x:0 0 0 2026-03-10T06:23:12.882 INFO:tasks.workunit.client.0.vm04.stdout:8/789: fsync df/d20/d25/d30/d65/f9f 0 2026-03-10T06:23:12.882 INFO:tasks.workunit.client.0.vm04.stdout:3/762: write d4/fbe [23509,73514] 0 2026-03-10T06:23:12.887 INFO:tasks.workunit.client.0.vm04.stdout:7/749: dread - d4/df/d12/d13/fa6 zero size 2026-03-10T06:23:12.890 INFO:tasks.workunit.client.0.vm04.stdout:9/838: creat d2/d3/d18/de9/d5a/d92/f12e x:0 0 0 2026-03-10T06:23:12.894 INFO:tasks.workunit.client.0.vm04.stdout:5/754: mknod d4/d3b/c104 0 2026-03-10T06:23:12.895 INFO:tasks.workunit.client.0.vm04.stdout:1/767: getdents d0/d8/d46/d7a/d95/dc5/dcc 0 2026-03-10T06:23:12.898 INFO:tasks.workunit.client.0.vm04.stdout:1/768: write d0/d3/d41/f75 [584080,77298] 0 2026-03-10T06:23:12.898 INFO:tasks.workunit.client.0.vm04.stdout:7/750: dwrite d4/df/d12/f7f [0,4194304] 0 2026-03-10T06:23:12.901 INFO:tasks.workunit.client.0.vm04.stdout:4/792: getdents d2/d32/d5c/d76/dd7/d2c/d9a/ded 0 2026-03-10T06:23:12.912 INFO:tasks.workunit.client.0.vm04.stdout:7/751: dwrite d4/df/d12/d13/db3/d110/f10c 
[0,4194304] 0 2026-03-10T06:23:12.918 INFO:tasks.workunit.client.0.vm04.stdout:3/763: sync 2026-03-10T06:23:12.919 INFO:tasks.workunit.client.0.vm04.stdout:7/752: write d4/df/f60 [2800334,23190] 0 2026-03-10T06:23:12.927 INFO:tasks.workunit.client.0.vm04.stdout:2/758: getdents d1/db 0 2026-03-10T06:23:12.939 INFO:tasks.workunit.client.0.vm04.stdout:6/798: rename d2/d8/d78 to d2/d43/d2d/d30/d34/d108 0 2026-03-10T06:23:12.940 INFO:tasks.workunit.client.0.vm04.stdout:8/790: dread df/d20/d25/d30/d65/f80 [0,4194304] 0 2026-03-10T06:23:12.942 INFO:tasks.workunit.client.0.vm04.stdout:5/755: creat d4/d11/d7d/dab/f105 x:0 0 0 2026-03-10T06:23:12.945 INFO:tasks.workunit.client.0.vm04.stdout:4/793: unlink d2/d32/d5c/d76/dd7/d2c/ff7 0 2026-03-10T06:23:12.945 INFO:tasks.workunit.client.0.vm04.stdout:1/769: creat d0/d8/d46/db3/dd2/f11c x:0 0 0 2026-03-10T06:23:12.945 INFO:tasks.workunit.client.0.vm04.stdout:3/764: rmdir d4/da/df/d11/d5a/d5b/ddf/d21 39 2026-03-10T06:23:12.945 INFO:tasks.workunit.client.0.vm04.stdout:7/753: mknod d4/df/d12/d34/d63/c113 0 2026-03-10T06:23:12.946 INFO:tasks.workunit.client.0.vm04.stdout:7/754: fsync d4/df/d12/dd4/f7c 0 2026-03-10T06:23:12.946 INFO:tasks.workunit.client.0.vm04.stdout:7/755: readlink d4/df/d12/d34/dbd/lc8 0 2026-03-10T06:23:12.947 INFO:tasks.workunit.client.0.vm04.stdout:7/756: dread - d4/df/d12/d13/d25/dcb/fd6 zero size 2026-03-10T06:23:12.958 INFO:tasks.workunit.client.0.vm04.stdout:2/759: creat d1/db/d9b/fea x:0 0 0 2026-03-10T06:23:13.000 INFO:tasks.workunit.client.0.vm04.stdout:9/839: rename d2/d3/d18/fc1 to d2/d3/d18/de9/da9/f12f 0 2026-03-10T06:23:13.002 INFO:tasks.workunit.client.0.vm04.stdout:8/791: write df/d20/d25/d30/f79 [2836345,75655] 0 2026-03-10T06:23:13.004 INFO:tasks.workunit.client.0.vm04.stdout:0/844: link d0/d1a/d4d/c87 d0/d5/d97/dc0/dd8/c11a 0 2026-03-10T06:23:13.006 INFO:tasks.workunit.client.0.vm04.stdout:4/794: mkdir d2/d32/d94/d99/d101 0 2026-03-10T06:23:13.014 INFO:tasks.workunit.client.0.vm04.stdout:2/760: 
creat d1/dae/d11/d14/d9f/ddb/d94/feb x:0 0 0 2026-03-10T06:23:13.014 INFO:tasks.workunit.client.0.vm04.stdout:2/761: chown d1/f91 2000242 1 2026-03-10T06:23:13.016 INFO:tasks.workunit.client.0.vm04.stdout:5/756: rename d4/d6/d80/dd9 to d4/d11/d7d/dab/d106 0 2026-03-10T06:23:13.018 INFO:tasks.workunit.client.0.vm04.stdout:7/757: mkdir d4/df/d12/d13/d25/d30/d40/d50/df6/d114 0 2026-03-10T06:23:13.020 INFO:tasks.workunit.client.0.vm04.stdout:7/758: truncate d4/df/d12/d13/db3/d110/d9c/db1/ff7 626880 0 2026-03-10T06:23:13.021 INFO:tasks.workunit.client.0.vm04.stdout:8/792: write df/d20/d25/d30/d65/d8f/fc9 [783950,115510] 0 2026-03-10T06:23:13.024 INFO:tasks.workunit.client.0.vm04.stdout:2/762: rmdir d1/dae/d11/d14/d4e 39 2026-03-10T06:23:13.025 INFO:tasks.workunit.client.0.vm04.stdout:2/763: read - d1/db/d69/d74/d87/dcf/fc1 zero size 2026-03-10T06:23:13.027 INFO:tasks.workunit.client.0.vm04.stdout:1/770: truncate d0/f23 1000738 0 2026-03-10T06:23:13.028 INFO:tasks.workunit.client.0.vm04.stdout:6/799: creat d2/d3a/f109 x:0 0 0 2026-03-10T06:23:13.030 INFO:tasks.workunit.client.0.vm04.stdout:5/757: write d4/d11/d7d/d38/d91/d55/fbd [417912,79650] 0 2026-03-10T06:23:13.031 INFO:tasks.workunit.client.0.vm04.stdout:5/758: write d4/f26 [144190,77028] 0 2026-03-10T06:23:13.032 INFO:tasks.workunit.client.0.vm04.stdout:5/759: dread - d4/d11/d7d/d38/d91/d4c/d98/dc0/dde/f102 zero size 2026-03-10T06:23:13.045 INFO:tasks.workunit.client.0.vm04.stdout:0/845: mknod d0/d5/d25/dd/d3a/c11b 0 2026-03-10T06:23:13.045 INFO:tasks.workunit.client.0.vm04.stdout:4/795: mkdir d2/d32/d5c/de2/d102 0 2026-03-10T06:23:13.046 INFO:tasks.workunit.client.0.vm04.stdout:4/796: write d2/f47 [1479824,77802] 0 2026-03-10T06:23:13.054 INFO:tasks.workunit.client.0.vm04.stdout:8/793: chown df/c58 4357054 1 2026-03-10T06:23:13.054 INFO:tasks.workunit.client.0.vm04.stdout:8/794: stat df/d20/d25/l68 0 2026-03-10T06:23:13.057 INFO:tasks.workunit.client.0.vm04.stdout:2/764: rmdir d1/dae/d11/d14/d9f/ddb/d94/de5 39 
2026-03-10T06:23:13.058 INFO:tasks.workunit.client.0.vm04.stdout:2/765: read d1/fa5 [2762288,23592] 0 2026-03-10T06:23:13.058 INFO:tasks.workunit.client.0.vm04.stdout:1/771: symlink d0/d8/d46/dcf/l11d 0 2026-03-10T06:23:13.060 INFO:tasks.workunit.client.0.vm04.stdout:9/840: rename d2/d8/d22/d87 to d2/d3/d18/d39/d11/da5/df5/d130 0 2026-03-10T06:23:13.071 INFO:tasks.workunit.client.0.vm04.stdout:0/846: dwrite d0/d1a/d20/df5/d47/f7b [0,4194304] 0 2026-03-10T06:23:13.073 INFO:tasks.workunit.client.0.vm04.stdout:4/797: creat d2/d32/d5c/d76/dd7/d31/d42/db9/f103 x:0 0 0 2026-03-10T06:23:13.073 INFO:tasks.workunit.client.0.vm04.stdout:7/759: unlink d4/df/c86 0 2026-03-10T06:23:13.074 INFO:tasks.workunit.client.0.vm04.stdout:7/760: chown d4/df/d12/f4c 1778425 1 2026-03-10T06:23:13.074 INFO:tasks.workunit.client.0.vm04.stdout:8/795: fdatasync df/d20/d25/d73/fbd 0 2026-03-10T06:23:13.075 INFO:tasks.workunit.client.0.vm04.stdout:3/765: link d4/da/df/lb5 d4/d6/d92/def/lfb 0 2026-03-10T06:23:13.082 INFO:tasks.workunit.client.0.vm04.stdout:6/800: rename d2/d43/d86/la2 to d2/d43/d2d/d30/d1f/l10a 0 2026-03-10T06:23:13.085 INFO:tasks.workunit.client.0.vm04.stdout:6/801: dwrite d2/d43/f35 [4194304,4194304] 0 2026-03-10T06:23:13.085 INFO:tasks.workunit.client.0.vm04.stdout:6/802: readlink d2/d43/d2d/d7c/daa/lac 0 2026-03-10T06:23:13.087 INFO:tasks.workunit.client.0.vm04.stdout:0/847: rmdir d0/d5/d25/dd/d92 39 2026-03-10T06:23:13.091 INFO:tasks.workunit.client.0.vm04.stdout:0/848: fdatasync d0/d5/d25/f23 0 2026-03-10T06:23:13.102 INFO:tasks.workunit.client.0.vm04.stdout:7/761: chown d4/df/d12/d21/c32 518233508 1 2026-03-10T06:23:13.104 INFO:tasks.workunit.client.0.vm04.stdout:7/762: read d4/df/d12/d13/db3/d110/d9c/fe8 [3893094,85890] 0 2026-03-10T06:23:13.111 INFO:tasks.workunit.client.0.vm04.stdout:6/803: unlink d2/d43/d86/fc4 0 2026-03-10T06:23:13.115 INFO:tasks.workunit.client.0.vm04.stdout:2/766: write d1/f91 [142588,57389] 0 2026-03-10T06:23:13.118 
INFO:tasks.workunit.client.0.vm04.stdout:8/796: fsync df/d15/d29/fca 0 2026-03-10T06:23:13.121 INFO:tasks.workunit.client.0.vm04.stdout:1/772: write d0/d8/f11 [923292,383] 0 2026-03-10T06:23:13.124 INFO:tasks.workunit.client.0.vm04.stdout:4/798: dwrite d2/f14 [4194304,4194304] 0 2026-03-10T06:23:13.126 INFO:tasks.workunit.client.0.vm04.stdout:0/849: sync 2026-03-10T06:23:13.127 INFO:tasks.workunit.client.0.vm04.stdout:2/767: sync 2026-03-10T06:23:13.127 INFO:tasks.workunit.client.0.vm04.stdout:7/763: sync 2026-03-10T06:23:13.129 INFO:tasks.workunit.client.0.vm04.stdout:7/764: truncate d4/df/d12/d13/d25/d30/ff4 43557 0 2026-03-10T06:23:13.131 INFO:tasks.workunit.client.0.vm04.stdout:7/765: sync 2026-03-10T06:23:13.141 INFO:tasks.workunit.client.0.vm04.stdout:8/797: readlink df/d20/d25/d30/lfe 0 2026-03-10T06:23:13.144 INFO:tasks.workunit.client.0.vm04.stdout:3/766: creat d4/ffc x:0 0 0 2026-03-10T06:23:13.147 INFO:tasks.workunit.client.0.vm04.stdout:3/767: dread - d4/d6/d38/f78 zero size 2026-03-10T06:23:13.156 INFO:tasks.workunit.client.0.vm04.stdout:5/760: rename d4/d11/d7d/d38/d91/d4c/d98/le0 to d4/d11/d7d/d38/d91/d4c/def/l107 0 2026-03-10T06:23:13.160 INFO:tasks.workunit.client.0.vm04.stdout:1/773: symlink d0/d8/d46/d7a/d95/df3/l11e 0 2026-03-10T06:23:13.160 INFO:tasks.workunit.client.0.vm04.stdout:4/799: mknod d2/d32/d5c/d76/c104 0 2026-03-10T06:23:13.162 INFO:tasks.workunit.client.0.vm04.stdout:0/850: fsync d0/d5/d25/dd/d5c/f7a 0 2026-03-10T06:23:13.164 INFO:tasks.workunit.client.0.vm04.stdout:5/761: write d4/d11/d7d/d52/ff9 [911266,128021] 0 2026-03-10T06:23:13.164 INFO:tasks.workunit.client.0.vm04.stdout:2/768: rmdir d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d 39 2026-03-10T06:23:13.167 INFO:tasks.workunit.client.0.vm04.stdout:3/768: dwrite d4/fbe [0,4194304] 0 2026-03-10T06:23:13.176 INFO:tasks.workunit.client.0.vm04.stdout:6/804: truncate d2/d43/f31 959951 0 2026-03-10T06:23:13.180 INFO:tasks.workunit.client.0.vm04.stdout:2/769: read d1/db/fe [3893910,89245] 0 
2026-03-10T06:23:13.180 INFO:tasks.workunit.client.0.vm04.stdout:5/762: fdatasync d4/d11/d7d/d38/d91/d55/fbd 0 2026-03-10T06:23:13.180 INFO:tasks.workunit.client.0.vm04.stdout:9/841: rename d2/d3/f57 to d2/d3/d18/de9/d116/f131 0 2026-03-10T06:23:13.186 INFO:tasks.workunit.client.0.vm04.stdout:1/774: unlink d0/d3/d41/cc3 0 2026-03-10T06:23:13.187 INFO:tasks.workunit.client.0.vm04.stdout:6/805: dread d2/d3a/d9c/fba [0,4194304] 0 2026-03-10T06:23:13.191 INFO:tasks.workunit.client.0.vm04.stdout:6/806: write d2/d8/fc3 [1356195,84330] 0 2026-03-10T06:23:13.195 INFO:tasks.workunit.client.0.vm04.stdout:6/807: stat d2/d43/d2d/d30/d1f/fd8 0 2026-03-10T06:23:13.197 INFO:tasks.workunit.client.0.vm04.stdout:9/842: dread d2/d8/d22/daa/ff9 [0,4194304] 0 2026-03-10T06:23:13.197 INFO:tasks.workunit.client.0.vm04.stdout:7/766: dread d4/df/d12/d21/fd9 [0,4194304] 0 2026-03-10T06:23:13.214 INFO:tasks.workunit.client.0.vm04.stdout:8/798: dwrite df/d15/d2b/d81/d9a/dbe/fcb [0,4194304] 0 2026-03-10T06:23:13.221 INFO:tasks.workunit.client.0.vm04.stdout:0/851: rename d0/d1a/d20/df5 to d0/d1a/d20/df5/d11c 22 2026-03-10T06:23:13.229 INFO:tasks.workunit.client.0.vm04.stdout:2/770: dread d1/dae/d2c/f4a [0,4194304] 0 2026-03-10T06:23:13.230 INFO:tasks.workunit.client.0.vm04.stdout:2/771: write d1/dae/d2c/d37/fe0 [1262296,84885] 0 2026-03-10T06:23:13.232 INFO:tasks.workunit.client.0.vm04.stdout:5/763: mkdir d4/d11/d7d/d38/d91/d4c/def/ddc/d108 0 2026-03-10T06:23:13.235 INFO:tasks.workunit.client.0.vm04.stdout:6/808: mknod d2/d3a/d9c/c10b 0 2026-03-10T06:23:13.246 INFO:tasks.workunit.client.0.vm04.stdout:9/843: dread d2/d23/f31 [8388608,4194304] 0 2026-03-10T06:23:13.246 INFO:tasks.workunit.client.0.vm04.stdout:9/844: chown d2/d3/d18/d39/d11/cc6 5261894 1 2026-03-10T06:23:13.251 INFO:tasks.workunit.client.0.vm04.stdout:3/769: unlink d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/f95 0 2026-03-10T06:23:13.258 INFO:tasks.workunit.client.0.vm04.stdout:8/799: mkdir df/d15/d29/df8/d102/d105 0 
2026-03-10T06:23:13.260 INFO:tasks.workunit.client.0.vm04.stdout:8/800: read df/d20/d25/fe0 [2228963,119830] 0 2026-03-10T06:23:13.262 INFO:tasks.workunit.client.0.vm04.stdout:8/801: readlink df/d20/d25/l9b 0 2026-03-10T06:23:13.266 INFO:tasks.workunit.client.0.vm04.stdout:2/772: unlink d1/db/d69/d74/d87/dcf/fa1 0 2026-03-10T06:23:13.266 INFO:tasks.workunit.client.0.vm04.stdout:8/802: chown df/d20/d25/d30/f79 7138394 1 2026-03-10T06:23:13.266 INFO:tasks.workunit.client.0.vm04.stdout:8/803: chown df/d20/d25/d73/fbd 14813 1 2026-03-10T06:23:13.267 INFO:tasks.workunit.client.0.vm04.stdout:8/804: stat df/d15/l3d 0 2026-03-10T06:23:13.268 INFO:tasks.workunit.client.0.vm04.stdout:1/775: symlink d0/d8/d46/l11f 0 2026-03-10T06:23:13.270 INFO:tasks.workunit.client.0.vm04.stdout:1/776: chown d0/d3/d41/dc2 28569372 1 2026-03-10T06:23:13.271 INFO:tasks.workunit.client.0.vm04.stdout:1/777: chown d0/d3/f33 9 1 2026-03-10T06:23:13.272 INFO:tasks.workunit.client.0.vm04.stdout:6/809: mknod d2/d43/d2d/d7c/daa/c10c 0 2026-03-10T06:23:13.274 INFO:tasks.workunit.client.0.vm04.stdout:2/773: dwrite d1/dae/d2c/d37/fb7 [0,4194304] 0 2026-03-10T06:23:13.293 INFO:tasks.workunit.client.0.vm04.stdout:4/800: creat d2/d32/d5c/d4f/f105 x:0 0 0 2026-03-10T06:23:13.304 INFO:tasks.workunit.client.0.vm04.stdout:9/845: symlink d2/d8/d3a/dcb/l132 0 2026-03-10T06:23:13.304 INFO:tasks.workunit.client.0.vm04.stdout:0/852: creat d0/d5/d25/df1/f11d x:0 0 0 2026-03-10T06:23:13.316 INFO:tasks.workunit.client.0.vm04.stdout:4/801: sync 2026-03-10T06:23:13.320 INFO:tasks.workunit.client.0.vm04.stdout:3/770: write d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fc2 [1840883,45642] 0 2026-03-10T06:23:13.326 INFO:tasks.workunit.client.0.vm04.stdout:8/805: write df/d20/d25/d30/f4e [1558703,91506] 0 2026-03-10T06:23:13.372 INFO:tasks.workunit.client.0.vm04.stdout:0/853: mkdir d0/d1a/d20/df5/d47/d11e 0 2026-03-10T06:23:13.372 INFO:tasks.workunit.client.0.vm04.stdout:9/846: dread d2/d8/f4a [0,4194304] 0 
2026-03-10T06:23:13.373 INFO:tasks.workunit.client.0.vm04.stdout:3/771: readlink d4/da/la6 0 2026-03-10T06:23:13.377 INFO:tasks.workunit.client.0.vm04.stdout:6/810: dread d2/d43/f24 [0,4194304] 0 2026-03-10T06:23:13.379 INFO:tasks.workunit.client.0.vm04.stdout:5/764: getdents d4/d11/d7d/d52 0 2026-03-10T06:23:13.380 INFO:tasks.workunit.client.0.vm04.stdout:4/802: link d2/d32/d5c/d76/dd7/d31/d42/db9/fe1 d2/d32/dad/f106 0 2026-03-10T06:23:13.381 INFO:tasks.workunit.client.0.vm04.stdout:8/806: fsync df/d15/d2b/f56 0 2026-03-10T06:23:13.382 INFO:tasks.workunit.client.0.vm04.stdout:7/767: link d4/df/d12/d13/l17 d4/df/d12/d13/d25/d28/l115 0 2026-03-10T06:23:13.383 INFO:tasks.workunit.client.0.vm04.stdout:7/768: chown d4/df/d12/d34/d63/dd3 248 1 2026-03-10T06:23:13.386 INFO:tasks.workunit.client.0.vm04.stdout:1/778: rename d0/d8/f6c to d0/f120 0 2026-03-10T06:23:13.401 INFO:tasks.workunit.client.0.vm04.stdout:2/774: write d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f93 [627719,18668] 0 2026-03-10T06:23:13.402 INFO:tasks.workunit.client.0.vm04.stdout:9/847: write d2/d3/d18/d34/fe2 [56253,19433] 0 2026-03-10T06:23:13.404 INFO:tasks.workunit.client.0.vm04.stdout:3/772: write d4/da/df/d11/d5a/db3/fbb [488863,57407] 0 2026-03-10T06:23:13.408 INFO:tasks.workunit.client.0.vm04.stdout:4/803: write d2/d32/d5c/d76/dd7/d2c/f9b [222847,58826] 0 2026-03-10T06:23:13.409 INFO:tasks.workunit.client.0.vm04.stdout:4/804: chown d2/d32/d5c/de2/d102 1790470683 1 2026-03-10T06:23:13.410 INFO:tasks.workunit.client.0.vm04.stdout:5/765: dwrite d4/d3b/f41 [0,4194304] 0 2026-03-10T06:23:13.416 INFO:tasks.workunit.client.0.vm04.stdout:7/769: mknod d4/df/d12/d13/d25/d28/d3a/d100/d106/c116 0 2026-03-10T06:23:13.417 INFO:tasks.workunit.client.0.vm04.stdout:3/773: mknod d4/d6/d91/da1/cfd 0 2026-03-10T06:23:13.419 INFO:tasks.workunit.client.0.vm04.stdout:0/854: truncate d0/d5/d25/dd/d92/ffe 2928936 0 2026-03-10T06:23:13.422 INFO:tasks.workunit.client.0.vm04.stdout:1/779: mkdir d0/d8/d46/d7a/d95/dc5/d121 0 
2026-03-10T06:23:13.422 INFO:tasks.workunit.client.0.vm04.stdout:4/805: fdatasync d2/d32/d5c/d76/dd7/d31/d3f/d93/f9c 0 2026-03-10T06:23:13.426 INFO:tasks.workunit.client.0.vm04.stdout:9/848: unlink d2/d3/d18/d39/d46/c106 0 2026-03-10T06:23:13.427 INFO:tasks.workunit.client.0.vm04.stdout:6/811: rename d2/d37/d83/c98 to d2/d43/d2d/d30/d34/c10d 0 2026-03-10T06:23:13.428 INFO:tasks.workunit.client.0.vm04.stdout:6/812: chown d2/d43/d2d/d30/d1f/db6/fc5 97406209 1 2026-03-10T06:23:13.431 INFO:tasks.workunit.client.0.vm04.stdout:5/766: fsync d4/d11/d7d/d38/d91/f5e 0 2026-03-10T06:23:13.433 INFO:tasks.workunit.client.0.vm04.stdout:8/807: write df/d15/d2b/f7e [1103412,12336] 0 2026-03-10T06:23:13.438 INFO:tasks.workunit.client.0.vm04.stdout:2/775: link d1/dae/d11/l7b d1/dae/d2c/d37/d40/lec 0 2026-03-10T06:23:13.441 INFO:tasks.workunit.client.0.vm04.stdout:9/849: dread d2/d8/d3a/dcb/fe6 [0,4194304] 0 2026-03-10T06:23:13.444 INFO:tasks.workunit.client.0.vm04.stdout:4/806: sync 2026-03-10T06:23:13.452 INFO:tasks.workunit.client.0.vm04.stdout:7/770: dwrite d4/df/d12/d13/fb5 [0,4194304] 0 2026-03-10T06:23:13.452 INFO:tasks.workunit.client.0.vm04.stdout:3/774: truncate d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/fe9 232244 0 2026-03-10T06:23:13.456 INFO:tasks.workunit.client.0.vm04.stdout:6/813: rmdir d2/d43/d2d/d30/d1f/d3c/dfa 39 2026-03-10T06:23:13.464 INFO:tasks.workunit.client.0.vm04.stdout:8/808: rename df/d20/d25/d30/d65/cb5 to df/d20/d25/d30/dc5/c106 0 2026-03-10T06:23:13.471 INFO:tasks.workunit.client.0.vm04.stdout:8/809: chown df/d15/d2b/f4a 4696 1 2026-03-10T06:23:13.471 INFO:tasks.workunit.client.0.vm04.stdout:8/810: chown df/d15/d29/f7a 56680 1 2026-03-10T06:23:13.471 INFO:tasks.workunit.client.0.vm04.stdout:0/855: link d0/d5/d25/dd/d92/f10c d0/d1a/d20/df5/f11f 0 2026-03-10T06:23:13.473 INFO:tasks.workunit.client.0.vm04.stdout:4/807: mkdir d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107 0 2026-03-10T06:23:13.478 INFO:tasks.workunit.client.0.vm04.stdout:3/775: creat 
d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/ffe x:0 0 0 2026-03-10T06:23:13.479 INFO:tasks.workunit.client.0.vm04.stdout:3/776: write d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/fd6 [1172362,97131] 0 2026-03-10T06:23:13.487 INFO:tasks.workunit.client.0.vm04.stdout:5/767: symlink d4/l109 0 2026-03-10T06:23:13.489 INFO:tasks.workunit.client.0.vm04.stdout:8/811: mknod df/d15/d29/df8/c107 0 2026-03-10T06:23:13.491 INFO:tasks.workunit.client.0.vm04.stdout:1/780: creat d0/d8/d46/d7a/d95/dc5/f122 x:0 0 0 2026-03-10T06:23:13.493 INFO:tasks.workunit.client.0.vm04.stdout:2/776: unlink d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/c7d 0 2026-03-10T06:23:13.506 INFO:tasks.workunit.client.0.vm04.stdout:4/808: rename d2/d32/dad to d2/d32/d5c/d76/dd7/d2c/d6b/d108 0 2026-03-10T06:23:13.511 INFO:tasks.workunit.client.0.vm04.stdout:7/771: mknod d4/df/d12/d13/d25/c117 0 2026-03-10T06:23:13.516 INFO:tasks.workunit.client.0.vm04.stdout:1/781: fsync d0/d8/d46/f57 0 2026-03-10T06:23:13.516 INFO:tasks.workunit.client.0.vm04.stdout:2/777: fdatasync d1/fa5 0 2026-03-10T06:23:13.516 INFO:tasks.workunit.client.0.vm04.stdout:3/777: chown d4/da/df/d11/d50/fa9 73468 1 2026-03-10T06:23:13.516 INFO:tasks.workunit.client.0.vm04.stdout:9/850: link d2/d8/f4a d2/d8/d3a/dcb/f133 0 2026-03-10T06:23:13.516 INFO:tasks.workunit.client.0.vm04.stdout:0/856: creat d0/d1a/d20/df5/d47/d11e/f120 x:0 0 0 2026-03-10T06:23:13.528 INFO:tasks.workunit.client.0.vm04.stdout:5/768: link d4/d6/c1c d4/d6/d80/c10a 0 2026-03-10T06:23:13.528 INFO:tasks.workunit.client.0.vm04.stdout:0/857: creat d0/d5/d25/dd/d92/f121 x:0 0 0 2026-03-10T06:23:13.531 INFO:tasks.workunit.client.0.vm04.stdout:4/809: dwrite d2/d32/d5c/d76/dd7/d31/d3f/dc8/d100/fe7 [0,4194304] 0 2026-03-10T06:23:13.533 INFO:tasks.workunit.client.0.vm04.stdout:9/851: creat d2/d3/d18/de9/d116/f134 x:0 0 0 2026-03-10T06:23:13.534 INFO:tasks.workunit.client.0.vm04.stdout:2/778: dread d1/db/d69/d74/d87/dcf/d8f/d48/f62 [0,4194304] 0 2026-03-10T06:23:13.536 
INFO:tasks.workunit.client.0.vm04.stdout:9/852: stat d2/d3/d18/de9/d5a/fee 0 2026-03-10T06:23:13.538 INFO:tasks.workunit.client.0.vm04.stdout:4/810: stat d2/d32/d5c/d76/dd7/d2c/d6b 0 2026-03-10T06:23:13.538 INFO:tasks.workunit.client.0.vm04.stdout:1/782: chown d0/d8/d46/db3/dd2/ce3 5 1 2026-03-10T06:23:13.539 INFO:tasks.workunit.client.0.vm04.stdout:6/814: rename d2/d3a/d5e/fa4 to d2/d3a/d5e/f10e 0 2026-03-10T06:23:13.553 INFO:tasks.workunit.client.0.vm04.stdout:5/769: truncate d4/d11/d7d/d38/d91/d55/db1/fb4 882340 0 2026-03-10T06:23:13.558 INFO:tasks.workunit.client.0.vm04.stdout:7/772: link d4/df/d12/dd4/l6d d4/df/d12/d13/d25/d30/d40/l118 0 2026-03-10T06:23:13.559 INFO:tasks.workunit.client.0.vm04.stdout:4/811: creat d2/d32/d5c/d76/dd7/d56/f109 x:0 0 0 2026-03-10T06:23:13.560 INFO:tasks.workunit.client.0.vm04.stdout:6/815: rename d2/d43/d2d/d30/d34/f6d to d2/d43/d2d/d30/d34/d76/d7e/ddc/f10f 0 2026-03-10T06:23:13.563 INFO:tasks.workunit.client.0.vm04.stdout:6/816: dread d2/d37/d6e/fdd [0,4194304] 0 2026-03-10T06:23:13.565 INFO:tasks.workunit.client.0.vm04.stdout:5/770: creat d4/d6/d37/f10b x:0 0 0 2026-03-10T06:23:13.565 INFO:tasks.workunit.client.0.vm04.stdout:8/812: dwrite df/d20/f64 [0,4194304] 0 2026-03-10T06:23:13.565 INFO:tasks.workunit.client.0.vm04.stdout:3/778: write d4/da/df/d11/d5a/d5b/ddf/f45 [2768633,126223] 0 2026-03-10T06:23:13.567 INFO:tasks.workunit.client.0.vm04.stdout:9/853: symlink d2/d8/d3a/l135 0 2026-03-10T06:23:13.570 INFO:tasks.workunit.client.0.vm04.stdout:0/858: dwrite d0/d5/d25/dd/d5c/d73/f61 [0,4194304] 0 2026-03-10T06:23:13.575 INFO:tasks.workunit.client.0.vm04.stdout:3/779: fsync d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fc2 0 2026-03-10T06:23:13.587 INFO:tasks.workunit.client.0.vm04.stdout:4/812: dwrite d2/d32/d94/d99/fd6 [0,4194304] 0 2026-03-10T06:23:13.590 INFO:tasks.workunit.client.0.vm04.stdout:4/813: chown d2/d46/c5b 311182 1 2026-03-10T06:23:13.595 INFO:tasks.workunit.client.0.vm04.stdout:8/813: dwrite 
df/d15/d29/da3/db8/dc1/d97/d67/fa4 [0,4194304] 0 2026-03-10T06:23:13.608 INFO:tasks.workunit.client.0.vm04.stdout:5/771: mknod d4/d11/d7d/d38/d91/d4c/d98/dc0/c10c 0 2026-03-10T06:23:13.608 INFO:tasks.workunit.client.0.vm04.stdout:1/783: rmdir d0/d8/d46/d7a/d95/df3/d10c 0 2026-03-10T06:23:13.609 INFO:tasks.workunit.client.0.vm04.stdout:2/779: creat d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/fed x:0 0 0 2026-03-10T06:23:13.609 INFO:tasks.workunit.client.0.vm04.stdout:4/814: rename d2/d32/d5c/d76/dd7/d2c/d6b/d108/f106 to d2/d32/d5c/d76/dd7/d2c/d6b/d108/f10a 0 2026-03-10T06:23:13.611 INFO:tasks.workunit.client.0.vm04.stdout:1/784: readlink d0/d8/d46/d7a/l90 0 2026-03-10T06:23:13.612 INFO:tasks.workunit.client.0.vm04.stdout:5/772: fsync d4/d11/d7d/d52/f8f 0 2026-03-10T06:23:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:13 vm06.local ceph-mon[58974]: pgmap v35: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 84 MiB/s wr, 196 op/s 2026-03-10T06:23:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:13 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:13 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:13.619 INFO:tasks.workunit.client.0.vm04.stdout:2/780: fdatasync d1/db/d69/d74/d87/dcf/d8f/d48/f62 0 2026-03-10T06:23:13.624 INFO:tasks.workunit.client.0.vm04.stdout:7/773: dread d4/df/f29 [0,4194304] 0 2026-03-10T06:23:13.625 INFO:tasks.workunit.client.0.vm04.stdout:3/780: sync 2026-03-10T06:23:13.627 INFO:tasks.workunit.client.0.vm04.stdout:4/815: dread d2/d46/f5d [4194304,4194304] 0 2026-03-10T06:23:13.630 
INFO:tasks.workunit.client.0.vm04.stdout:1/785: link d0/d8/d46/db3/cdd d0/d8/d46/de4/c123 0 2026-03-10T06:23:13.631 INFO:tasks.workunit.client.0.vm04.stdout:5/773: link d4/d11/d7d/d38/cac d4/d6/d80/d84/c10d 0 2026-03-10T06:23:13.631 INFO:tasks.workunit.client.0.vm04.stdout:5/774: chown d4/d11/d7d/d52/f103 128047 1 2026-03-10T06:23:13.632 INFO:tasks.workunit.client.0.vm04.stdout:6/817: dread d2/d3a/d9c/f102 [0,4194304] 0 2026-03-10T06:23:13.632 INFO:tasks.workunit.client.0.vm04.stdout:4/816: chown d2/d32/d5c/d76/dd7/d31/d42/db9/f65 66567453 1 2026-03-10T06:23:13.633 INFO:tasks.workunit.client.0.vm04.stdout:3/781: rename d4/d6/d54/dee to d4/da/df/d11/d5a/d5b/dff 0 2026-03-10T06:23:13.635 INFO:tasks.workunit.client.0.vm04.stdout:5/775: fdatasync d4/f69 0 2026-03-10T06:23:13.635 INFO:tasks.workunit.client.0.vm04.stdout:2/781: mknod d1/dae/d11/d14/d9f/ddb/d94/dbb/de8/cee 0 2026-03-10T06:23:13.636 INFO:tasks.workunit.client.0.vm04.stdout:7/774: fsync d4/df/d12/d13/d25/f95 0 2026-03-10T06:23:13.637 INFO:tasks.workunit.client.0.vm04.stdout:2/782: stat d1/dae/d11/f16 0 2026-03-10T06:23:13.638 INFO:tasks.workunit.client.0.vm04.stdout:2/783: write d1/db/f36 [8399813,27696] 0 2026-03-10T06:23:13.639 INFO:tasks.workunit.client.0.vm04.stdout:5/776: read - d4/d11/d7d/d38/d91/d55/fba zero size 2026-03-10T06:23:13.642 INFO:tasks.workunit.client.0.vm04.stdout:5/777: mknod d4/d6/d80/d84/d99/c10e 0 2026-03-10T06:23:13.644 INFO:tasks.workunit.client.0.vm04.stdout:2/784: mkdir d1/dae/d2c/d37/dca/def 0 2026-03-10T06:23:13.644 INFO:tasks.workunit.client.0.vm04.stdout:4/817: getdents d2/d32/d94 0 2026-03-10T06:23:13.648 INFO:tasks.workunit.client.0.vm04.stdout:2/785: link d1/db/d69/d74/d87/cc9 d1/dae/d2c/d37/d40/cf0 0 2026-03-10T06:23:13.654 INFO:tasks.workunit.client.0.vm04.stdout:2/786: symlink d1/dae/d11/d14/d4e/lf1 0 2026-03-10T06:23:13.657 INFO:tasks.workunit.client.0.vm04.stdout:9/854: dwrite d2/d3/d18/de9/da9/fc2 [0,4194304] 0 2026-03-10T06:23:13.657 
INFO:tasks.workunit.client.0.vm04.stdout:4/818: sync 2026-03-10T06:23:13.659 INFO:tasks.workunit.client.0.vm04.stdout:9/855: dread - d2/d8/d3a/f51 zero size 2026-03-10T06:23:13.661 INFO:tasks.workunit.client.0.vm04.stdout:4/819: write d2/d32/d5c/d76/dd7/d2c/f9b [4405871,16799] 0 2026-03-10T06:23:13.661 INFO:tasks.workunit.client.0.vm04.stdout:0/859: dwrite d0/d5/d25/dd/f13 [4194304,4194304] 0 2026-03-10T06:23:13.666 INFO:tasks.workunit.client.0.vm04.stdout:3/782: write d4/d6/d99/f80 [1036165,113428] 0 2026-03-10T06:23:13.669 INFO:tasks.workunit.client.0.vm04.stdout:7/775: dwrite d4/df/d12/d13/d25/dcb/fd6 [0,4194304] 0 2026-03-10T06:23:13.672 INFO:tasks.workunit.client.0.vm04.stdout:1/786: write d0/f49 [279233,8902] 0 2026-03-10T06:23:13.672 INFO:tasks.workunit.client.0.vm04.stdout:5/778: write d4/d6/d50/fdb [1027383,128518] 0 2026-03-10T06:23:13.676 INFO:tasks.workunit.client.0.vm04.stdout:6/818: dwrite d2/d43/d2d/d30/d1f/d3c/d75/f92 [0,4194304] 0 2026-03-10T06:23:13.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:13 vm04.local ceph-mon[51058]: pgmap v35: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 84 MiB/s wr, 196 op/s 2026-03-10T06:23:13.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:13.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:13 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:13.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:13 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:13.683 INFO:tasks.workunit.client.0.vm04.stdout:5/779: read d4/f26 [3060781,8933] 0 2026-03-10T06:23:13.685 INFO:tasks.workunit.client.0.vm04.stdout:2/787: chown d1/db/d69/d74/d87/dcf/d8f/d35/fde 320 1 2026-03-10T06:23:13.685 
INFO:tasks.workunit.client.0.vm04.stdout:8/814: dwrite df/d15/d29/f3a [0,4194304] 0 2026-03-10T06:23:13.709 INFO:tasks.workunit.client.0.vm04.stdout:4/820: chown d2/cbd 112332 1 2026-03-10T06:23:13.712 INFO:tasks.workunit.client.0.vm04.stdout:9/856: readlink d2/d3/d18/de9/da9/le4 0 2026-03-10T06:23:13.713 INFO:tasks.workunit.client.0.vm04.stdout:3/783: symlink d4/da/df/d11/d5a/d5b/ddf/dbd/l100 0 2026-03-10T06:23:13.713 INFO:tasks.workunit.client.0.vm04.stdout:4/821: dread - d2/d32/d5c/d76/dd7/d56/ffd zero size 2026-03-10T06:23:13.729 INFO:tasks.workunit.client.0.vm04.stdout:3/784: rename d4/d6/l74 to d4/d6/d91/l101 0 2026-03-10T06:23:13.729 INFO:tasks.workunit.client.0.vm04.stdout:5/780: mkdir d4/d6/d81/db6/d10f 0 2026-03-10T06:23:13.729 INFO:tasks.workunit.client.0.vm04.stdout:8/815: write df/f46 [4563182,20395] 0 2026-03-10T06:23:13.738 INFO:tasks.workunit.client.0.vm04.stdout:3/785: chown d4/da/df/d11/d62 5215 1 2026-03-10T06:23:13.748 INFO:tasks.workunit.client.0.vm04.stdout:0/860: getdents d0/d5/d25/dd/d5c/d73/d82 0 2026-03-10T06:23:13.748 INFO:tasks.workunit.client.0.vm04.stdout:2/788: dread d1/dae/d11/f16 [0,4194304] 0 2026-03-10T06:23:13.752 INFO:tasks.workunit.client.0.vm04.stdout:9/857: sync 2026-03-10T06:23:13.765 INFO:tasks.workunit.client.0.vm04.stdout:4/822: rename d2/d32/d5c/d76/dd7/d31/d3f to d2/d32/d10b 0 2026-03-10T06:23:13.771 INFO:tasks.workunit.client.0.vm04.stdout:6/819: dread d2/d43/d2d/d30/f39 [0,4194304] 0 2026-03-10T06:23:13.771 INFO:tasks.workunit.client.0.vm04.stdout:8/816: dread df/d15/f69 [0,4194304] 0 2026-03-10T06:23:13.776 INFO:tasks.workunit.client.0.vm04.stdout:7/776: dwrite d4/df/d12/d13/f1e [0,4194304] 0 2026-03-10T06:23:13.778 INFO:tasks.workunit.client.0.vm04.stdout:3/786: creat d4/deb/f102 x:0 0 0 2026-03-10T06:23:13.784 INFO:tasks.workunit.client.0.vm04.stdout:0/861: read d0/d5/d25/dd/d5c/d73/fae [145610,92948] 0 2026-03-10T06:23:13.784 INFO:tasks.workunit.client.0.vm04.stdout:1/787: dwrite d0/d3/d80/f91 [0,4194304] 0 
2026-03-10T06:23:13.785 INFO:tasks.workunit.client.0.vm04.stdout:0/862: chown d0/d5/d25/dd/d3a/l45 91 1 2026-03-10T06:23:13.790 INFO:tasks.workunit.client.0.vm04.stdout:2/789: fsync d1/dae/d11/d14/f9c 0 2026-03-10T06:23:13.794 INFO:tasks.workunit.client.0.vm04.stdout:2/790: write d1/dae/fe2 [223171,69574] 0 2026-03-10T06:23:13.806 INFO:tasks.workunit.client.0.vm04.stdout:2/791: sync 2026-03-10T06:23:13.807 INFO:tasks.workunit.client.0.vm04.stdout:2/792: fdatasync d1/db/d69/d74/d87/dcf/d8f/d35/d54/fe4 0 2026-03-10T06:23:13.819 INFO:tasks.workunit.client.0.vm04.stdout:5/781: write d4/d11/d7d/d38/d91/d55/db1/fb4 [198031,13296] 0 2026-03-10T06:23:13.824 INFO:tasks.workunit.client.0.vm04.stdout:5/782: dread - d4/d11/d7d/dab/f105 zero size 2026-03-10T06:23:13.868 INFO:tasks.workunit.client.0.vm04.stdout:1/788: mknod d0/d3/d41/dcb/c124 0 2026-03-10T06:23:13.872 INFO:tasks.workunit.client.0.vm04.stdout:1/789: fdatasync d0/d3/d41/d4b/d5b/f5c 0 2026-03-10T06:23:13.873 INFO:tasks.workunit.client.0.vm04.stdout:3/787: dread d4/d6/d38/fb8 [0,4194304] 0 2026-03-10T06:23:13.879 INFO:tasks.workunit.client.0.vm04.stdout:0/863: truncate d0/d5/d25/dd/d5c/d73/f4f 1270411 0 2026-03-10T06:23:13.882 INFO:tasks.workunit.client.0.vm04.stdout:0/864: chown d0/d5/d25/df1 31081794 1 2026-03-10T06:23:13.883 INFO:tasks.workunit.client.0.vm04.stdout:0/865: chown d0/dd1/lda 17 1 2026-03-10T06:23:13.914 INFO:tasks.workunit.client.0.vm04.stdout:8/817: unlink df/c13 0 2026-03-10T06:23:13.918 INFO:tasks.workunit.client.0.vm04.stdout:8/818: chown df/d20/d25/d30/f79 93 1 2026-03-10T06:23:13.934 INFO:tasks.workunit.client.0.vm04.stdout:7/777: getdents d4/df/d12/d34/d63/dd3 0 2026-03-10T06:23:13.943 INFO:tasks.workunit.client.0.vm04.stdout:6/820: truncate d2/d43/f69 467704 0 2026-03-10T06:23:13.943 INFO:tasks.workunit.client.0.vm04.stdout:1/790: mkdir d0/d8/d46/db3/d125 0 2026-03-10T06:23:13.955 INFO:tasks.workunit.client.0.vm04.stdout:9/858: dwrite d2/d3/d18/d39/d46/d55/dc3/f125 [0,4194304] 0 
2026-03-10T06:23:13.957 INFO:tasks.workunit.client.0.vm04.stdout:3/788: fdatasync d4/f2d 0 2026-03-10T06:23:13.963 INFO:tasks.workunit.client.0.vm04.stdout:2/793: truncate d1/db/d69/d74/d87/dcf/f86 664518 0 2026-03-10T06:23:13.964 INFO:tasks.workunit.client.0.vm04.stdout:2/794: stat d1/c3a 0 2026-03-10T06:23:13.969 INFO:tasks.workunit.client.0.vm04.stdout:0/866: symlink d0/d1a/d20/df5/d47/d8a/d8d/de1/l122 0 2026-03-10T06:23:13.971 INFO:tasks.workunit.client.0.vm04.stdout:5/783: mkdir d4/d11/d7d/d38/d110 0 2026-03-10T06:23:13.971 INFO:tasks.workunit.client.0.vm04.stdout:4/823: link d2/d32/d5c/d76/c104 d2/d32/d5c/de2/c10c 0 2026-03-10T06:23:13.978 INFO:tasks.workunit.client.0.vm04.stdout:8/819: mknod df/d15/d29/da3/c108 0 2026-03-10T06:23:13.978 INFO:tasks.workunit.client.0.vm04.stdout:7/778: symlink d4/df/d12/d13/db3/d110/l119 0 2026-03-10T06:23:13.979 INFO:tasks.workunit.client.0.vm04.stdout:8/820: chown df/d15/d29/da3/db8/dc1/d97/d67/le2 0 1 2026-03-10T06:23:13.982 INFO:tasks.workunit.client.0.vm04.stdout:2/795: dread d1/dae/d11/d14/d4e/f5c [0,4194304] 0 2026-03-10T06:23:13.985 INFO:tasks.workunit.client.0.vm04.stdout:6/821: dwrite d2/d43/d2d/d30/d1f/d3c/f6a [0,4194304] 0 2026-03-10T06:23:13.990 INFO:tasks.workunit.client.0.vm04.stdout:5/784: fdatasync d4/d3b/fa0 0 2026-03-10T06:23:13.997 INFO:tasks.workunit.client.0.vm04.stdout:4/824: rename d2/d32/d94/d99/fd6 to d2/d32/d5c/d76/f10d 0 2026-03-10T06:23:14.002 INFO:tasks.workunit.client.0.vm04.stdout:7/779: creat d4/df/d12/d13/db3/d110/d9c/db1/dc4/f11a x:0 0 0 2026-03-10T06:23:14.003 INFO:tasks.workunit.client.0.vm04.stdout:2/796: fdatasync d1/dae/d11/d14/f9c 0 2026-03-10T06:23:14.009 INFO:tasks.workunit.client.0.vm04.stdout:3/789: creat d4/d6/d92/def/f103 x:0 0 0 2026-03-10T06:23:14.026 INFO:tasks.workunit.client.0.vm04.stdout:4/825: dread d2/d32/d5c/d76/dd7/d2c/d6b/f96 [0,4194304] 0 2026-03-10T06:23:14.028 INFO:tasks.workunit.client.0.vm04.stdout:9/859: truncate d2/de0/d1d/f78 706049 0 2026-03-10T06:23:14.033 
INFO:tasks.workunit.client.0.vm04.stdout:6/822: symlink d2/d43/d2d/d30/d1f/d3c/d85/dbf/l110 0
2026-03-10T06:23:14.033 INFO:tasks.workunit.client.0.vm04.stdout:6/823: chown d2/d43/d2d/d30/l8d 1511072 1
2026-03-10T06:23:14.040 INFO:tasks.workunit.client.0.vm04.stdout:5/785: creat d4/d6/d81/db6/f111 x:0 0 0
2026-03-10T06:23:14.043 INFO:tasks.workunit.client.0.vm04.stdout:1/791: write d0/d8/ff4 [926821,14537] 0
2026-03-10T06:23:14.048 INFO:tasks.workunit.client.0.vm04.stdout:1/792: chown d0/d8/d46/d7a/d95/dc5 107678445 1
2026-03-10T06:23:14.049 INFO:tasks.workunit.client.0.vm04.stdout:7/780: chown d4/df/d12/d21/l8c 49956 1
2026-03-10T06:23:14.051 INFO:tasks.workunit.client.0.vm04.stdout:8/821: dwrite df/d15/d29/f6f [0,4194304] 0
2026-03-10T06:23:14.051 INFO:tasks.workunit.client.0.vm04.stdout:2/797: fsync d1/db/d69/d74/d87/dcf/d8f/d48/d67/f92 0
2026-03-10T06:23:14.055 INFO:tasks.workunit.client.0.vm04.stdout:3/790: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104 0
2026-03-10T06:23:14.058 INFO:tasks.workunit.client.0.vm04.stdout:9/860: symlink d2/d3/d18/de9/dd4/l136 0
2026-03-10T06:23:14.059 INFO:tasks.workunit.client.0.vm04.stdout:3/791: stat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104 0
2026-03-10T06:23:14.060 INFO:tasks.workunit.client.0.vm04.stdout:9/861: read d2/d8/d3a/fd8 [2552,2118] 0
2026-03-10T06:23:14.065 INFO:tasks.workunit.client.0.vm04.stdout:6/824: rmdir d2/d43/d2d/d30/d34/d76/d8a 39
2026-03-10T06:23:14.070 INFO:tasks.workunit.client.0.vm04.stdout:0/867: truncate d0/d5/d25/dd/d5c/d73/f61 4087442 0
2026-03-10T06:23:14.071 INFO:tasks.workunit.client.0.vm04.stdout:1/793: read d0/d8/d46/d7a/fa8 [1913599,80226] 0
2026-03-10T06:23:14.072 INFO:tasks.workunit.client.0.vm04.stdout:1/794: readlink d0/d3/d41/d99/d103/lbf 0
2026-03-10T06:23:14.078 INFO:tasks.workunit.client.0.vm04.stdout:7/781: chown d4/df/d12/d13/db3/d110/cbb 393 1
2026-03-10T06:23:14.078 INFO:tasks.workunit.client.0.vm04.stdout:3/792: readlink d4/da/df/d11/l8c 0
2026-03-10T06:23:14.078 INFO:tasks.workunit.client.0.vm04.stdout:2/798: chown d1/db/d69/d74/d87/dcf/c7f 3038 1
2026-03-10T06:23:14.078 INFO:tasks.workunit.client.0.vm04.stdout:5/786: write d4/d11/d7d/d38/d91/f74 [3671095,113151] 0
2026-03-10T06:23:14.082 INFO:tasks.workunit.client.0.vm04.stdout:2/799: read d1/dae/d11/d14/d9f/ddb/f7a [1251798,36880] 0
2026-03-10T06:23:14.090 INFO:tasks.workunit.client.0.vm04.stdout:9/862: creat d2/d3/d18/de9/d116/f137 x:0 0 0
2026-03-10T06:23:14.090 INFO:tasks.workunit.client.0.vm04.stdout:1/795: symlink d0/d3/d41/l126 0
2026-03-10T06:23:14.090 INFO:tasks.workunit.client.0.vm04.stdout:5/787: mknod d4/d11/d7d/dab/c112 0
2026-03-10T06:23:14.092 INFO:tasks.workunit.client.0.vm04.stdout:0/868: sync
2026-03-10T06:23:14.094 INFO:tasks.workunit.client.0.vm04.stdout:3/793: mkdir d4/d6/d92/def/d105 0
2026-03-10T06:23:14.096 INFO:tasks.workunit.client.0.vm04.stdout:7/782: mknod d4/df/d12/d13/d25/d30/d40/d79/c11b 0
2026-03-10T06:23:14.096 INFO:tasks.workunit.client.0.vm04.stdout:4/826: truncate d2/d32/d5c/d76/dd7/d31/d42/db9/f6e 690958 0
2026-03-10T06:23:14.097 INFO:tasks.workunit.client.0.vm04.stdout:5/788: chown d4/d11/d7d/dae/fb2 65 1
2026-03-10T06:23:14.098 INFO:tasks.workunit.client.0.vm04.stdout:5/789: chown d4/d6/d50/l8c 43142173 1
2026-03-10T06:23:14.107 INFO:tasks.workunit.client.0.vm04.stdout:3/794: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f7d [4194304,4194304] 0
2026-03-10T06:23:14.124 INFO:tasks.workunit.client.0.vm04.stdout:9/863: dread d2/d3/d18/de9/da9/f12f [0,4194304] 0
2026-03-10T06:23:14.132 INFO:tasks.workunit.client.0.vm04.stdout:0/869: mknod d0/d1a/d4d/c123 0
2026-03-10T06:23:14.142 INFO:tasks.workunit.client.0.vm04.stdout:8/822: getdents df/d15/d29/da3 0
2026-03-10T06:23:14.161 INFO:tasks.workunit.client.0.vm04.stdout:7/783: dwrite d4/df/d12/f4c [0,4194304] 0
2026-03-10T06:23:14.164 INFO:tasks.workunit.client.0.vm04.stdout:5/790: creat d4/d11/d7d/d38/d91/d55/d72/f113 x:0 0 0
2026-03-10T06:23:14.168 INFO:tasks.workunit.client.0.vm04.stdout:1/796: mknod d0/d112/c127 0
2026-03-10T06:23:14.176 INFO:tasks.workunit.client.0.vm04.stdout:5/791: dwrite d4/d11/d7d/f31 [0,4194304] 0
2026-03-10T06:23:14.192 INFO:tasks.workunit.client.0.vm04.stdout:6/825: getdents d2/d37/d83 0
2026-03-10T06:23:14.197 INFO:tasks.workunit.client.0.vm04.stdout:7/784: creat d4/df/d12/d13/d25/d8f/f11c x:0 0 0
2026-03-10T06:23:14.209 INFO:tasks.workunit.client.0.vm04.stdout:3/795: mknod d4/da/df/d11/c106 0
2026-03-10T06:23:14.209 INFO:tasks.workunit.client.0.vm04.stdout:2/800: creat d1/dae/d11/d14/d9f/ddb/d94/ff2 x:0 0 0
2026-03-10T06:23:14.223 INFO:tasks.workunit.client.0.vm04.stdout:9/864: symlink d2/d8/d53/d6e/d12b/l138 0
2026-03-10T06:23:14.223 INFO:tasks.workunit.client.0.vm04.stdout:6/826: unlink d2/f5f 0
2026-03-10T06:23:14.225 INFO:tasks.workunit.client.0.vm04.stdout:8/823: truncate df/d15/d2b/f7e 1794884 0
2026-03-10T06:23:14.225 INFO:tasks.workunit.client.0.vm04.stdout:7/785: creat d4/df/d12/dd4/f11d x:0 0 0
2026-03-10T06:23:14.227 INFO:tasks.workunit.client.0.vm04.stdout:3/796: dread - d4/da/df/d11/d5a/d5b/fd9 zero size
2026-03-10T06:23:14.230 INFO:tasks.workunit.client.0.vm04.stdout:0/870: rename d0/d5/d25/dd/d3a/f50 to d0/d5/d25/dd/f124 0
2026-03-10T06:23:14.234 INFO:tasks.workunit.client.0.vm04.stdout:8/824: chown df/d15/d29/da3/db8/lc7 11942 1
2026-03-10T06:23:14.235 INFO:tasks.workunit.client.0.vm04.stdout:5/792: mkdir d4/d6/d80/de5/d114 0
2026-03-10T06:23:14.239 INFO:tasks.workunit.client.0.vm04.stdout:1/797: dwrite d0/d3/d80/f86 [0,4194304] 0
2026-03-10T06:23:14.244 INFO:tasks.workunit.client.0.vm04.stdout:2/801: getdents d1/dae/dd6 0
2026-03-10T06:23:14.250 INFO:tasks.workunit.client.0.vm04.stdout:7/786: truncate d4/df/d12/d13/db3/d110/d9c/db1/ff7 992569 0
2026-03-10T06:23:14.254 INFO:tasks.workunit.client.0.vm04.stdout:2/802: unlink d1/db/d69/d74/d87/dcf/d8f/c30 0
2026-03-10T06:23:14.256 INFO:tasks.workunit.client.0.vm04.stdout:4/827: dread d2/d8/f35 [0,4194304] 0
2026-03-10T06:23:14.257 INFO:tasks.workunit.client.0.vm04.stdout:1/798: dwrite d0/d8/fab [0,4194304] 0
2026-03-10T06:23:14.258 INFO:tasks.workunit.client.0.vm04.stdout:7/787: unlink d4/df/d12/d13/d25/d28/d3a/le5 0
2026-03-10T06:23:14.266 INFO:tasks.workunit.client.0.vm04.stdout:0/871: mknod d0/d5/d25/dd/d3a/d81/df9/c125 0
2026-03-10T06:23:14.268 INFO:tasks.workunit.client.0.vm04.stdout:2/803: mkdir d1/dae/d11/d14/d9f/ddb/df3 0
2026-03-10T06:23:14.268 INFO:tasks.workunit.client.0.vm04.stdout:6/827: creat d2/d3a/f111 x:0 0 0
2026-03-10T06:23:14.277 INFO:tasks.workunit.client.0.vm04.stdout:7/788: dwrite d4/df/d12/dd4/fe1 [0,4194304] 0
2026-03-10T06:23:14.277 INFO:tasks.workunit.client.0.vm04.stdout:1/799: fsync d0/f6a 0
2026-03-10T06:23:14.280 INFO:tasks.workunit.client.0.vm04.stdout:1/800: write d0/d3/d80/f91 [4408085,70243] 0
2026-03-10T06:23:14.289 INFO:tasks.workunit.client.0.vm04.stdout:6/828: truncate d2/d43/d2d/d30/d34/f52 87513 0
2026-03-10T06:23:14.291 INFO:tasks.workunit.client.0.vm04.stdout:9/865: creat d2/d3/d18/de9/f139 x:0 0 0
2026-03-10T06:23:14.291 INFO:tasks.workunit.client.0.vm04.stdout:9/866: readlink d2/lcd 0
2026-03-10T06:23:14.298 INFO:tasks.workunit.client.0.vm04.stdout:2/804: sync
2026-03-10T06:23:14.306 INFO:tasks.workunit.client.0.vm04.stdout:0/872: dread d0/d5/d25/dd/d5c/d73/fa5 [0,4194304] 0
2026-03-10T06:23:14.321 INFO:tasks.workunit.client.0.vm04.stdout:1/801: creat d0/d3/d41/d99/def/f128 x:0 0 0
2026-03-10T06:23:14.342 INFO:tasks.workunit.client.0.vm04.stdout:3/797: dwrite d4/da/df/d11/d5a/db3/fc9 [0,4194304] 0
2026-03-10T06:23:14.352 INFO:tasks.workunit.client.0.vm04.stdout:3/798: chown d4/da/df/d11/d50/fbc 72266 1
2026-03-10T06:23:14.353 INFO:tasks.workunit.client.0.vm04.stdout:5/793: dwrite d4/d6/d37/f39 [4194304,4194304] 0
2026-03-10T06:23:14.367 INFO:tasks.workunit.client.0.vm04.stdout:3/799: dwrite d4/da/df/d11/d50/ffa [0,4194304] 0
2026-03-10T06:23:14.399 INFO:tasks.workunit.client.0.vm04.stdout:7/789: dwrite d4/df/d12/d13/d25/d28/d3a/d58/fb6 [4194304,4194304] 0
2026-03-10T06:23:14.420 INFO:tasks.workunit.client.0.vm04.stdout:8/825: getdents df/d20/d25/d73 0
2026-03-10T06:23:14.420 INFO:tasks.workunit.client.0.vm04.stdout:4/828: link d2/d32/d10b/d93/ff2 d2/d32/d5c/f10e 0
2026-03-10T06:23:14.424 INFO:tasks.workunit.client.0.vm04.stdout:5/794: symlink d4/d6/d50/l115 0
2026-03-10T06:23:14.426 INFO:tasks.workunit.client.0.vm04.stdout:0/873: read d0/d5/d97/dc0/dd8/dff/fa2 [559418,62494] 0
2026-03-10T06:23:14.426 INFO:tasks.workunit.client.0.vm04.stdout:3/800: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/f107 x:0 0 0
2026-03-10T06:23:14.427 INFO:tasks.workunit.client.0.vm04.stdout:7/790: creat d4/df/d12/d13/d25/d30/d40/d50/f11e x:0 0 0
2026-03-10T06:23:14.427 INFO:tasks.workunit.client.0.vm04.stdout:6/829: symlink d2/d3a/ded/l112 0
2026-03-10T06:23:14.428 INFO:tasks.workunit.client.0.vm04.stdout:9/867: symlink d2/de0/d1d/d64/d73/d10d/l13a 0
2026-03-10T06:23:14.430 INFO:tasks.workunit.client.0.vm04.stdout:5/795: creat d4/d11/d7d/d38/d91/d55/db1/f116 x:0 0 0
2026-03-10T06:23:14.431 INFO:tasks.workunit.client.0.vm04.stdout:4/829: symlink d2/d32/d5c/l10f 0
2026-03-10T06:23:14.432 INFO:tasks.workunit.client.0.vm04.stdout:8/826: getdents df/d20/d25/d73 0
2026-03-10T06:23:14.433 INFO:tasks.workunit.client.0.vm04.stdout:6/830: rename d2/d43/d86/cf1 to d2/d3a/d5e/db5/c113 0
2026-03-10T06:23:14.433 INFO:tasks.workunit.client.0.vm04.stdout:6/831: write d2/d43/d2d/fcf [2909304,90] 0
2026-03-10T06:23:14.434 INFO:tasks.workunit.client.0.vm04.stdout:9/868: rmdir d2/d3/d18/de9 39
2026-03-10T06:23:14.435 INFO:tasks.workunit.client.0.vm04.stdout:0/874: mknod d0/d5/c126 0
2026-03-10T06:23:14.437 INFO:tasks.workunit.client.0.vm04.stdout:9/869: chown d2/d3/d18/d39/d11/da5 153628 1
2026-03-10T06:23:14.438 INFO:tasks.workunit.client.0.vm04.stdout:2/805: getdents d1/dae/d11/d14/d4e 0
2026-03-10T06:23:14.439 INFO:tasks.workunit.client.0.vm04.stdout:9/870: chown d2/d3/d18/ce1 105192 1
2026-03-10T06:23:14.440 INFO:tasks.workunit.client.0.vm04.stdout:5/796: unlink d4/d11/d7d/d38/d91/d4c/d98/dc0/cc7 0
2026-03-10T06:23:14.441 INFO:tasks.workunit.client.0.vm04.stdout:6/832: chown d2/d43/d86/l96 237978935 1
2026-03-10T06:23:14.441 INFO:tasks.workunit.client.0.vm04.stdout:3/801: dread d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/f73 [0,4194304] 0
2026-03-10T06:23:14.443 INFO:tasks.workunit.client.0.vm04.stdout:2/806: symlink d1/dae/d11/d14/d9f/ddb/d94/dbb/lf4 0
2026-03-10T06:23:14.443 INFO:tasks.workunit.client.0.vm04.stdout:0/875: creat d0/d1a/d20/df5/d47/d8a/d8d/f127 x:0 0 0
2026-03-10T06:23:14.444 INFO:tasks.workunit.client.0.vm04.stdout:3/802: write d4/d6/d38/dcc/fed [306969,125644] 0
2026-03-10T06:23:14.446 INFO:tasks.workunit.client.0.vm04.stdout:5/797: dread d4/d6/d80/d84/f9c [0,4194304] 0
2026-03-10T06:23:14.447 INFO:tasks.workunit.client.0.vm04.stdout:3/803: write d4/da/df/d11/d5a/d5b/fa3 [1073576,107780] 0
2026-03-10T06:23:14.448 INFO:tasks.workunit.client.0.vm04.stdout:8/827: link df/d15/d2b/d81/d9a/dbe/df0/fff df/d20/d25/d30/dc5/f109 0
2026-03-10T06:23:14.450 INFO:tasks.workunit.client.0.vm04.stdout:0/876: mknod d0/d5/d25/c128 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:8/828: rmdir df/d15/d29 39
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:3/804: mkdir d4/da/df/d11/d5a/db3/d108 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:5/798: mknod d4/d11/d7d/d38/d91/d4c/d9d/c117 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:2/807: creat d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/ff5 x:0 0 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:0/877: mknod d0/d1a/db8/df7/c129 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:5/799: truncate d4/d6/d80/d84/fe2 331834 0
2026-03-10T06:23:14.463 INFO:tasks.workunit.client.0.vm04.stdout:3/805: mknod d4/d6/d91/da1/dd5/c109 0
2026-03-10T06:23:14.465 INFO:tasks.workunit.client.0.vm04.stdout:2/808: symlink d1/db/d69/d74/d87/dcf/d8f/ddc/dce/lf6 0
2026-03-10T06:23:14.466 INFO:tasks.workunit.client.0.vm04.stdout:8/829: dwrite df/d20/d25/d30/dc5/ff9 [0,4194304] 0
2026-03-10T06:23:14.474 INFO:tasks.workunit.client.0.vm04.stdout:5/800: write d4/d11/d7d/d38/d91/d55/f9e [1607868,118905] 0
2026-03-10T06:23:14.482 INFO:tasks.workunit.client.0.vm04.stdout:0/878: link d0/d5/d25/dd/f43 d0/d5/d97/dc0/dd8/dff/d9c/dbf/f12a 0
2026-03-10T06:23:14.483 INFO:tasks.workunit.client.0.vm04.stdout:2/809: dread - d1/db/d69/d74/d87/dcf/d8f/d48/d67/fac zero size
2026-03-10T06:23:14.484 INFO:tasks.workunit.client.0.vm04.stdout:0/879: write d0/d5/d25/dd/d3a/d56/fa7 [989831,73466] 0
2026-03-10T06:23:14.484 INFO:tasks.workunit.client.0.vm04.stdout:9/871: dread d2/d3/d18/d39/f2e [0,4194304] 0
2026-03-10T06:23:14.487 INFO:tasks.workunit.client.0.vm04.stdout:5/801: rmdir d4/d6/d81 39
2026-03-10T06:23:14.489 INFO:tasks.workunit.client.0.vm04.stdout:0/880: mknod d0/d5/d25/c12b 0
2026-03-10T06:23:14.489 INFO:tasks.workunit.client.0.vm04.stdout:5/802: chown d4/d6/d81/fc3 424746 1
2026-03-10T06:23:14.490 INFO:tasks.workunit.client.0.vm04.stdout:2/810: getdents d1/dae/d11/d14/d9f/ddb 0
2026-03-10T06:23:14.501 INFO:tasks.workunit.client.0.vm04.stdout:2/811: chown d1/db/d69/d74/lba 0 1
2026-03-10T06:23:14.501 INFO:tasks.workunit.client.0.vm04.stdout:5/803: creat d4/d11/d7d/d38/d91/d55/d72/f118 x:0 0 0
2026-03-10T06:23:14.501 INFO:tasks.workunit.client.0.vm04.stdout:8/830: dwrite df/d15/d29/da3/faa [0,4194304] 0
2026-03-10T06:23:14.501 INFO:tasks.workunit.client.0.vm04.stdout:0/881: dwrite d0/d1a/d20/df5/d79/d111/f118 [0,4194304] 0
2026-03-10T06:23:14.505 INFO:tasks.workunit.client.0.vm04.stdout:9/872: symlink d2/d3/d18/d39/d46/l13b 0
2026-03-10T06:23:14.506 INFO:tasks.workunit.client.0.vm04.stdout:5/804: dread d4/d11/d7d/d38/f8b [0,4194304] 0
2026-03-10T06:23:14.513 INFO:tasks.workunit.client.0.vm04.stdout:9/873: chown d2/d23/d94/fb5 16624299 1
2026-03-10T06:23:14.513 INFO:tasks.workunit.client.0.vm04.stdout:3/806: sync
2026-03-10T06:23:14.517 INFO:tasks.workunit.client.0.vm04.stdout:2/812: dread d1/db/d69/d74/d87/dcf/d8f/f25 [0,4194304] 0
2026-03-10T06:23:14.530 INFO:tasks.workunit.client.0.vm04.stdout:8/831: fsync df/d15/d29/da3/db8/dc1/d97/d67/f92 0
2026-03-10T06:23:14.537 INFO:tasks.workunit.client.0.vm04.stdout:5/805: mknod d4/d11/d7d/d38/d91/d55/d72/c119 0
2026-03-10T06:23:14.538 INFO:tasks.workunit.client.0.vm04.stdout:0/882: symlink d0/d1a/d20/l12c 0
2026-03-10T06:23:14.538 INFO:tasks.workunit.client.0.vm04.stdout:9/874: creat d2/d3/d18/d39/d46/f13c x:0 0 0
2026-03-10T06:23:14.541 INFO:tasks.workunit.client.0.vm04.stdout:1/802: dwrite d0/d8/d46/d7a/d95/fa7 [0,4194304] 0
2026-03-10T06:23:14.556 INFO:tasks.workunit.client.0.vm04.stdout:4/830: write d2/d46/f61 [4527219,116719] 0
2026-03-10T06:23:14.565 INFO:tasks.workunit.client.0.vm04.stdout:2/813: mknod d1/dae/d11/d14/d9f/ddb/d94/cf7 0
2026-03-10T06:23:14.570 INFO:tasks.workunit.client.0.vm04.stdout:7/791: truncate d4/df/d12/d13/d25/d30/d40/d79/f89 6782148 0
2026-03-10T06:23:14.572 INFO:tasks.workunit.client.0.vm04.stdout:6/833: dwrite d2/fa0 [0,4194304] 0
2026-03-10T06:23:14.576 INFO:tasks.workunit.client.0.vm04.stdout:0/883: fdatasync d0/d1a/d20/d38/fb4 0
2026-03-10T06:23:14.576 INFO:tasks.workunit.client.0.vm04.stdout:8/832: symlink df/d20/l10a 0
2026-03-10T06:23:14.583 INFO:tasks.workunit.client.0.vm04.stdout:2/814: creat d1/dae/d11/d14/d9f/ddb/d94/dbb/ff8 x:0 0 0
2026-03-10T06:23:14.596 INFO:tasks.workunit.client.0.vm04.stdout:6/834: rmdir d2/d43/d2d/d7c/daa 39
2026-03-10T06:23:14.598 INFO:tasks.workunit.client.0.vm04.stdout:3/807: write d4/da/df/d11/d5a/f8b [890895,94687] 0
2026-03-10T06:23:14.600 INFO:tasks.workunit.client.0.vm04.stdout:9/875: dwrite d2/d3/f43 [4194304,4194304] 0
2026-03-10T06:23:14.602 INFO:tasks.workunit.client.0.vm04.stdout:9/876: chown d2/d3/d18/d39/f20 709777979 1
2026-03-10T06:23:14.602 INFO:tasks.workunit.client.0.vm04.stdout:1/803: mknod d0/d8/d46/c129 0
2026-03-10T06:23:14.605 INFO:tasks.workunit.client.0.vm04.stdout:1/804: stat d0/d3/d41/d99/def/f118 0
2026-03-10T06:23:14.609 INFO:tasks.workunit.client.0.vm04.stdout:7/792: creat d4/df/d12/d13/d25/d30/d40/d50/df6/d114/f11f x:0 0 0
2026-03-10T06:23:14.615 INFO:tasks.workunit.client.0.vm04.stdout:3/808: truncate d4/fa7 726518 0
2026-03-10T06:23:14.616 INFO:tasks.workunit.client.0.vm04.stdout:3/809: chown d4/da/df/d11/d5a/d5b/ddf/d21/d2c/c94 5197746 1
2026-03-10T06:23:14.620 INFO:tasks.workunit.client.0.vm04.stdout:4/831: link d2/d32/ce8 d2/d46/c110 0
2026-03-10T06:23:14.623 INFO:tasks.workunit.client.0.vm04.stdout:2/815: write d1/db/d69/f77 [1996972,39687] 0
2026-03-10T06:23:14.627 INFO:tasks.workunit.client.0.vm04.stdout:6/835: dwrite d2/d37/d6e/f77 [0,4194304] 0
2026-03-10T06:23:14.628 INFO:tasks.workunit.client.0.vm04.stdout:9/877: write d2/d3/d18/de9/f139 [227970,103290] 0
2026-03-10T06:23:14.630 INFO:tasks.workunit.client.0.vm04.stdout:8/833: write df/d20/f42 [3740884,81788] 0
2026-03-10T06:23:14.636 INFO:tasks.workunit.client.0.vm04.stdout:1/805: write d0/d3/f62 [3377481,77035] 0
2026-03-10T06:23:14.637 INFO:tasks.workunit.client.0.vm04.stdout:9/878: dread d2/d3/d18/de9/f139 [0,4194304] 0
2026-03-10T06:23:14.638 INFO:tasks.workunit.client.0.vm04.stdout:9/879: chown d2/d3/d18/de9/d5a/fee 110 1
2026-03-10T06:23:14.644 INFO:tasks.workunit.client.0.vm04.stdout:0/884: fsync d0/d5/d25/dd/d5c/d73/f4f 0
2026-03-10T06:23:14.653 INFO:tasks.workunit.client.0.vm04.stdout:7/793: unlink d4/df/d12/d13/c49 0
2026-03-10T06:23:14.654 INFO:tasks.workunit.client.0.vm04.stdout:5/806: getdents d4/d11/d7d/d38/d91/d4c 0
2026-03-10T06:23:14.655 INFO:tasks.workunit.client.0.vm04.stdout:5/807: stat d4/d11/d7d/d38/d91/d55/fba 0
2026-03-10T06:23:14.658 INFO:tasks.workunit.client.0.vm04.stdout:4/832: creat d2/d46/f111 x:0 0 0
2026-03-10T06:23:14.658 INFO:tasks.workunit.client.0.vm04.stdout:4/833: readlink d2/d32/l36 0
2026-03-10T06:23:14.662 INFO:tasks.workunit.client.0.vm04.stdout:5/808: dread d4/d11/f32 [0,4194304] 0
2026-03-10T06:23:14.662 INFO:tasks.workunit.client.0.vm04.stdout:5/809: chown d4/d11/d7d/dae 59743347 1
2026-03-10T06:23:14.665 INFO:tasks.workunit.client.0.vm04.stdout:2/816: mknod d1/db/d69/d74/d87/dcf/cf9 0
2026-03-10T06:23:14.670 INFO:tasks.workunit.client.0.vm04.stdout:1/806: creat d0/d8/d46/de4/dec/f12a x:0 0 0
2026-03-10T06:23:14.671 INFO:tasks.workunit.client.0.vm04.stdout:9/880: creat d2/d3/d18/d39/d46/d55/f13d x:0 0 0
2026-03-10T06:23:14.672 INFO:tasks.workunit.client.0.vm04.stdout:0/885: rename d0/d5/d25/dd/d3a/d81 to d0/d1a/d20/dc2/d12d 0
2026-03-10T06:23:14.674 INFO:tasks.workunit.client.0.vm04.stdout:3/810: dwrite d4/dba/fda [0,4194304] 0
2026-03-10T06:23:14.683 INFO:tasks.workunit.client.0.vm04.stdout:7/794: chown d4/df/d12/d13/l69 86231 1
2026-03-10T06:23:14.691 INFO:tasks.workunit.client.0.vm04.stdout:3/811: dread d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fb0 [0,4194304] 0
2026-03-10T06:23:14.692 INFO:tasks.workunit.client.0.vm04.stdout:5/810: symlink d4/d6/d80/d84/l11a 0
2026-03-10T06:23:14.695 INFO:tasks.workunit.client.0.vm04.stdout:8/834: dwrite df/d20/d25/f2a [4194304,4194304] 0
2026-03-10T06:23:14.702 INFO:tasks.workunit.client.0.vm04.stdout:2/817: fdatasync d1/db/d69/d74/d87/dcf/d8f/d48/d67/fac 0
2026-03-10T06:23:14.703 INFO:tasks.workunit.client.0.vm04.stdout:9/881: mknod d2/de0/d1d/c13e 0
2026-03-10T06:23:14.704 INFO:tasks.workunit.client.0.vm04.stdout:9/882: chown d2/lcd 492 1
2026-03-10T06:23:14.704 INFO:tasks.workunit.client.0.vm04.stdout:9/883: chown d2/d3/d18/d39/f20 2 1
2026-03-10T06:23:14.706 INFO:tasks.workunit.client.0.vm04.stdout:1/807: creat d0/d3/d41/dcb/f12b x:0 0 0
2026-03-10T06:23:14.707 INFO:tasks.workunit.client.0.vm04.stdout:1/808: dread d0/d8/d46/f93 [0,4194304] 0
2026-03-10T06:23:14.708 INFO:tasks.workunit.client.0.vm04.stdout:0/886: mkdir d0/d1a/d4d/d12e 0
2026-03-10T06:23:14.711 INFO:tasks.workunit.client.0.vm04.stdout:7/795: fdatasync d4/df/d12/d13/db3/fcf 0
2026-03-10T06:23:14.716 INFO:tasks.workunit.client.0.vm04.stdout:2/818: fdatasync d1/f10 0
2026-03-10T06:23:14.716 INFO:tasks.workunit.client.0.vm04.stdout:9/884: fdatasync d2/d3/d18/d39/d11/f71 0
2026-03-10T06:23:14.716 INFO:tasks.workunit.client.0.vm04.stdout:1/809: mknod d0/d3/d41/dcb/dee/c12c 0
2026-03-10T06:23:14.716 INFO:tasks.workunit.client.0.vm04.stdout:1/810: stat d0/d3/d41/d4b/f6b 0
2026-03-10T06:23:14.716 INFO:tasks.workunit.client.0.vm04.stdout:7/796: dwrite d4/df/d12/d13/db3/d110/d9c/db1/ff7 [0,4194304] 0
2026-03-10T06:23:14.718 INFO:tasks.workunit.client.0.vm04.stdout:7/797: dwrite d4/df/d12/d13/fac [0,4194304] 0
2026-03-10T06:23:14.725 INFO:tasks.workunit.client.0.vm04.stdout:5/811: link d4/d11/d7d/d38/d91/d55/d72/f113 d4/d6/d81/f11b 0
2026-03-10T06:23:14.733 INFO:tasks.workunit.client.0.vm04.stdout:0/887: dwrite d0/d5/d25/dd/d5c/d73/fa5 [0,4194304] 0
2026-03-10T06:23:14.738 INFO:tasks.workunit.client.0.vm04.stdout:0/888: chown d0/d1a/d20/df5/d47/d8a/fbe 33654991 1
2026-03-10T06:23:14.742 INFO:tasks.workunit.client.0.vm04.stdout:0/889: dread d0/d5/d97/dc0/fdb [0,4194304] 0
2026-03-10T06:23:14.745 INFO:tasks.workunit.client.0.vm04.stdout:5/812: creat d4/d6/d81/db6/f11c x:0 0 0
2026-03-10T06:23:14.746 INFO:tasks.workunit.client.0.vm04.stdout:5/813: dread d4/d11/d7d/d38/f8b [0,4194304] 0
2026-03-10T06:23:14.752 INFO:tasks.workunit.client.0.vm04.stdout:4/834: dwrite d2/d46/fcb [0,4194304] 0
2026-03-10T06:23:14.764 INFO:tasks.workunit.client.0.vm04.stdout:6/836: dwrite d2/d43/d2d/d30/d34/f52 [0,4194304] 0
2026-03-10T06:23:14.768 INFO:tasks.workunit.client.0.vm04.stdout:6/837: truncate d2/d37/d6e/f105 958340 0
2026-03-10T06:23:14.781 INFO:tasks.workunit.client.0.vm04.stdout:3/812: write d4/d6/dc/f22 [2733330,74427] 0
2026-03-10T06:23:14.789 INFO:tasks.workunit.client.0.vm04.stdout:3/813: symlink d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/l10a 0
2026-03-10T06:23:14.796 INFO:tasks.workunit.client.0.vm04.stdout:8/835: write df/d15/d29/da3/db8/dc1/dac/fc3 [816916,59579] 0
2026-03-10T06:23:14.797 INFO:tasks.workunit.client.0.vm04.stdout:8/836: truncate df/d15/d29/d89/ff7 481804 0
2026-03-10T06:23:14.798 INFO:tasks.workunit.client.0.vm04.stdout:4/835: link d2/d32/d5c/cfa d2/d32/d5c/de2/d102/c112 0
2026-03-10T06:23:14.802 INFO:tasks.workunit.client.0.vm04.stdout:4/836: dwrite d2/d8/f9f [0,4194304] 0
2026-03-10T06:23:14.809 INFO:tasks.workunit.client.0.vm04.stdout:4/837: dwrite d2/d46/f61 [0,4194304] 0
2026-03-10T06:23:14.809 INFO:tasks.workunit.client.0.vm04.stdout:4/838: stat d2/d32/d10b/f52 0
2026-03-10T06:23:14.820 INFO:tasks.workunit.client.0.vm04.stdout:2/819: dwrite d1/dae/d2c/f4a [0,4194304] 0
2026-03-10T06:23:14.825 INFO:tasks.workunit.client.0.vm04.stdout:9/885: dwrite d2/d8/d53/d6e/d89/fbb [0,4194304] 0
2026-03-10T06:23:14.830 INFO:tasks.workunit.client.0.vm04.stdout:1/811: dwrite d0/d8/d46/d7a/fa8 [0,4194304] 0
2026-03-10T06:23:14.837 INFO:tasks.workunit.client.0.vm04.stdout:7/798: dwrite d4/fa7 [0,4194304] 0
2026-03-10T06:23:14.844 INFO:tasks.workunit.client.0.vm04.stdout:6/838: write d2/d43/d2d/d30/f4a [2057467,4654] 0
2026-03-10T06:23:14.846 INFO:tasks.workunit.client.0.vm04.stdout:5/814: dwrite d4/d6/d81/fc3 [0,4194304] 0
2026-03-10T06:23:14.855 INFO:tasks.workunit.client.0.vm04.stdout:0/890: dwrite d0/d5/d25/dd/d3a/d56/f84 [0,4194304] 0
2026-03-10T06:23:14.855 INFO:tasks.workunit.client.0.vm04.stdout:0/891: chown d0/d1a/db8/df7 105 1
2026-03-10T06:23:14.855 INFO:tasks.workunit.client.0.vm04.stdout:0/892: chown d0/d5/d25/dd/d3a/d56 55340 1
2026-03-10T06:23:14.855 INFO:tasks.workunit.client.0.vm04.stdout:0/893: chown d0/d5/d25/dd/d5c/l7e 3280 1
2026-03-10T06:23:14.859 INFO:tasks.workunit.client.0.vm04.stdout:3/814: rename d4/c3e to d4/da/df/d11/d5a/d5b/ddf/c10b 0
2026-03-10T06:23:14.863 INFO:tasks.workunit.client.0.vm04.stdout:8/837: unlink df/fda 0
2026-03-10T06:23:14.875 INFO:tasks.workunit.client.0.vm04.stdout:2/820: unlink d1/db/f27 0
2026-03-10T06:23:14.875 INFO:tasks.workunit.client.0.vm04.stdout:1/812: mknod d0/d8/d46/de4/dec/c12d 0
2026-03-10T06:23:14.875 INFO:tasks.workunit.client.0.vm04.stdout:1/813: write d0/f29 [3333866,34474] 0
2026-03-10T06:23:14.885 INFO:tasks.workunit.client.0.vm04.stdout:5/815: unlink d4/c4b 0
2026-03-10T06:23:14.890 INFO:tasks.workunit.client.0.vm04.stdout:0/894: dread - d0/d1a/d20/df5/d79/fea zero size
2026-03-10T06:23:14.895 INFO:tasks.workunit.client.0.vm04.stdout:8/838: creat df/d20/d25/f10b x:0 0 0
2026-03-10T06:23:14.896 INFO:tasks.workunit.client.0.vm04.stdout:8/839: fdatasync df/d15/d29/f6f 0
2026-03-10T06:23:14.900 INFO:tasks.workunit.client.0.vm04.stdout:8/840: dwrite df/d15/d2b/d81/fc4 [0,4194304] 0
2026-03-10T06:23:14.906 INFO:tasks.workunit.client.0.vm04.stdout:8/841: dwrite df/d20/d25/d73/ff4 [0,4194304] 0
2026-03-10T06:23:14.910 INFO:tasks.workunit.client.0.vm04.stdout:8/842: stat df/d15/d29/da3/db8/dc1/dac 0
2026-03-10T06:23:14.916 INFO:tasks.workunit.client.0.vm04.stdout:0/895: creat d0/f12f x:0 0 0
2026-03-10T06:23:14.930 INFO:tasks.workunit.client.0.vm04.stdout:4/839: getdents d2/d32/d5c/d76 0
2026-03-10T06:23:14.933 INFO:tasks.workunit.client.0.vm04.stdout:1/814: link d0/d3/d41/d4b/d5b/fd9 d0/d3/d41/dcb/f12e 0
2026-03-10T06:23:14.934 INFO:tasks.workunit.client.0.vm04.stdout:1/815: read d0/d8/d46/d7a/d95/fa7 [2751819,128140] 0
2026-03-10T06:23:14.937 INFO:tasks.workunit.client.0.vm04.stdout:6/839: getdents d2/d43/d2d/d30/d34/d76/d7e 0
2026-03-10T06:23:14.938 INFO:tasks.workunit.client.0.vm04.stdout:4/840: dread d2/f4 [0,4194304] 0
2026-03-10T06:23:14.939 INFO:tasks.workunit.client.0.vm04.stdout:4/841: chown d2/d32/d5c/d76/dd7/d56/f7f 32957260 1
2026-03-10T06:23:14.942 INFO:tasks.workunit.client.0.vm04.stdout:4/842: dread d2/d46/fcb [0,4194304] 0
2026-03-10T06:23:14.944 INFO:tasks.workunit.client.0.vm04.stdout:1/816: mkdir d0/d3/d80/d12f 0
2026-03-10T06:23:14.944 INFO:tasks.workunit.client.0.vm04.stdout:1/817: chown d0/d3/d80/d12f 607 1
2026-03-10T06:23:14.948 INFO:tasks.workunit.client.0.vm04.stdout:4/843: dwrite d2/d32/d5c/d4f/f105 [0,4194304] 0
2026-03-10T06:23:14.961 INFO:tasks.workunit.client.0.vm04.stdout:4/844: mknod d2/d32/d5c/d4f/c113 0
2026-03-10T06:23:14.972 INFO:tasks.workunit.client.0.vm04.stdout:1/818: dread d0/f7c [0,4194304] 0
2026-03-10T06:23:14.973 INFO:tasks.workunit.client.0.vm04.stdout:1/819: fdatasync d0/d8/d46/db3/ff7 0
2026-03-10T06:23:14.974 INFO:tasks.workunit.client.0.vm04.stdout:1/820: mknod d0/d3/d41/c130 0
2026-03-10T06:23:14.975 INFO:tasks.workunit.client.0.vm04.stdout:1/821: chown d0/d8/f105 20136128 1
2026-03-10T06:23:15.014 INFO:tasks.workunit.client.0.vm04.stdout:8/843: sync
2026-03-10T06:23:15.014 INFO:tasks.workunit.client.0.vm04.stdout:4/845: dread d2/f12 [0,4194304] 0
2026-03-10T06:23:15.015 INFO:tasks.workunit.client.0.vm04.stdout:8/844: symlink df/d15/d2b/d81/l10c 0
2026-03-10T06:23:15.017 INFO:tasks.workunit.client.0.vm04.stdout:8/845: creat df/d15/d29/f10d x:0 0 0
2026-03-10T06:23:15.022 INFO:tasks.workunit.client.0.vm04.stdout:8/846: dread f9 [0,4194304] 0
2026-03-10T06:23:15.027 INFO:tasks.workunit.client.0.vm04.stdout:4/846: sync
2026-03-10T06:23:15.030 INFO:tasks.workunit.client.0.vm04.stdout:7/799: write d4/fb2 [159987,97703] 0
2026-03-10T06:23:15.031 INFO:tasks.workunit.client.0.vm04.stdout:7/800: chown d4/df/f107 1507467 1
2026-03-10T06:23:15.039 INFO:tasks.workunit.client.0.vm04.stdout:4/847: chown d2/d32/d5c/d76/dd7/d31/d42/db9/l80 952 1
2026-03-10T06:23:15.042 INFO:tasks.workunit.client.0.vm04.stdout:2/821: dwrite d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/ff5 [0,4194304] 0
2026-03-10T06:23:15.045 INFO:tasks.workunit.client.0.vm04.stdout:2/822: fsync d1/dae/d2c/f33 0
2026-03-10T06:23:15.054 INFO:tasks.workunit.client.0.vm04.stdout:8/847: symlink df/d15/d29/da3/db8/dc1/d97/d67/l10e 0
2026-03-10T06:23:15.054 INFO:tasks.workunit.client.0.vm04.stdout:9/886: dwrite d2/d3/d18/d39/f2e [0,4194304] 0
2026-03-10T06:23:15.058 INFO:tasks.workunit.client.0.vm04.stdout:4/848: mknod d2/d32/d10b/dc8/c114 0
2026-03-10T06:23:15.061 INFO:tasks.workunit.client.0.vm04.stdout:5/816: truncate d4/fb0 2657927 0
2026-03-10T06:23:15.062 INFO:tasks.workunit.client.0.vm04.stdout:0/896: truncate d0/d1a/d20/df5/d79/d111/f118 2350772 0
2026-03-10T06:23:15.064 INFO:tasks.workunit.client.0.vm04.stdout:3/815: dwrite d4/d6/dc/f37 [0,4194304] 0
2026-03-10T06:23:15.080 INFO:tasks.workunit.client.0.vm04.stdout:7/801: rename d4/df/d12/c83 to d4/df/d12/d13/d25/d28/d3a/d100/d106/c120 0
2026-03-10T06:23:15.080 INFO:tasks.workunit.client.0.vm04.stdout:6/840: dwrite d2/d37/d83/f8e [0,4194304] 0
2026-03-10T06:23:15.081 INFO:tasks.workunit.client.0.vm04.stdout:7/802: write d4/df/d12/d13/db3/d110/f10c [2945156,43785] 0
2026-03-10T06:23:15.098 INFO:tasks.workunit.client.0.vm04.stdout:1/822: dwrite d0/d3/d41/fa3 [0,4194304] 0
2026-03-10T06:23:15.098 INFO:tasks.workunit.client.0.vm04.stdout:0/897: dread - d0/d5/d25/dd/d92/fc9 zero size
2026-03-10T06:23:15.104 INFO:tasks.workunit.client.0.vm04.stdout:8/848: rename df/d20/d25/d87/l96 to df/d20/d25/d30/d55/l10f 0
2026-03-10T06:23:15.107 INFO:tasks.workunit.client.0.vm04.stdout:8/849: fsync df/f77 0
2026-03-10T06:23:15.109 INFO:tasks.workunit.client.0.vm04.stdout:6/841: read d2/d43/d86/fdb [14881,65947] 0
2026-03-10T06:23:15.123 INFO:tasks.workunit.client.0.vm04.stdout:7/803: symlink d4/df/d12/d34/d63/l121 0
2026-03-10T06:23:15.128 INFO:tasks.workunit.client.0.vm04.stdout:4/849: symlink d2/d32/d5c/d76/dd7/d31/d42/db9/def/l115 0
2026-03-10T06:23:15.132 INFO:tasks.workunit.client.0.vm04.stdout:0/898: symlink d0/d5/d97/l130 0
2026-03-10T06:23:15.137 INFO:tasks.workunit.client.0.vm04.stdout:3/816: rename d4/da/df/df9 to d4/d6/d99/d10c 0
2026-03-10T06:23:15.138 INFO:tasks.workunit.client.0.vm04.stdout:8/850: mknod df/d20/d25/d73/c110 0
2026-03-10T06:23:15.139 INFO:tasks.workunit.client.0.vm04.stdout:6/842: unlink d2/d3a/d5e/f64 0
2026-03-10T06:23:15.139 INFO:tasks.workunit.client.0.vm04.stdout:7/804: symlink d4/df/d12/d34/l122 0
2026-03-10T06:23:15.139 INFO:tasks.workunit.client.0.vm04.stdout:4/850: truncate d2/d46/f3d 962314 0
2026-03-10T06:23:15.145 INFO:tasks.workunit.client.0.vm04.stdout:5/817: creat d4/d11/d7d/f11d x:0 0 0
2026-03-10T06:23:15.145 INFO:tasks.workunit.client.0.vm04.stdout:5/818: stat d4/d6/d81/db6 0
2026-03-10T06:23:15.148 INFO:tasks.workunit.client.0.vm04.stdout:9/887: getdents d2/d8/d3a 0
2026-03-10T06:23:15.148 INFO:tasks.workunit.client.0.vm04.stdout:6/843: rename d2/d43/f35 to d2/d43/d2d/d30/dc0/f114 0
2026-03-10T06:23:15.149 INFO:tasks.workunit.client.0.vm04.stdout:4/851: truncate d2/d32/d5c/d4f/f85 4617632 0
2026-03-10T06:23:15.150 INFO:tasks.workunit.client.0.vm04.stdout:6/844: write d2/d43/d2d/d30/dc0/fcd [2804059,107930] 0
2026-03-10T06:23:15.150 INFO:tasks.workunit.client.0.vm04.stdout:2/823: getdents d1/dae/d11/d14/d9f/ddb/d94/de5 0
2026-03-10T06:23:15.151 INFO:tasks.workunit.client.0.vm04.stdout:8/851: mkdir df/d20/d25/d30/d55/de7/d103/d111 0
2026-03-10T06:23:15.153 INFO:tasks.workunit.client.0.vm04.stdout:7/805: symlink d4/df/d12/d21/l123 0
2026-03-10T06:23:15.155 INFO:tasks.workunit.client.0.vm04.stdout:9/888: mkdir d2/d3/d18/de9/da9/d13f 0
2026-03-10T06:23:15.157 INFO:tasks.workunit.client.0.vm04.stdout:6/845: symlink d2/d3a/d9c/l115 0
2026-03-10T06:23:15.158 INFO:tasks.workunit.client.0.vm04.stdout:3/817: rmdir d4/d6/d92/def/d105 0
2026-03-10T06:23:15.158 INFO:tasks.workunit.client.0.vm04.stdout:8/852: chown df/d15/d2b/f7e 594 1
2026-03-10T06:23:15.158 INFO:tasks.workunit.client.0.vm04.stdout:5/819: creat d4/d11/d7d/dae/df3/f11e x:0 0 0
2026-03-10T06:23:15.160 INFO:tasks.workunit.client.0.vm04.stdout:5/820: fsync d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/ff1 0
2026-03-10T06:23:15.160 INFO:tasks.workunit.client.0.vm04.stdout:4/852: symlink d2/d32/d10b/l116 0
2026-03-10T06:23:15.161 INFO:tasks.workunit.client.0.vm04.stdout:7/806: truncate d4/f6 4158150 0
2026-03-10T06:23:15.161 INFO:tasks.workunit.client.0.vm04.stdout:8/853: fdatasync df/d15/d29/da3/db8/dc1/d97/fb1 0
2026-03-10T06:23:15.161 INFO:tasks.workunit.client.0.vm04.stdout:9/889: mkdir d2/d8/d53/d6e/d89/d140 0
2026-03-10T06:23:15.162 INFO:tasks.workunit.client.0.vm04.stdout:9/890: chown d2/d3/d18/d39/d11/da5 63649132 1
2026-03-10T06:23:15.162 INFO:tasks.workunit.client.0.vm04.stdout:9/891: fdatasync d2/d3/f2a 0
2026-03-10T06:23:15.167 INFO:tasks.workunit.client.0.vm04.stdout:6/846: creat d2/d37/d6e/f116 x:0 0 0
2026-03-10T06:23:15.167 INFO:tasks.workunit.client.0.vm04.stdout:8/854: readlink df/d20/d25/d30/dc5/lcf 0
2026-03-10T06:23:15.168 INFO:tasks.workunit.client.0.vm04.stdout:2/824: write d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f65 [491390,97433] 0
2026-03-10T06:23:15.175 INFO:tasks.workunit.client.0.vm04.stdout:5/821: dread d4/d11/f32 [0,4194304] 0
2026-03-10T06:23:15.176 INFO:tasks.workunit.client.0.vm04.stdout:2/825: fsync d1/dae/d11/d14/f1d 0
2026-03-10T06:23:15.179 INFO:tasks.workunit.client.0.vm04.stdout:4/853: dread d2/d32/d5c/f6a [0,4194304] 0
2026-03-10T06:23:15.198 INFO:tasks.workunit.client.0.vm04.stdout:7/807: truncate d4/df/d12/d13/db3/d110/f4d 1704317 0
2026-03-10T06:23:15.198 INFO:tasks.workunit.client.0.vm04.stdout:9/892: dwrite d2/d3/d18/de9/da9/f12f [0,4194304] 0
2026-03-10T06:23:15.198 INFO:tasks.workunit.client.0.vm04.stdout:5/822: dwrite d4/d11/d7d/d38/d91/dda/fe4 [0,4194304] 0
2026-03-10T06:23:15.198 INFO:tasks.workunit.client.0.vm04.stdout:5/823: stat d4/d11/d7d/d38/d91/cea 0
2026-03-10T06:23:15.198 INFO:tasks.workunit.client.0.vm04.stdout:5/824: chown d4/d11/l14 86 1
2026-03-10T06:23:15.205 INFO:tasks.workunit.client.0.vm04.stdout:6/847: dread d2/d43/d2d/fcf [0,4194304] 0
2026-03-10T06:23:15.218 INFO:tasks.workunit.client.0.vm04.stdout:6/848: dwrite d2/fe0 [0,4194304] 0
2026-03-10T06:23:15.218 INFO:tasks.workunit.client.0.vm04.stdout:8/855: dread df/d20/d25/d30/f51 [0,4194304] 0
2026-03-10T06:23:15.223 INFO:tasks.workunit.client.0.vm04.stdout:9/893: creat d2/de0/d1d/d64/d73/d10d/f141 x:0 0 0
2026-03-10T06:23:15.225 INFO:tasks.workunit.client.0.vm04.stdout:4/854: getdents d2/d32/d94/d99/d101 0
2026-03-10T06:23:15.225 INFO:tasks.workunit.client.0.vm04.stdout:7/808: readlink d4/df/d12/d13/db3/d110/lee 0
2026-03-10T06:23:15.226 INFO:tasks.workunit.client.0.vm04.stdout:9/894: chown d2/d3/d18/d39/d11/da5/df5/ffc 0 1
2026-03-10T06:23:15.229 INFO:tasks.workunit.client.0.vm04.stdout:9/895: fdatasync d2/d3/d18/d39/d46/d84/fd5 0
2026-03-10T06:23:15.230 INFO:tasks.workunit.client.0.vm04.stdout:9/896: write d2/de5/f12a [359265,109752] 0
2026-03-10T06:23:15.234 INFO:tasks.workunit.client.0.vm04.stdout:9/897: creat d2/d3/d18/d39/d46/d55/dc3/f142 x:0 0 0
2026-03-10T06:23:15.234 INFO:tasks.workunit.client.0.vm04.stdout:9/898: chown d2/d3/d18/de9/de7/l118 0 1
2026-03-10T06:23:15.235 INFO:tasks.workunit.client.0.vm04.stdout:9/899: fdatasync d2/de0/d1d/d64/d73/d10d/f141 0
2026-03-10T06:23:15.351 INFO:tasks.workunit.client.0.vm04.stdout:1/823: write d0/d3/d41/d4b/fd3 [2123359,37468] 0
2026-03-10T06:23:15.358 INFO:tasks.workunit.client.0.vm04.stdout:0/899: truncate d0/d5/d97/dc0/dd8/dff/fd9 3564867 0
2026-03-10T06:23:15.378 INFO:tasks.workunit.client.0.vm04.stdout:0/900: creat d0/d5/d97/dc0/f131 x:0 0 0
2026-03-10T06:23:15.380 INFO:tasks.workunit.client.0.vm04.stdout:0/901: read - d0/d1a/d20/df5/d47/d8a/d8d/f127 zero size
2026-03-10T06:23:15.386 INFO:tasks.workunit.client.0.vm04.stdout:0/902: rmdir d0/d5/d97/dc0/dd8/dff/d59 39
2026-03-10T06:23:15.388 INFO:tasks.workunit.client.0.vm04.stdout:3/818: dwrite f1 [4194304,4194304] 0
2026-03-10T06:23:15.395 INFO:tasks.workunit.client.0.vm04.stdout:2/826: dwrite d1/dae/d11/f16 [4194304,4194304] 0
2026-03-10T06:23:15.404 INFO:tasks.workunit.client.0.vm04.stdout:3/819: dwrite d4/da/df/d11/d5a/db3/fbb [0,4194304] 0
2026-03-10T06:23:15.451 INFO:tasks.workunit.client.0.vm04.stdout:3/820: link d4/d6/d99/f80 d4/d6/d91/da1/f10d 0
2026-03-10T06:23:15.457 INFO:tasks.workunit.client.0.vm04.stdout:3/821: mknod d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/c10e 0
2026-03-10T06:23:15.466 INFO:tasks.workunit.client.0.vm04.stdout:3/822: fsync d4/da/df/d11/d5a/f8b 0
2026-03-10T06:23:15.466 INFO:tasks.workunit.client.0.vm04.stdout:3/823: symlink d4/d6/d91/da1/l10f 0
2026-03-10T06:23:15.504 INFO:tasks.workunit.client.0.vm04.stdout:5/825: write d4/d6/f8 [1098801,108000] 0
2026-03-10T06:23:15.508 INFO:tasks.workunit.client.0.vm04.stdout:6/849: dwrite d2/d3a/d5e/f10e [0,4194304] 0
2026-03-10T06:23:15.519 INFO:tasks.workunit.client.0.vm04.stdout:6/850: unlink d2/d3a/d5e/db5/cc7 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:9/900: write d2/d8/d3a/f51 [510983,51836] 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:9/901: chown d2/d8/d3a 159 1
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:7/809: dwrite d4/df/d12/d13/db3/ded/f101 [0,4194304] 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:4/855: dwrite d2/d32/d5c/d76/f95 [0,4194304] 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:6/851: dwrite d2/d43/d2d/d30/d34/f52 [0,4194304] 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:4/856: creat d2/d32/d10b/dc8/f117 x:0 0 0
2026-03-10T06:23:15.541 INFO:tasks.workunit.client.0.vm04.stdout:6/852: mknod d2/d37/d83/c117 0
2026-03-10T06:23:15.547 INFO:tasks.workunit.client.0.vm04.stdout:6/853: symlink d2/d43/d2d/d7c/daa/l118 0
2026-03-10T06:23:15.552 INFO:tasks.workunit.client.0.vm04.stdout:8/856: rename df/d15/d29/da3/db8/dc1/d97/d67/l7c to df/d20/d25/l112 0
2026-03-10T06:23:15.552 INFO:tasks.workunit.client.0.vm04.stdout:8/857: chown df/d20/f42 525024359 1
2026-03-10T06:23:15.556 INFO:tasks.workunit.client.0.vm04.stdout:4/857: link d2/d32/d5c/d76/dd7/d56/f109 d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107/f118 0
2026-03-10T06:23:15.556 INFO:tasks.workunit.client.0.vm04.stdout:8/858: mkdir df/d15/d2b/d81/de1/d113 0
2026-03-10T06:23:15.557 INFO:tasks.workunit.client.0.vm04.stdout:1/824: link d0/d8/d46/db3/dd2/cd7 d0/d3/d41/d4b/c131 0
2026-03-10T06:23:15.558 INFO:tasks.workunit.client.0.vm04.stdout:4/858: dread d2/d32/d5c/f6a [0,4194304] 0
2026-03-10T06:23:15.560 INFO:tasks.workunit.client.0.vm04.stdout:8/859: creat df/d15/d29/da3/db8/dc1/dac/f114 x:0 0 0
2026-03-10T06:23:15.561 INFO:tasks.workunit.client.0.vm04.stdout:1/825: creat d0/d8/d46/d7a/d95/dc5/f132 x:0 0 0
2026-03-10T06:23:15.562 INFO:tasks.workunit.client.0.vm04.stdout:8/860: stat df/d15/f45 0
2026-03-10T06:23:15.564 INFO:tasks.workunit.client.0.vm04.stdout:4/859: dread - d2/d32/d5c/d76/dd7/d2c/d6b/d108/fc7 zero size
2026-03-10T06:23:15.565 INFO:tasks.workunit.client.0.vm04.stdout:0/903: dwrite d0/d1a/fe5 [0,4194304] 0
2026-03-10T06:23:15.590 INFO:tasks.workunit.client.0.vm04.stdout:0/904: creat d0/d5/d97/dc0/dd8/dff/d59/f132 x:0 0 0
2026-03-10T06:23:15.592 INFO:tasks.workunit.client.0.vm04.stdout:0/905: mknod d0/d5/d97/dc0/dd8/dff/d59/c133 0
2026-03-10T06:23:15.595 INFO:tasks.workunit.client.0.vm04.stdout:0/906: dwrite d0/d1a/f101 [0,4194304] 0
2026-03-10T06:23:15.597 INFO:tasks.workunit.client.0.vm04.stdout:0/907: dread - d0/d5/d97/dc0/dd8/dff/d59/f9f zero size
2026-03-10T06:23:15.605 INFO:tasks.workunit.client.0.vm04.stdout:7/810: dread d4/df/d12/d13/d25/d28/fc0 [0,4194304] 0
2026-03-10T06:23:15.610 INFO:tasks.workunit.client.0.vm04.stdout:0/908: dwrite d0/f12f [0,4194304] 0
2026-03-10T06:23:15.611 INFO:tasks.workunit.client.0.vm04.stdout:0/909: mkdir d0/d5/d97/dc0/dd8/dff/d9c/d134 0
2026-03-10T06:23:15.618 INFO:tasks.workunit.client.0.vm04.stdout:0/910: write d0/d5/d25/dd/d5c/f9a [435735,9100] 0
2026-03-10T06:23:15.619 INFO:tasks.workunit.client.0.vm04.stdout:7/811: dread d4/df/d12/d13/db3/d110/f67 [4194304,4194304] 0
2026-03-10T06:23:15.625 INFO:tasks.workunit.client.0.vm04.stdout:6/854: dread d2/d37/d6e/f82 [0,4194304] 0
2026-03-10T06:23:15.627 INFO:tasks.workunit.client.0.vm04.stdout:6/855: creat d2/d37/d83/dc1/df9/f119 x:0 0 0
2026-03-10T06:23:15.628 INFO:tasks.workunit.client.0.vm04.stdout:0/911: dwrite d0/d1a/d20/df5/d79/d111/f113 [0,4194304] 0
2026-03-10T06:23:15.628 INFO:tasks.workunit.client.0.vm04.stdout:6/856: stat d2/d3a/f111 0
2026-03-10T06:23:15.629 INFO:tasks.workunit.client.0.vm04.stdout:0/912: readlink d0/d1a/l28 0
2026-03-10T06:23:15.631 INFO:tasks.workunit.client.0.vm04.stdout:6/857: fsync d2/d43/d2d/d30/d1f/fd8 0
2026-03-10T06:23:15.633 INFO:tasks.workunit.client.0.vm04.stdout:6/858: fsync d2/d43/d2d/d30/d1f/f87 0
2026-03-10T06:23:15.648 INFO:tasks.workunit.client.0.vm04.stdout:6/859: dwrite d2/d3a/d5e/db5/f101 [0,4194304] 0
2026-03-10T06:23:15.654 INFO:tasks.workunit.client.0.vm04.stdout:6/860: mknod d2/d37/d6e/c11a 0
2026-03-10T06:23:15.661 INFO:tasks.workunit.client.0.vm04.stdout:6/861: mkdir d2/d37/d83/d11b 0
2026-03-10T06:23:15.675 INFO:tasks.workunit.client.0.vm04.stdout:3/824: rename d4/da/df/d11/c56 to d4/da/df/d11/d5a/db3/d108/c110 0
2026-03-10T06:23:15.676 INFO:tasks.workunit.client.0.vm04.stdout:2/827: write d1/dae/d11/d14/d4e/f5c [288254,11157] 0
2026-03-10T06:23:15.676 INFO:tasks.workunit.client.0.vm04.stdout:7/812: symlink d4/df/d12/d13/l124 0
2026-03-10T06:23:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:15 vm04.local ceph-mon[51058]: pgmap v36: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 84 MiB/s wr, 196 op/s
2026-03-10T06:23:15.682 INFO:tasks.workunit.client.0.vm04.stdout:2/828: rmdir d1/dae/d11/d14/d4e 39
2026-03-10T06:23:15.682 INFO:tasks.workunit.client.0.vm04.stdout:6/862: mkdir d2/d43/d2d/d30/d1f/d11c 0
2026-03-10T06:23:15.682 INFO:tasks.workunit.client.0.vm04.stdout:3/825: creat d4/deb/f111 x:0 0 0
2026-03-10T06:23:15.683 INFO:tasks.workunit.client.0.vm04.stdout:2/829: read d1/db/d69/d74/d87/dcf/fc5 [126220,115995] 0
2026-03-10T06:23:15.688 INFO:tasks.workunit.client.0.vm04.stdout:6/863: fdatasync d2/d3a/d9c/fba 0
2026-03-10T06:23:15.694 INFO:tasks.workunit.client.0.vm04.stdout:7/813: mknod
d4/df/d12/d13/d25/d30/c125 0 2026-03-10T06:23:15.696 INFO:tasks.workunit.client.0.vm04.stdout:5/826: write d4/d11/d7d/f36 [6530863,58813] 0 2026-03-10T06:23:15.701 INFO:tasks.workunit.client.0.vm04.stdout:5/827: stat d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/fd7 0 2026-03-10T06:23:15.704 INFO:tasks.workunit.client.0.vm04.stdout:5/828: symlink d4/d11/d7d/d52/l11f 0 2026-03-10T06:23:15.705 INFO:tasks.workunit.client.0.vm04.stdout:9/902: write d2/d3/d18/ddd/f5b [1936593,126768] 0 2026-03-10T06:23:15.709 INFO:tasks.workunit.client.0.vm04.stdout:5/829: rmdir d4/d11/d7d/d38/d91/d4c/d98/dc0/dde 39 2026-03-10T06:23:15.713 INFO:tasks.workunit.client.0.vm04.stdout:9/903: link d2/d8/d53/d6e/d89/fbb d2/d8/d53/f143 0 2026-03-10T06:23:15.719 INFO:tasks.workunit.client.0.vm04.stdout:1/826: creat d0/d3/d41/f133 x:0 0 0 2026-03-10T06:23:15.719 INFO:tasks.workunit.client.0.vm04.stdout:5/830: dread d4/d6/d37/f62 [0,4194304] 0 2026-03-10T06:23:15.723 INFO:tasks.workunit.client.0.vm04.stdout:9/904: creat d2/d8/d22/f144 x:0 0 0 2026-03-10T06:23:15.725 INFO:tasks.workunit.client.0.vm04.stdout:1/827: dwrite d0/d8/fe1 [4194304,4194304] 0 2026-03-10T06:23:15.734 INFO:tasks.workunit.client.0.vm04.stdout:5/831: rename d4/d6/d80/lbc to d4/d6/d80/de5/l120 0 2026-03-10T06:23:15.735 INFO:tasks.workunit.client.0.vm04.stdout:9/905: rmdir d2/d3/d18/d39/d11/da5 39 2026-03-10T06:23:15.736 INFO:tasks.workunit.client.0.vm04.stdout:1/828: mkdir d0/d3/d41/d134 0 2026-03-10T06:23:15.737 INFO:tasks.workunit.client.0.vm04.stdout:5/832: creat d4/d3b/f121 x:0 0 0 2026-03-10T06:23:15.738 INFO:tasks.workunit.client.0.vm04.stdout:9/906: truncate d2/d8/d22/d4f/ff1 402439 0 2026-03-10T06:23:15.738 INFO:tasks.workunit.client.0.vm04.stdout:1/829: dread d0/d3/d41/d99/def/f118 [0,4194304] 0 2026-03-10T06:23:15.742 INFO:tasks.workunit.client.0.vm04.stdout:5/833: rmdir d4/d6/d80/d84 39 2026-03-10T06:23:15.742 INFO:tasks.workunit.client.0.vm04.stdout:1/830: fdatasync d0/d8/d46/db3/dd2/d100/f113 0 2026-03-10T06:23:15.743 
INFO:tasks.workunit.client.0.vm04.stdout:8/861: write df/fcd [1086880,58742] 0 2026-03-10T06:23:15.743 INFO:tasks.workunit.client.0.vm04.stdout:5/834: dread - d4/d11/d7d/d38/d91/d55/d72/f113 zero size 2026-03-10T06:23:15.744 INFO:tasks.workunit.client.0.vm04.stdout:8/862: chown df/d15/d2b/d81/d9a/dbe 1 1 2026-03-10T06:23:15.749 INFO:tasks.workunit.client.0.vm04.stdout:5/835: write d4/d11/d7d/d38/d91/dda/fe4 [1383560,40241] 0 2026-03-10T06:23:15.757 INFO:tasks.workunit.client.0.vm04.stdout:0/913: dwrite d0/d5/fb [0,4194304] 0 2026-03-10T06:23:15.763 INFO:tasks.workunit.client.0.vm04.stdout:4/860: dwrite d2/fce [0,4194304] 0 2026-03-10T06:23:15.766 INFO:tasks.workunit.client.0.vm04.stdout:0/914: truncate d0/d5/d25/dd/d92/f121 410758 0 2026-03-10T06:23:15.768 INFO:tasks.workunit.client.0.vm04.stdout:1/831: dwrite d0/d8/ff4 [0,4194304] 0 2026-03-10T06:23:15.768 INFO:tasks.workunit.client.0.vm04.stdout:5/836: dwrite d4/d11/d7d/dae/df3/f11e [0,4194304] 0 2026-03-10T06:23:15.768 INFO:tasks.workunit.client.0.vm04.stdout:8/863: creat df/d15/d2b/d81/d9a/dbe/df0/f115 x:0 0 0 2026-03-10T06:23:15.776 INFO:tasks.workunit.client.0.vm04.stdout:9/907: link d2/d3/d18/d39/d46/l12d d2/d3/df4/l145 0 2026-03-10T06:23:15.777 INFO:tasks.workunit.client.0.vm04.stdout:9/908: stat d2/d3/d18/de9/d116/f137 0 2026-03-10T06:23:15.778 INFO:tasks.workunit.client.0.vm04.stdout:9/909: truncate d2/d3/d18/d39/d46/d55/dc3/f142 1043585 0 2026-03-10T06:23:15.799 INFO:tasks.workunit.client.0.vm04.stdout:4/861: rename d2/d46/fa5 to d2/d32/d5c/d76/dd7/d56/f119 0 2026-03-10T06:23:15.802 INFO:tasks.workunit.client.0.vm04.stdout:0/915: truncate d0/d5/d97/fab 690807 0 2026-03-10T06:23:15.803 INFO:tasks.workunit.client.0.vm04.stdout:0/916: fdatasync d0/d5/d25/dd/d5c/f9a 0 2026-03-10T06:23:15.804 INFO:tasks.workunit.client.0.vm04.stdout:1/832: creat d0/d3/d41/dcb/f135 x:0 0 0 2026-03-10T06:23:15.805 INFO:tasks.workunit.client.0.vm04.stdout:8/864: creat df/d15/d29/da3/f116 x:0 0 0 2026-03-10T06:23:15.808 
INFO:tasks.workunit.client.0.vm04.stdout:5/837: fsync d4/d11/d7d/d38/d91/d55/fba 0 2026-03-10T06:23:15.812 INFO:tasks.workunit.client.0.vm04.stdout:9/910: creat d2/d3/d18/de9/d5a/f146 x:0 0 0 2026-03-10T06:23:15.813 INFO:tasks.workunit.client.0.vm04.stdout:5/838: write d4/d6/d50/f61 [5559534,73703] 0 2026-03-10T06:23:15.813 INFO:tasks.workunit.client.0.vm04.stdout:3/826: dwrite d4/da/df/d11/d5a/d5b/ddf/f77 [4194304,4194304] 0 2026-03-10T06:23:15.813 INFO:tasks.workunit.client.0.vm04.stdout:3/827: dread - d4/d6/d92/def/f103 zero size 2026-03-10T06:23:15.814 INFO:tasks.workunit.client.0.vm04.stdout:0/917: mkdir d0/d5/d97/d10a/d135 0 2026-03-10T06:23:15.821 INFO:tasks.workunit.client.0.vm04.stdout:6/864: truncate d2/d43/d2d/d30/d34/d76/d8a/fab 1439174 0 2026-03-10T06:23:15.821 INFO:tasks.workunit.client.0.vm04.stdout:6/865: fsync d2/fa0 0 2026-03-10T06:23:15.821 INFO:tasks.workunit.client.0.vm04.stdout:5/839: write d4/d11/f18 [3902742,116864] 0 2026-03-10T06:23:15.824 INFO:tasks.workunit.client.0.vm04.stdout:7/814: dwrite d4/df/d12/d13/d8b/fdd [0,4194304] 0 2026-03-10T06:23:15.828 INFO:tasks.workunit.client.0.vm04.stdout:1/833: mknod d0/d3/d41/d99/c136 0 2026-03-10T06:23:15.828 INFO:tasks.workunit.client.0.vm04.stdout:5/840: fsync d4/d6/f8 0 2026-03-10T06:23:15.829 INFO:tasks.workunit.client.0.vm04.stdout:2/830: dwrite d1/dae/f63 [4194304,4194304] 0 2026-03-10T06:23:15.833 INFO:tasks.workunit.client.0.vm04.stdout:7/815: chown d4/df/d12/d13/d25/d30/d40/d108 750 1 2026-03-10T06:23:15.836 INFO:tasks.workunit.client.0.vm04.stdout:7/816: chown d4/df/d12/dd4/f11d 223074401 1 2026-03-10T06:23:15.844 INFO:tasks.workunit.client.0.vm04.stdout:0/918: creat d0/d5/d97/dc0/dd8/dff/d59/f136 x:0 0 0 2026-03-10T06:23:15.846 INFO:tasks.workunit.client.0.vm04.stdout:0/919: fsync d0/d1a/d20/df5/f110 0 2026-03-10T06:23:15.848 INFO:tasks.workunit.client.0.vm04.stdout:6/866: creat d2/d3a/ded/f11d x:0 0 0 2026-03-10T06:23:15.861 INFO:tasks.workunit.client.0.vm04.stdout:2/831: mkdir 
d1/dae/d2c/d37/d40/dfa 0 2026-03-10T06:23:15.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:15 vm06.local ceph-mon[58974]: pgmap v36: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 31 MiB/s rd, 84 MiB/s wr, 196 op/s 2026-03-10T06:23:15.867 INFO:tasks.workunit.client.0.vm04.stdout:3/828: symlink d4/da/df/l112 0 2026-03-10T06:23:15.871 INFO:tasks.workunit.client.0.vm04.stdout:7/817: fsync d4/df/d12/d13/d25/d28/fc0 0 2026-03-10T06:23:15.873 INFO:tasks.workunit.client.0.vm04.stdout:1/834: dread d0/d3/fd4 [0,4194304] 0 2026-03-10T06:23:15.876 INFO:tasks.workunit.client.0.vm04.stdout:4/862: write d2/d32/d5c/d76/dd7/d56/f109 [467688,39672] 0 2026-03-10T06:23:15.877 INFO:tasks.workunit.client.0.vm04.stdout:8/865: write df/d15/d29/da3/db8/dc1/f90 [1215279,3611] 0 2026-03-10T06:23:15.878 INFO:tasks.workunit.client.0.vm04.stdout:8/866: chown df/d20/d25/d30/f4e 1045 1 2026-03-10T06:23:15.879 INFO:tasks.workunit.client.0.vm04.stdout:9/911: dwrite d2/d3/d18/f8f [4194304,4194304] 0 2026-03-10T06:23:15.896 INFO:tasks.workunit.client.0.vm04.stdout:5/841: creat d4/d6/d80/de5/d114/f122 x:0 0 0 2026-03-10T06:23:15.903 INFO:tasks.workunit.client.0.vm04.stdout:7/818: creat d4/df/d12/dd4/f126 x:0 0 0 2026-03-10T06:23:15.903 INFO:tasks.workunit.client.0.vm04.stdout:6/867: fsync d2/d43/d2d/d30/d34/d76/d7e/ddc/f10f 0 2026-03-10T06:23:15.904 INFO:tasks.workunit.client.0.vm04.stdout:6/868: readlink d2/d37/l41 0 2026-03-10T06:23:15.909 INFO:tasks.workunit.client.0.vm04.stdout:7/819: dread d4/df/d12/d13/fac [0,4194304] 0 2026-03-10T06:23:15.910 INFO:tasks.workunit.client.0.vm04.stdout:0/920: creat d0/d5/d97/dc0/dd8/dff/d9c/d134/f137 x:0 0 0 2026-03-10T06:23:15.918 INFO:tasks.workunit.client.0.vm04.stdout:9/912: creat d2/d3/d18/ddd/f147 x:0 0 0 2026-03-10T06:23:15.928 INFO:tasks.workunit.client.0.vm04.stdout:6/869: mknod d2/d43/d2d/d7c/daa/c11e 0 2026-03-10T06:23:15.930 INFO:tasks.workunit.client.0.vm04.stdout:6/870: write d2/d3a/f111 [680291,49176] 0 
2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:6/871: write d2/d3a/d5e/ffe [654677,29773] 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:5/842: dread d4/d6/d37/f7e [0,4194304] 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:9/913: creat d2/d3/d18/d39/d46/d84/f148 x:0 0 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:2/832: link d1/db/d69/d74/d87/ld2 d1/dae/dd6/lfb 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:3/829: link d4/d6/d91/da1/l10f d4/d6/d38/dcc/l113 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:6/872: symlink d2/d43/d9b/l11f 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:6/873: write d2/d37/d83/f8e [1832842,93193] 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:8/867: rename df/d15/d2b/d81/d9a/fd2 to df/d15/f117 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:8/868: write df/d15/d29/da3/faa [3223136,129965] 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:8/869: chown df/d15/d29/d89/fe6 230 1 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:8/870: write df/d15/d29/da3/faa [1743000,58148] 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:0/921: creat d0/d5/d97/dc0/dd8/f138 x:0 0 0 2026-03-10T06:23:15.957 INFO:tasks.workunit.client.0.vm04.stdout:0/922: fdatasync d0/d1a/d20/dc2/d12d/f115 0 2026-03-10T06:23:15.967 INFO:tasks.workunit.client.0.vm04.stdout:2/833: dread d1/dae/d11/d14/d9f/ddb/f7a [0,4194304] 0 2026-03-10T06:23:15.967 INFO:tasks.workunit.client.0.vm04.stdout:4/863: sync 2026-03-10T06:23:15.968 INFO:tasks.workunit.client.0.vm04.stdout:6/874: sync 2026-03-10T06:23:15.978 INFO:tasks.workunit.client.0.vm04.stdout:9/914: dread d2/de0/f40 [0,4194304] 0 2026-03-10T06:23:15.980 INFO:tasks.workunit.client.0.vm04.stdout:1/835: getdents d0/d3/d41/dcb/dee 0 2026-03-10T06:23:15.990 INFO:tasks.workunit.client.0.vm04.stdout:7/820: getdents 
d4/df/d12/dd4 0 2026-03-10T06:23:15.995 INFO:tasks.workunit.client.0.vm04.stdout:8/871: mknod df/d20/d25/d30/d55/c118 0 2026-03-10T06:23:15.998 INFO:tasks.workunit.client.0.vm04.stdout:0/923: rename d0/d5/d25/dd/d5c/d73 to d0/d1a/d20/df5/d47/ddd/d103/d139 0 2026-03-10T06:23:16.005 INFO:tasks.workunit.client.0.vm04.stdout:0/924: dwrite d0/d1a/fe5 [4194304,4194304] 0 2026-03-10T06:23:16.013 INFO:tasks.workunit.client.0.vm04.stdout:3/830: write d4/da/df/d11/d50/fa9 [1847485,44852] 0 2026-03-10T06:23:16.015 INFO:tasks.workunit.client.0.vm04.stdout:4/864: symlink d2/d32/d10b/dc8/l11a 0 2026-03-10T06:23:16.017 INFO:tasks.workunit.client.0.vm04.stdout:0/925: fdatasync d0/d1a/d20/df5/f110 0 2026-03-10T06:23:16.020 INFO:tasks.workunit.client.0.vm04.stdout:5/843: creat d4/d11/d7d/d38/d91/d4c/d98/dc0/f123 x:0 0 0 2026-03-10T06:23:16.021 INFO:tasks.workunit.client.0.vm04.stdout:5/844: write d4/d11/f1f [7367503,77782] 0 2026-03-10T06:23:16.026 INFO:tasks.workunit.client.0.vm04.stdout:9/915: creat d2/d3/d18/de9/d5a/f149 x:0 0 0 2026-03-10T06:23:16.028 INFO:tasks.workunit.client.0.vm04.stdout:8/872: truncate df/d15/f69 3406467 0 2026-03-10T06:23:16.028 INFO:tasks.workunit.client.0.vm04.stdout:1/836: read d0/d3/f50 [145327,97835] 0 2026-03-10T06:23:16.029 INFO:tasks.workunit.client.0.vm04.stdout:5/845: dwrite d4/d6/d37/f39 [4194304,4194304] 0 2026-03-10T06:23:16.053 INFO:tasks.workunit.client.0.vm04.stdout:8/873: dread df/d15/d29/da3/db8/dc1/d97/fb1 [0,4194304] 0 2026-03-10T06:23:16.079 INFO:tasks.workunit.client.0.vm04.stdout:2/834: write d1/f10 [3621598,31302] 0 2026-03-10T06:23:16.088 INFO:tasks.workunit.client.0.vm04.stdout:6/875: dwrite d2/d3a/d9c/fba [0,4194304] 0 2026-03-10T06:23:16.099 INFO:tasks.workunit.client.0.vm04.stdout:9/916: symlink d2/de0/d1d/d64/l14a 0 2026-03-10T06:23:16.100 INFO:tasks.workunit.client.0.vm04.stdout:6/876: sync 2026-03-10T06:23:16.103 INFO:tasks.workunit.client.0.vm04.stdout:6/877: chown d2/d8/c71 196596558 1 2026-03-10T06:23:16.103 
INFO:tasks.workunit.client.0.vm04.stdout:6/878: readlink d2/d43/lc 0 2026-03-10T06:23:16.116 INFO:tasks.workunit.client.0.vm04.stdout:4/865: dwrite d2/d32/d5c/d76/dd7/f9d [0,4194304] 0 2026-03-10T06:23:16.133 INFO:tasks.workunit.client.0.vm04.stdout:3/831: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104/f114 x:0 0 0 2026-03-10T06:23:16.138 INFO:tasks.workunit.client.0.vm04.stdout:8/874: symlink df/d15/d29/da3/db8/dc1/d97/d67/l119 0 2026-03-10T06:23:16.141 INFO:tasks.workunit.client.0.vm04.stdout:5/846: write d4/f69 [164995,124154] 0 2026-03-10T06:23:16.152 INFO:tasks.workunit.client.0.vm04.stdout:9/917: mkdir d2/d3/d18/de9/d116/d11d/d14b 0 2026-03-10T06:23:16.152 INFO:tasks.workunit.client.0.vm04.stdout:9/918: chown d2/d3/d18/de9/da2/c115 278119 1 2026-03-10T06:23:16.153 INFO:tasks.workunit.client.0.vm04.stdout:9/919: dread d2/d8/d3a/dcb/fe6 [0,4194304] 0 2026-03-10T06:23:16.154 INFO:tasks.workunit.client.0.vm04.stdout:6/879: dread - d2/d43/d2d/d7c/f8b zero size 2026-03-10T06:23:16.165 INFO:tasks.workunit.client.0.vm04.stdout:1/837: write d0/d3/d41/d4b/d5b/fd9 [701427,127847] 0 2026-03-10T06:23:16.172 INFO:tasks.workunit.client.0.vm04.stdout:4/866: unlink d2/d32/d5c/l10f 0 2026-03-10T06:23:16.201 INFO:tasks.workunit.client.0.vm04.stdout:2/835: mkdir d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc 0 2026-03-10T06:23:16.212 INFO:tasks.workunit.client.0.vm04.stdout:0/926: link d0/d1a/d20/df5/d47/d8a/d8d/f127 d0/d1a/f13a 0 2026-03-10T06:23:16.212 INFO:tasks.workunit.client.0.vm04.stdout:0/927: chown d0/d1a/d20/dc2/ff4 46417 1 2026-03-10T06:23:16.218 INFO:tasks.workunit.client.0.vm04.stdout:7/821: link d4/df/d12/d21/c32 d4/c127 0 2026-03-10T06:23:16.221 INFO:tasks.workunit.client.0.vm04.stdout:1/838: fsync d0/d3/d41/d99/fe2 0 2026-03-10T06:23:16.222 INFO:tasks.workunit.client.0.vm04.stdout:1/839: read d0/d3/d41/dcb/f12e [1925008,125988] 0 2026-03-10T06:23:16.234 INFO:tasks.workunit.client.0.vm04.stdout:0/928: creat d0/f13b x:0 0 0 2026-03-10T06:23:16.240 
INFO:tasks.workunit.client.0.vm04.stdout:9/920: truncate d2/de0/f3c 571373 0 2026-03-10T06:23:16.244 INFO:tasks.workunit.client.0.vm04.stdout:9/921: dwrite d2/d3/d18/d39/d46/d55/dc3/f125 [4194304,4194304] 0 2026-03-10T06:23:16.254 INFO:tasks.workunit.client.0.vm04.stdout:4/867: mknod d2/d32/d5c/d76/dd7/c11b 0 2026-03-10T06:23:16.256 INFO:tasks.workunit.client.0.vm04.stdout:3/832: rmdir d4/da/df/d11/d62 0 2026-03-10T06:23:16.264 INFO:tasks.workunit.client.0.vm04.stdout:5/847: rename d4/d11/d7d/d38/d91/d4c/fa3 to d4/d6/f124 0 2026-03-10T06:23:16.290 INFO:tasks.workunit.client.0.vm04.stdout:9/922: write d2/d8/f99 [1619928,60273] 0 2026-03-10T06:23:16.296 INFO:tasks.workunit.client.0.vm04.stdout:1/840: creat d0/d8/d46/d7a/d95/dc5/d121/f137 x:0 0 0 2026-03-10T06:23:16.298 INFO:tasks.workunit.client.0.vm04.stdout:4/868: mkdir d2/d32/d5c/d76/dd7/d2c/d6b/d108/d11c 0 2026-03-10T06:23:16.303 INFO:tasks.workunit.client.0.vm04.stdout:3/833: readlink d4/da/df/l90 0 2026-03-10T06:23:16.303 INFO:tasks.workunit.client.0.vm04.stdout:8/875: getdents df/d15/d29/da3/db8/dc1/dac 0 2026-03-10T06:23:16.307 INFO:tasks.workunit.client.0.vm04.stdout:3/834: dwrite d4/d6/dc/f22 [4194304,4194304] 0 2026-03-10T06:23:16.312 INFO:tasks.workunit.client.0.vm04.stdout:3/835: readlink d4/da/df/d11/d5a/d5b/ddf/dbd/l100 0 2026-03-10T06:23:16.322 INFO:tasks.workunit.client.0.vm04.stdout:0/929: creat d0/d5/d97/d10a/d135/f13c x:0 0 0 2026-03-10T06:23:16.323 INFO:tasks.workunit.client.0.vm04.stdout:6/880: getdents d2/d3a/d5e/db5/dd4 0 2026-03-10T06:23:16.324 INFO:tasks.workunit.client.0.vm04.stdout:6/881: dread d2/d43/d2d/d30/f32 [0,4194304] 0 2026-03-10T06:23:16.328 INFO:tasks.workunit.client.0.vm04.stdout:7/822: creat d4/df/f128 x:0 0 0 2026-03-10T06:23:16.337 INFO:tasks.workunit.client.0.vm04.stdout:0/930: dread d0/d5/d25/dd/d5c/f7a [0,4194304] 0 2026-03-10T06:23:16.337 INFO:tasks.workunit.client.0.vm04.stdout:0/931: chown d0/d5/d97/dc0/dd8 9477237 1 2026-03-10T06:23:16.338 
INFO:tasks.workunit.client.0.vm04.stdout:0/932: write d0/d5/fb [1216596,19176] 0 2026-03-10T06:23:16.345 INFO:tasks.workunit.client.0.vm04.stdout:9/923: unlink d2/de0/d1d/d64/l14a 0 2026-03-10T06:23:16.396 INFO:tasks.workunit.client.0.vm04.stdout:1/841: rmdir d0/d8/d46/d7a/d95/dc5/dcc 39 2026-03-10T06:23:16.397 INFO:tasks.workunit.client.0.vm04.stdout:4/869: truncate d2/d32/d10b/f64 1040803 0 2026-03-10T06:23:16.399 INFO:tasks.workunit.client.0.vm04.stdout:8/876: read df/f1d [3500452,75688] 0 2026-03-10T06:23:16.399 INFO:tasks.workunit.client.0.vm04.stdout:3/836: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/d115 0 2026-03-10T06:23:16.403 INFO:tasks.workunit.client.0.vm04.stdout:8/877: read df/d20/d25/f39 [179704,61373] 0 2026-03-10T06:23:16.403 INFO:tasks.workunit.client.0.vm04.stdout:2/836: getdents d1/dae/d2c 0 2026-03-10T06:23:16.408 INFO:tasks.workunit.client.0.vm04.stdout:6/882: dwrite d2/d43/d2d/d30/d34/f4d [0,4194304] 0 2026-03-10T06:23:16.412 INFO:tasks.workunit.client.0.vm04.stdout:6/883: chown d2/d3a/f109 3279485 1 2026-03-10T06:23:16.415 INFO:tasks.workunit.client.0.vm04.stdout:9/924: chown d2/d3/d18/d39/d11/da5/df5 13 1 2026-03-10T06:23:16.424 INFO:tasks.workunit.client.0.vm04.stdout:3/837: rmdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104 39 2026-03-10T06:23:16.436 INFO:tasks.workunit.client.0.vm04.stdout:8/878: rename df/d15/d29/da3/db8/dc1/d97/d67/l10e to df/d20/d25/d30/d65/d8f/l11a 0 2026-03-10T06:23:16.440 INFO:tasks.workunit.client.0.vm04.stdout:5/848: truncate d4/f13 3810994 0 2026-03-10T06:23:16.441 INFO:tasks.workunit.client.0.vm04.stdout:5/849: stat d4/cb5 0 2026-03-10T06:23:16.443 INFO:tasks.workunit.client.0.vm04.stdout:1/842: fsync d0/d8/d46/d7a/d95/dc5/dcc/f104 0 2026-03-10T06:23:16.444 INFO:tasks.workunit.client.0.vm04.stdout:9/925: mknod d2/d3/d18/d39/d11/da5/df5/d130/c14c 0 2026-03-10T06:23:16.446 INFO:tasks.workunit.client.0.vm04.stdout:9/926: chown d2/d3/d18/de9/d5a/l114 1624 1 2026-03-10T06:23:16.448 
INFO:tasks.workunit.client.0.vm04.stdout:2/837: dread d1/dae/d2c/d37/d59/f8b [0,4194304] 0 2026-03-10T06:23:16.449 INFO:tasks.workunit.client.0.vm04.stdout:2/838: chown d1/dae/d11/f16 1566421 1 2026-03-10T06:23:16.455 INFO:tasks.workunit.client.0.vm04.stdout:7/823: write d4/df/d12/d13/d25/d30/d40/d50/f5b [697192,115942] 0 2026-03-10T06:23:16.458 INFO:tasks.workunit.client.0.vm04.stdout:4/870: dwrite d2/d32/d5c/f6d [4194304,4194304] 0 2026-03-10T06:23:16.460 INFO:tasks.workunit.client.0.vm04.stdout:2/839: dwrite d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/ff5 [0,4194304] 0 2026-03-10T06:23:16.471 INFO:tasks.workunit.client.0.vm04.stdout:2/840: dread d1/dae/d11/d14/d9f/ddb/f7a [0,4194304] 0 2026-03-10T06:23:16.479 INFO:tasks.workunit.client.0.vm04.stdout:3/838: write d4/da/df/d11/d5a/d5b/ddf/d21/ff0 [789112,24037] 0 2026-03-10T06:23:16.484 INFO:tasks.workunit.client.0.vm04.stdout:0/933: creat d0/d1a/d20/f13d x:0 0 0 2026-03-10T06:23:16.492 INFO:tasks.workunit.client.0.vm04.stdout:1/843: dread d0/d3/d41/d99/def/f118 [0,4194304] 0 2026-03-10T06:23:16.496 INFO:tasks.workunit.client.0.vm04.stdout:0/934: truncate d0/d1a/d20/df5/d47/ddd/d103/d139/f4f 1315238 0 2026-03-10T06:23:16.502 INFO:tasks.workunit.client.0.vm04.stdout:4/871: symlink d2/d32/d5c/de2/d102/l11d 0 2026-03-10T06:23:16.503 INFO:tasks.workunit.client.0.vm04.stdout:7/824: mkdir d4/df/d12/d13/d25/d28/d3a/d129 0 2026-03-10T06:23:16.505 INFO:tasks.workunit.client.0.vm04.stdout:7/825: truncate d4/df/d12/d13/d8b/fdd 4273021 0 2026-03-10T06:23:16.513 INFO:tasks.workunit.client.0.vm04.stdout:6/884: getdents d2/d37/d6e/de6 0 2026-03-10T06:23:16.517 INFO:tasks.workunit.client.0.vm04.stdout:0/935: creat d0/d1a/d20/df5/d47/d8a/f13e x:0 0 0 2026-03-10T06:23:16.519 INFO:tasks.workunit.client.0.vm04.stdout:9/927: creat d2/d3/d18/de9/f14d x:0 0 0 2026-03-10T06:23:16.522 INFO:tasks.workunit.client.0.vm04.stdout:4/872: creat d2/d32/d5c/d76/dd7/da3/f11e x:0 0 0 2026-03-10T06:23:16.524 INFO:tasks.workunit.client.0.vm04.stdout:7/826: 
rename d4/df/d12/d13/d8b/fdd to d4/df/d12/d13/d25/d28/d3a/d129/f12a 0 2026-03-10T06:23:16.532 INFO:tasks.workunit.client.0.vm04.stdout:8/879: dwrite df/d15/d2b/f4c [0,4194304] 0 2026-03-10T06:23:16.532 INFO:tasks.workunit.client.0.vm04.stdout:0/936: mkdir d0/d1a/d20/df5/d47/d11e/d13f 0 2026-03-10T06:23:16.534 INFO:tasks.workunit.client.0.vm04.stdout:0/937: dread - d0/d5/d97/dc0/dd8/dff/d59/f9f zero size 2026-03-10T06:23:16.545 INFO:tasks.workunit.client.0.vm04.stdout:1/844: sync 2026-03-10T06:23:16.549 INFO:tasks.workunit.client.0.vm04.stdout:3/839: dwrite d4/d6/d38/f78 [0,4194304] 0 2026-03-10T06:23:16.551 INFO:tasks.workunit.client.0.vm04.stdout:9/928: creat d2/d8/d53/d6e/d12b/f14e x:0 0 0 2026-03-10T06:23:16.555 INFO:tasks.workunit.client.0.vm04.stdout:9/929: chown d2/de0/da3/cb0 727 1 2026-03-10T06:23:16.555 INFO:tasks.workunit.client.0.vm04.stdout:2/841: write d1/db/fe [332947,36089] 0 2026-03-10T06:23:16.567 INFO:tasks.workunit.client.0.vm04.stdout:5/850: dread d4/d11/f1f [0,4194304] 0 2026-03-10T06:23:16.571 INFO:tasks.workunit.client.0.vm04.stdout:4/873: fdatasync d2/d46/f18 0 2026-03-10T06:23:16.572 INFO:tasks.workunit.client.0.vm04.stdout:6/885: rename d2/d3a/f109 to d2/d43/d2d/d30/d1f/d3c/d85/f120 0 2026-03-10T06:23:16.573 INFO:tasks.workunit.client.0.vm04.stdout:2/842: dread d1/db/d69/d74/d87/dcf/d8f/d48/f62 [0,4194304] 0 2026-03-10T06:23:16.575 INFO:tasks.workunit.client.0.vm04.stdout:7/827: write d4/df/d12/d13/d25/d28/d3a/d129/f12a [4636076,38007] 0 2026-03-10T06:23:16.582 INFO:tasks.workunit.client.0.vm04.stdout:8/880: write df/d20/d25/d30/d55/f95 [1589053,312] 0 2026-03-10T06:23:16.586 INFO:tasks.workunit.client.0.vm04.stdout:1/845: chown d0/d8/f69 5219 1 2026-03-10T06:23:16.591 INFO:tasks.workunit.client.0.vm04.stdout:9/930: fsync d2/d3/d18/d39/d46/f110 0 2026-03-10T06:23:16.596 INFO:tasks.workunit.client.0.vm04.stdout:5/851: symlink d4/d11/d7d/d38/d91/d55/db1/l125 0 2026-03-10T06:23:16.598 INFO:tasks.workunit.client.0.vm04.stdout:4/874: rename 
d2/d32/d5c/d76/c104 to d2/d32/d5c/d76/dd7/da3/c11f 0 2026-03-10T06:23:16.600 INFO:tasks.workunit.client.0.vm04.stdout:4/875: read d2/d32/d10b/dc8/d100/fe7 [2888206,59395] 0 2026-03-10T06:23:16.601 INFO:tasks.workunit.client.0.vm04.stdout:2/843: creat d1/dae/d2c/d37/ffd x:0 0 0 2026-03-10T06:23:16.604 INFO:tasks.workunit.client.0.vm04.stdout:7/828: mkdir d4/df/d12/dd4/d12b 0 2026-03-10T06:23:16.606 INFO:tasks.workunit.client.0.vm04.stdout:0/938: symlink d0/d5/d25/dd/l140 0 2026-03-10T06:23:16.607 INFO:tasks.workunit.client.0.vm04.stdout:0/939: write d0/d5/d97/dc0/fc8 [5172744,92913] 0 2026-03-10T06:23:16.611 INFO:tasks.workunit.client.0.vm04.stdout:1/846: creat d0/d8/d46/d7a/d95/df3/f138 x:0 0 0 2026-03-10T06:23:16.611 INFO:tasks.workunit.client.0.vm04.stdout:8/881: fsync df/d15/d29/da3/db8/dc1/d97/fb1 0 2026-03-10T06:23:16.612 INFO:tasks.workunit.client.0.vm04.stdout:4/876: sync 2026-03-10T06:23:16.616 INFO:tasks.workunit.client.0.vm04.stdout:3/840: getdents d4/da/df/d11/d5a/d5b/ddf/dbd/df4 0 2026-03-10T06:23:16.625 INFO:tasks.workunit.client.0.vm04.stdout:2/844: dread - d1/db/d69/d74/d87/dcf/d8f/ddc/fd1 zero size 2026-03-10T06:23:16.626 INFO:tasks.workunit.client.0.vm04.stdout:2/845: dread - d1/db/d69/d74/d87/dcf/d8f/d35/d54/fe4 zero size 2026-03-10T06:23:16.643 INFO:tasks.workunit.client.0.vm04.stdout:1/847: creat d0/d3/d41/dcb/f139 x:0 0 0 2026-03-10T06:23:16.649 INFO:tasks.workunit.client.0.vm04.stdout:5/852: symlink d4/d11/d7d/d38/d91/l126 0 2026-03-10T06:23:16.651 INFO:tasks.workunit.client.0.vm04.stdout:4/877: dwrite d2/d32/d5c/d76/dd7/d2c/d6b/d108/f10a [0,4194304] 0 2026-03-10T06:23:16.654 INFO:tasks.workunit.client.0.vm04.stdout:8/882: write df/d20/d25/d30/d65/f9f [2041862,114719] 0 2026-03-10T06:23:16.663 INFO:tasks.workunit.client.0.vm04.stdout:9/931: truncate d2/d3/d18/d39/f2e 3693247 0 2026-03-10T06:23:16.663 INFO:tasks.workunit.client.0.vm04.stdout:2/846: fdatasync d1/db/d69/d74/d87/dcf/d8f/d48/d67/f92 0 2026-03-10T06:23:16.667 
INFO:tasks.workunit.client.0.vm04.stdout:2/847: read d1/dae/f24 [4110370,25884] 0 2026-03-10T06:23:16.667 INFO:tasks.workunit.client.0.vm04.stdout:0/940: mkdir d0/d5/d25/dd/d5c/d141 0 2026-03-10T06:23:16.675 INFO:tasks.workunit.client.0.vm04.stdout:8/883: fsync df/d20/d25/d30/d65/f82 0 2026-03-10T06:23:16.678 INFO:tasks.workunit.client.0.vm04.stdout:6/886: link d2/d43/d2d/d30/d1f/c66 d2/d8/c121 0 2026-03-10T06:23:16.683 INFO:tasks.workunit.client.0.vm04.stdout:2/848: rmdir d1/db/d69/d74/d87/dcf/d8f/ddc 39 2026-03-10T06:23:16.683 INFO:tasks.workunit.client.0.vm04.stdout:2/849: readlink d1/db/d69/d74/d87/dcf/l70 0 2026-03-10T06:23:16.684 INFO:tasks.workunit.client.0.vm04.stdout:3/841: symlink d4/da/df/dd8/l116 0 2026-03-10T06:23:16.687 INFO:tasks.workunit.client.0.vm04.stdout:4/878: creat d2/d32/d5c/d76/dd7/d2c/d9a/ded/f120 x:0 0 0 2026-03-10T06:23:16.689 INFO:tasks.workunit.client.0.vm04.stdout:4/879: truncate d2/d46/f111 89435 0 2026-03-10T06:23:16.692 INFO:tasks.workunit.client.0.vm04.stdout:5/853: dwrite d4/d11/d7d/d38/d91/d4c/d98/dc0/f70 [4194304,4194304] 0 2026-03-10T06:23:16.694 INFO:tasks.workunit.client.0.vm04.stdout:7/829: getdents d4/df/d12/d13/db3/d110/d9c 0 2026-03-10T06:23:16.694 INFO:tasks.workunit.client.0.vm04.stdout:3/842: dread d4/d6/d38/dcc/fed [0,4194304] 0 2026-03-10T06:23:16.704 INFO:tasks.workunit.client.0.vm04.stdout:9/932: rename d2/d3/d18/d39/d46/d55/c86 to d2/d3/c14f 0 2026-03-10T06:23:16.707 INFO:tasks.workunit.client.0.vm04.stdout:4/880: symlink d2/d32/d5c/d76/l121 0 2026-03-10T06:23:16.714 INFO:tasks.workunit.client.0.vm04.stdout:3/843: creat d4/da/df/d11/d5a/d5b/dff/f117 x:0 0 0 2026-03-10T06:23:16.714 INFO:tasks.workunit.client.0.vm04.stdout:6/887: link d2/d3a/d5e/db5/f101 d2/d43/d2d/d30/d34/f122 0 2026-03-10T06:23:16.714 INFO:tasks.workunit.client.0.vm04.stdout:8/884: rename df/d15/d2b/f63 to df/d15/d29/df8/d102/d105/f11b 0 2026-03-10T06:23:16.719 INFO:tasks.workunit.client.0.vm04.stdout:4/881: mkdir d2/d32/d5c/de2/d122 0 
2026-03-10T06:23:16.721 INFO:tasks.workunit.client.0.vm04.stdout:1/848: write d0/d8/d46/d7a/fce [1632463,43422] 0
2026-03-10T06:23:16.724 INFO:tasks.workunit.client.0.vm04.stdout:0/941: write d0/d5/d97/dc0/dd8/dff/f30 [1986444,130494] 0
2026-03-10T06:23:16.726 INFO:tasks.workunit.client.0.vm04.stdout:5/854: write d4/d11/d7d/d38/d91/f5e [579942,102752] 0
2026-03-10T06:23:16.737 INFO:tasks.workunit.client.0.vm04.stdout:9/933: write d2/d3/d18/de9/f83 [554258,106995] 0
2026-03-10T06:23:16.742 INFO:tasks.workunit.client.0.vm04.stdout:6/888: creat d2/d3a/d5e/db5/f123 x:0 0 0
2026-03-10T06:23:16.743 INFO:tasks.workunit.client.0.vm04.stdout:9/934: truncate d2/d3/d18/de9/d5a/f146 287574 0
2026-03-10T06:23:16.744 INFO:tasks.workunit.client.0.vm04.stdout:4/882: chown d2/d32/d10b/ld8 308443205 1
2026-03-10T06:23:16.745 INFO:tasks.workunit.client.0.vm04.stdout:5/855: creat d4/d11/d7d/dae/f127 x:0 0 0
2026-03-10T06:23:16.747 INFO:tasks.workunit.client.0.vm04.stdout:3/844: unlink d4/d6/d38/c70 0
2026-03-10T06:23:16.751 INFO:tasks.workunit.client.0.vm04.stdout:9/935: readlink d2/d3/df4/l145 0
2026-03-10T06:23:16.764 INFO:tasks.workunit.client.0.vm04.stdout:8/885: write df/d15/d29/da3/db8/dc1/d97/d67/fad [578696,8815] 0
2026-03-10T06:23:16.770 INFO:tasks.workunit.client.0.vm04.stdout:5/856: mknod d4/d6/d80/de5/d114/c128 0
2026-03-10T06:23:16.771 INFO:tasks.workunit.client.0.vm04.stdout:1/849: write d0/d8/d46/d7a/f84 [1574455,24527] 0
2026-03-10T06:23:16.777 INFO:tasks.workunit.client.0.vm04.stdout:9/936: chown d2/d23/d94/ca8 1 1
2026-03-10T06:23:16.786 INFO:tasks.workunit.client.0.vm04.stdout:8/886: rmdir df/d15 39
2026-03-10T06:23:16.787 INFO:tasks.workunit.client.0.vm04.stdout:0/942: dwrite d0/d1a/d20/df5/d47/ddd/d103/d139/d82/faf [0,4194304] 0
2026-03-10T06:23:16.795 INFO:tasks.workunit.client.0.vm04.stdout:4/883: dwrite d2/d32/d10b/d93/f9c [0,4194304] 0
2026-03-10T06:23:16.795 INFO:tasks.workunit.client.0.vm04.stdout:0/943: chown d0/d1a/d20/dc2 110 1
2026-03-10T06:23:16.803 INFO:tasks.workunit.client.0.vm04.stdout:9/937: sync
2026-03-10T06:23:16.804 INFO:tasks.workunit.client.0.vm04.stdout:2/850: rename d1/fc2 to d1/dae/d11/d14/ffe 0
2026-03-10T06:23:16.810 INFO:tasks.workunit.client.0.vm04.stdout:5/857: rmdir d4/d11/d7d/d38/d91/d4c/d98/dc0/dde 39
2026-03-10T06:23:16.812 INFO:tasks.workunit.client.0.vm04.stdout:5/858: read d4/d11/f32 [4261147,97853] 0
2026-03-10T06:23:16.814 INFO:tasks.workunit.client.0.vm04.stdout:3/845: dwrite d4/da/df/d11/d5a/d5b/ddf/f2b [0,4194304] 0
2026-03-10T06:23:16.830 INFO:tasks.workunit.client.0.vm04.stdout:7/830: rename d4/df/d12/d13/d25/d30/d40/d50/df6/d114/f11f to d4/df/d12/d13/db3/d110/d9c/db1/dc4/f12c 0
2026-03-10T06:23:16.831 INFO:tasks.workunit.client.0.vm04.stdout:6/889: link d2/d43/d2d/d30/d1f/d3c/d85/fd0 d2/d43/d2d/d30/d1f/d3c/dfa/f124 0
2026-03-10T06:23:16.832 INFO:tasks.workunit.client.0.vm04.stdout:3/846: mkdir d4/d6/d99/d10c/d118 0
2026-03-10T06:23:16.834 INFO:tasks.workunit.client.0.vm04.stdout:9/938: rename d2/d3/d18/d39/lca to d2/d3/d18/d39/d46/l150 0
2026-03-10T06:23:16.834 INFO:tasks.workunit.client.0.vm04.stdout:4/884: creat d2/d32/d5c/d76/dd7/d2c/d6b/d108/dfe/f123 x:0 0 0
2026-03-10T06:23:16.835 INFO:tasks.workunit.client.0.vm04.stdout:5/859: dread d4/d11/d7d/d52/f9a [0,4194304] 0
2026-03-10T06:23:16.835 INFO:tasks.workunit.client.0.vm04.stdout:6/890: read - d2/d43/d2d/d30/d1f/d3c/dfa/f106 zero size
2026-03-10T06:23:16.835 INFO:tasks.workunit.client.0.vm04.stdout:5/860: dread - d4/d3b/fa0 zero size
2026-03-10T06:23:16.836 INFO:tasks.workunit.client.0.vm04.stdout:3/847: chown d4/da/df/d11/d5a/d5b/ddf/d21/f6d 2 1
2026-03-10T06:23:16.836 INFO:tasks.workunit.client.0.vm04.stdout:5/861: readlink l1 0
2026-03-10T06:23:16.838 INFO:tasks.workunit.client.0.vm04.stdout:5/862: chown d4/d11/fb9 21382786 1
2026-03-10T06:23:16.842 INFO:tasks.workunit.client.0.vm04.stdout:0/944: creat d0/d5/d97/dc0/dd8/dff/f142 x:0 0 0
2026-03-10T06:23:16.842 INFO:tasks.workunit.client.0.vm04.stdout:9/939: symlink d2/de0/da3/l151 0
2026-03-10T06:23:16.844 INFO:tasks.workunit.client.0.vm04.stdout:3/848: rmdir d4/d6/d92/def 39
2026-03-10T06:23:16.848 INFO:tasks.workunit.client.0.vm04.stdout:9/940: creat d2/de0/f152 x:0 0 0
2026-03-10T06:23:16.849 INFO:tasks.workunit.client.0.vm04.stdout:0/945: sync
2026-03-10T06:23:16.857 INFO:tasks.workunit.client.0.vm04.stdout:1/850: dwrite d0/f83 [0,4194304] 0
2026-03-10T06:23:16.871 INFO:tasks.workunit.client.0.vm04.stdout:2/851: creat d1/dae/d11/d14/d4e/fff x:0 0 0
2026-03-10T06:23:16.878 INFO:tasks.workunit.client.0.vm04.stdout:9/941: stat d2/d8/cfb 0
2026-03-10T06:23:16.881 INFO:tasks.workunit.client.0.vm04.stdout:9/942: readlink d2/de0/d1d/l129 0
2026-03-10T06:23:16.887 INFO:tasks.workunit.client.0.vm04.stdout:6/891: creat d2/d43/d2d/d30/d34/f125 x:0 0 0
2026-03-10T06:23:16.887 INFO:tasks.workunit.client.0.vm04.stdout:0/946: mkdir d0/d1a/d20/df5/d47/ddd/d103/d139/d82/d143 0
2026-03-10T06:23:16.889 INFO:tasks.workunit.client.0.vm04.stdout:7/831: truncate d4/df/d12/d13/d8b/fa5 2848443 0
2026-03-10T06:23:16.891 INFO:tasks.workunit.client.0.vm04.stdout:6/892: sync
2026-03-10T06:23:16.891 INFO:tasks.workunit.client.0.vm04.stdout:8/887: dwrite df/d20/d25/f54 [4194304,4194304] 0
2026-03-10T06:23:16.893 INFO:tasks.workunit.client.0.vm04.stdout:5/863: write d4/d11/d7d/d38/d91/d55/d72/f113 [130174,41850] 0
2026-03-10T06:23:16.894 INFO:tasks.workunit.client.0.vm04.stdout:1/851: mkdir d0/d3/d41/dc2/d13a 0
2026-03-10T06:23:16.895 INFO:tasks.workunit.client.0.vm04.stdout:4/885: link d2/d32/d5c/d76/dd7/d56/l90 d2/d32/d5c/d76/dd7/d2c/d9a/ded/l124 0
2026-03-10T06:23:16.900 INFO:tasks.workunit.client.0.vm04.stdout:9/943: rename d2/d8/d3a/dcb to d2/d8/d53/d6e/d89/d153 0
2026-03-10T06:23:16.900 INFO:tasks.workunit.client.0.vm04.stdout:2/852: write d1/dae/d11/d14/d4e/f9d [6524738,65363] 0
2026-03-10T06:23:16.903 INFO:tasks.workunit.client.0.vm04.stdout:3/849: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/fe1 [0,4194304] 0
2026-03-10T06:23:16.904 INFO:tasks.workunit.client.0.vm04.stdout:6/893: truncate d2/d3a/f56 2555808 0
2026-03-10T06:23:16.904 INFO:tasks.workunit.client.0.vm04.stdout:2/853: write d1/dae/d11/f16 [7582829,62110] 0
2026-03-10T06:23:16.911 INFO:tasks.workunit.client.0.vm04.stdout:3/850: chown d4/f42 28 1
2026-03-10T06:23:16.911 INFO:tasks.workunit.client.0.vm04.stdout:3/851: write d4/da/df/d11/d5a/d5b/dff/ff8 [367435,108757] 0
2026-03-10T06:23:16.916 INFO:tasks.workunit.client.0.vm04.stdout:3/852: truncate d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/ffe 913192 0
2026-03-10T06:23:16.924 INFO:tasks.workunit.client.0.vm04.stdout:6/894: dwrite d2/d3a/d9c/fba [0,4194304] 0
2026-03-10T06:23:16.930 INFO:tasks.workunit.client.0.vm04.stdout:8/888: creat df/d15/d29/da3/db8/dc1/d97/d67/f11c x:0 0 0
2026-03-10T06:23:16.933 INFO:tasks.workunit.client.0.vm04.stdout:8/889: fsync df/d15/d29/df8/d102/dab/fdf 0
2026-03-10T06:23:16.934 INFO:tasks.workunit.client.0.vm04.stdout:5/864: symlink d4/d6/d81/db6/d10f/l129 0
2026-03-10T06:23:16.959 INFO:tasks.workunit.client.0.vm04.stdout:1/852: dread d0/d3/f33 [0,4194304] 0
2026-03-10T06:23:16.966 INFO:tasks.workunit.client.0.vm04.stdout:4/886: symlink d2/d32/d5c/l125 0
2026-03-10T06:23:16.968 INFO:tasks.workunit.client.0.vm04.stdout:4/887: chown d2/d32/d10b/d93/fb4 5 1
2026-03-10T06:23:16.989 INFO:tasks.workunit.client.0.vm04.stdout:6/895: dread - d2/d43/d86/f9f zero size
2026-03-10T06:23:17.000 INFO:tasks.workunit.client.0.vm04.stdout:7/832: write d4/df/d12/d21/fd9 [698588,6889] 0
2026-03-10T06:23:17.007 INFO:tasks.workunit.client.0.vm04.stdout:8/890: creat df/d15/d29/da3/f11d x:0 0 0
2026-03-10T06:23:17.009 INFO:tasks.workunit.client.0.vm04.stdout:9/944: dwrite f0 [4194304,4194304] 0
2026-03-10T06:23:17.009 INFO:tasks.workunit.client.0.vm04.stdout:5/865: unlink d4/d11/d7d/d38/d91/d4c/d98/faf 0
2026-03-10T06:23:17.011 INFO:tasks.workunit.client.0.vm04.stdout:9/945: truncate d2/de0/d1d/d64/f123 859772 0
2026-03-10T06:23:17.015 INFO:tasks.workunit.client.0.vm04.stdout:0/947: dwrite d0/d1a/d20/df5/d47/ddd/d103/d139/d82/fca [0,4194304] 0
2026-03-10T06:23:17.023 INFO:tasks.workunit.client.0.vm04.stdout:4/888: fsync d2/d32/d10b/d93/fa2 0
2026-03-10T06:23:17.027 INFO:tasks.workunit.client.0.vm04.stdout:2/854: rename d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/l5e to d1/dae/d2c/d37/d40/l100 0
2026-03-10T06:23:17.033 INFO:tasks.workunit.client.0.vm04.stdout:6/896: mknod d2/d43/d2d/d30/d1f/d3c/d75/c126 0
2026-03-10T06:23:17.046 INFO:tasks.workunit.client.0.vm04.stdout:1/853: dread d0/d8/f67 [0,4194304] 0
2026-03-10T06:23:17.072 INFO:tasks.workunit.client.0.vm04.stdout:7/833: dread d4/f7a [0,4194304] 0
2026-03-10T06:23:17.076 INFO:tasks.workunit.client.0.vm04.stdout:3/853: rename d4/d6/d38/dcc to d4/d6/d99/d119 0
2026-03-10T06:23:17.077 INFO:tasks.workunit.client.0.vm04.stdout:0/948: write d0/d5/d97/dc0/dd8/dff/d59/fed [221373,75973] 0
2026-03-10T06:23:17.079 INFO:tasks.workunit.client.0.vm04.stdout:8/891: dwrite df/d15/fee [0,4194304] 0
2026-03-10T06:23:17.085 INFO:tasks.workunit.client.0.vm04.stdout:6/897: fsync d2/d43/d2d/d30/d1f/d3c/fb7 0
2026-03-10T06:23:17.085 INFO:tasks.workunit.client.0.vm04.stdout:8/892: chown df/d20/d25/d73/lb0 191178037 1
2026-03-10T06:23:17.093 INFO:tasks.workunit.client.0.vm04.stdout:7/834: sync
2026-03-10T06:23:17.094 INFO:tasks.workunit.client.0.vm04.stdout:7/835: stat d4/df/d12/d13/d25/c55 0
2026-03-10T06:23:17.101 INFO:tasks.workunit.client.0.vm04.stdout:2/855: rename d1/dae/d2c/l68 to d1/dae/d11/d14/d9f/ddb/d94/dbb/l101 0
2026-03-10T06:23:17.105 INFO:tasks.workunit.client.0.vm04.stdout:7/836: sync
2026-03-10T06:23:17.113 INFO:tasks.workunit.client.0.vm04.stdout:0/949: creat d0/d1a/d20/df5/d47/d8a/de3/f144 x:0 0 0
2026-03-10T06:23:17.113 INFO:tasks.workunit.client.0.vm04.stdout:8/893: mkdir df/d15/d2b/d81/d9a/d11e 0
2026-03-10T06:23:17.115 INFO:tasks.workunit.client.0.vm04.stdout:6/898: dwrite d2/d3a/d5e/db5/f101 [0,4194304] 0
2026-03-10T06:23:17.115 INFO:tasks.workunit.client.0.vm04.stdout:7/837: dread d4/f7a [0,4194304] 0
2026-03-10T06:23:17.115 INFO:tasks.workunit.client.0.vm04.stdout:1/854: write d0/f6a [5286947,65207] 0
2026-03-10T06:23:17.119 INFO:tasks.workunit.client.0.vm04.stdout:7/838: read d4/df/d12/d13/d25/d28/d3a/d58/fcc [1237786,96098] 0
2026-03-10T06:23:17.119 INFO:tasks.workunit.client.0.vm04.stdout:1/855: dread - d0/d8/d46/d7a/d95/dc5/f132 zero size
2026-03-10T06:23:17.125 INFO:tasks.workunit.client.0.vm04.stdout:9/946: link d2/d3/d18/de9/da9/l7a d2/d3/df4/l154 0
2026-03-10T06:23:17.132 INFO:tasks.workunit.client.0.vm04.stdout:9/947: dread d2/d8/f99 [0,4194304] 0
2026-03-10T06:23:17.132 INFO:tasks.workunit.client.0.vm04.stdout:5/866: mkdir d4/d11/d7d/d38/d91/d4c/def/d12a 0
2026-03-10T06:23:17.139 INFO:tasks.workunit.client.0.vm04.stdout:3/854: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d2c/d11a 0
2026-03-10T06:23:17.145 INFO:tasks.workunit.client.0.vm04.stdout:3/855: dread d4/fe3 [0,4194304] 0
2026-03-10T06:23:17.145 INFO:tasks.workunit.client.0.vm04.stdout:5/867: sync
2026-03-10T06:23:17.146 INFO:tasks.workunit.client.0.vm04.stdout:9/948: dread d2/d8/d53/f143 [0,4194304] 0
2026-03-10T06:23:17.147 INFO:tasks.workunit.client.0.vm04.stdout:8/894: fsync df/d15/d2b/f33 0
2026-03-10T06:23:17.151 INFO:tasks.workunit.client.0.vm04.stdout:6/899: symlink d2/d37/d6e/de6/l127 0
2026-03-10T06:23:17.154 INFO:tasks.workunit.client.0.vm04.stdout:2/856: rename d1/db/d69/d74/d87/dcf/d8f/d48/d67/fcb to d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc/f102 0
2026-03-10T06:23:17.156 INFO:tasks.workunit.client.0.vm04.stdout:5/868: dwrite d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/ff1 [0,4194304] 0
2026-03-10T06:23:17.158 INFO:tasks.workunit.client.0.vm04.stdout:5/869: chown d4/cb5 14416643 1
2026-03-10T06:23:17.164 INFO:tasks.workunit.client.0.vm04.stdout:4/889: getdents d2/d32/d5c/d76/dd7/d2c/d6b/d108/dfe 0
2026-03-10T06:23:17.177 INFO:tasks.workunit.client.0.vm04.stdout:3/856: symlink d4/d6/d54/l11b 0
2026-03-10T06:23:17.180 INFO:tasks.workunit.client.0.vm04.stdout:4/890: sync
2026-03-10T06:23:17.184 INFO:tasks.workunit.client.0.vm04.stdout:7/839: rename d4/df/d12/d13/d25/d28/d3a/d58/cba to d4/df/d12/d13/d25/d30/d40/d50/c12d 0
2026-03-10T06:23:17.186 INFO:tasks.workunit.client.0.vm04.stdout:9/949: dread d2/d3/d18/de9/d5a/fa7 [0,4194304] 0
2026-03-10T06:23:17.194 INFO:tasks.workunit.client.0.vm04.stdout:2/857: unlink d1/dae/d11/d14/d9f/ddb/d94/ca7 0
2026-03-10T06:23:17.196 INFO:tasks.workunit.client.0.vm04.stdout:5/870: truncate d4/d11/f1f 484000 0
2026-03-10T06:23:17.208 INFO:tasks.workunit.client.0.vm04.stdout:4/891: fsync d2/d32/d5c/d76/dd7/d2c/d9a/ff5 0
2026-03-10T06:23:17.218 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.218 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.218 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:16 vm04.local ceph-mon[51058]: pgmap v37: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 49 MiB/s rd, 122 MiB/s wr, 299 op/s
2026-03-10T06:23:17.218 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.218 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:16 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.225 INFO:tasks.workunit.client.0.vm04.stdout:5/871: fdatasync d4/d11/d7d/fec 0
2026-03-10T06:23:17.229 INFO:tasks.workunit.client.0.vm04.stdout:8/895: creat df/d20/d25/d30/f11f x:0 0 0
2026-03-10T06:23:17.230 INFO:tasks.workunit.client.0.vm04.stdout:6/900: creat d2/d43/d2d/d30/d34/dae/f128 x:0 0 0
2026-03-10T06:23:17.230 INFO:tasks.workunit.client.0.vm04.stdout:6/901: dread - d2/d43/fe5 zero size
2026-03-10T06:23:17.230 INFO:tasks.workunit.client.0.vm04.stdout:8/896: write df/d15/fee [2287178,23239] 0
2026-03-10T06:23:17.232 INFO:tasks.workunit.client.0.vm04.stdout:6/902: chown d2/d3a/d5e/db5/fbe 7324 1
2026-03-10T06:23:17.234 INFO:tasks.workunit.client.0.vm04.stdout:6/903: read d2/d43/d2d/d30/d1f/ff5 [3342443,14834] 0
2026-03-10T06:23:17.236 INFO:tasks.workunit.client.0.vm04.stdout:4/892: fsync d2/d32/d5c/d76/dd7/d2c/d6b/f96 0
2026-03-10T06:23:17.240 INFO:tasks.workunit.client.0.vm04.stdout:7/840: fdatasync d4/df/f84 0
2026-03-10T06:23:17.241 INFO:tasks.workunit.client.0.vm04.stdout:0/950: link d0/d5/d25/c12b d0/d5/d25/dd/d3a/c145 0
2026-03-10T06:23:17.248 INFO:tasks.workunit.client.0.vm04.stdout:3/857: creat d4/da/df/d11/d50/f11c x:0 0 0
2026-03-10T06:23:17.249 INFO:tasks.workunit.client.0.vm04.stdout:8/897: dread df/d15/d29/da3/fa7 [0,4194304] 0
2026-03-10T06:23:17.267 INFO:tasks.workunit.client.0.vm04.stdout:7/841: readlink d4/df/d12/d21/l53 0
2026-03-10T06:23:17.267 INFO:tasks.workunit.client.0.vm04.stdout:5/872: unlink d4/d6/d80/d84/fa5 0
2026-03-10T06:23:17.268 INFO:tasks.workunit.client.0.vm04.stdout:2/858: fsync d1/dae/d2c/d37/d40/fc4 0
2026-03-10T06:23:17.269 INFO:tasks.workunit.client.0.vm04.stdout:5/873: read d4/d11/d7d/f36 [7120395,33267] 0
2026-03-10T06:23:17.276 INFO:tasks.workunit.client.0.vm04.stdout:7/842: dwrite d4/df/d12/dd4/fe1 [4194304,4194304] 0
2026-03-10T06:23:17.276 INFO:tasks.workunit.client.0.vm04.stdout:3/858: unlink d4/d6/dc/fb7 0
2026-03-10T06:23:17.276 INFO:tasks.workunit.client.0.vm04.stdout:8/898: read df/d15/d29/da3/faa [4172809,121921] 0
2026-03-10T06:23:17.281 INFO:tasks.workunit.client.0.vm04.stdout:1/856: rename d0/d3/d41/d4b/d5b/c89 to d0/d3/d41/d4b/c13b 0
2026-03-10T06:23:17.291 INFO:tasks.workunit.client.0.vm04.stdout:0/951: creat d0/d5/d25/dd/d5c/d141/f146 x:0 0 0
2026-03-10T06:23:17.295 INFO:tasks.workunit.client.0.vm04.stdout:7/843: rmdir d4/df/d12/d13/d25/d28/d3a/d100/d106 39
2026-03-10T06:23:17.304 INFO:tasks.workunit.client.0.vm04.stdout:9/950: rename d2/d3/d18/d39/d11/l62 to d2/d3/d18/d39/d11/da5/df5/l155 0
2026-03-10T06:23:17.309 INFO:tasks.workunit.client.0.vm04.stdout:9/951: stat d2/d3/d18/d39/d11/cda 0
2026-03-10T06:23:17.316 INFO:tasks.workunit.client.0.vm04.stdout:4/893: link d2/d32/d5c/d76/dd7/d31/d42/db9/fb0 d2/d32/d94/f126 0
2026-03-10T06:23:17.320 INFO:tasks.workunit.client.0.vm04.stdout:0/952: fsync d0/d5/d97/dc0/dd8/dff/fa2 0
2026-03-10T06:23:17.321 INFO:tasks.workunit.client.0.vm04.stdout:6/904: truncate d2/d43/d2d/d30/d1f/d3c/d75/f92 3024817 0
2026-03-10T06:23:17.322 INFO:tasks.workunit.client.0.vm04.stdout:0/953: stat d0/d5/d25/dd/d5c/d141/f146 0
2026-03-10T06:23:17.326 INFO:tasks.workunit.client.0.vm04.stdout:5/874: symlink d4/d11/d7d/d38/d91/d4c/d98/dc0/l12b 0
2026-03-10T06:23:17.326 INFO:tasks.workunit.client.0.vm04.stdout:7/844: rmdir d4/df/d12/d21 39
2026-03-10T06:23:17.335 INFO:tasks.workunit.client.0.vm04.stdout:8/899: dwrite df/d15/fcc [0,4194304] 0
2026-03-10T06:23:17.340 INFO:tasks.workunit.client.0.vm04.stdout:8/900: write df/d15/d29/da3/db8/dc1/dac/fc3 [2300372,92470] 0
2026-03-10T06:23:17.343 INFO:tasks.workunit.client.0.vm04.stdout:1/857: dwrite d0/d3/f4e [0,4194304] 0
2026-03-10T06:23:17.364 INFO:tasks.workunit.client.0.vm04.stdout:9/952: mknod d2/de0/d1d/d64/d73/d10d/c156 0
2026-03-10T06:23:17.366 INFO:tasks.workunit.client.0.vm04.stdout:4/894: read d2/d32/d5c/d76/dd7/d2c/d6b/d108/fae [1370206,111527] 0
2026-03-10T06:23:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:16 vm06.local ceph-mon[58974]: pgmap v37: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 49 MiB/s rd, 122 MiB/s wr, 299 op/s
2026-03-10T06:23:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:16 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb'
2026-03-10T06:23:17.367 INFO:tasks.workunit.client.0.vm04.stdout:4/895: fsync d2/d32/d5c/d4f/f105 0
2026-03-10T06:23:17.368 INFO:tasks.workunit.client.0.vm04.stdout:4/896: chown d2/d46/f26 5 1
2026-03-10T06:23:17.417 INFO:tasks.workunit.client.0.vm04.stdout:1/858: getdents d0/d8/d46/db3/dd2/d100/d119 0
2026-03-10T06:23:17.417 INFO:tasks.workunit.client.0.vm04.stdout:3/859: link d4/da/df/lf7 d4/da/df/d11/d5a/d5b/ddf/d21/d32/l11d 0
2026-03-10T06:23:17.429 INFO:tasks.workunit.client.0.vm04.stdout:2/859: getdents d1/dae/d11/d14/d9f/ddb 0
2026-03-10T06:23:17.429 INFO:tasks.workunit.client.0.vm04.stdout:5/875: rename d4/d3b/c104 to d4/d6/d80/c12c 0
2026-03-10T06:23:17.437 INFO:tasks.workunit.client.0.vm04.stdout:3/860: creat d4/deb/f11e x:0 0 0
2026-03-10T06:23:17.439 INFO:tasks.workunit.client.0.vm04.stdout:4/897: creat d2/d32/f127 x:0 0 0
2026-03-10T06:23:17.446 INFO:tasks.workunit.client.0.vm04.stdout:0/954: dwrite d0/d5/fc5 [0,4194304] 0
2026-03-10T06:23:17.456 INFO:tasks.workunit.client.0.vm04.stdout:8/901: write df/d15/d2b/f4a [4376788,112783] 0
2026-03-10T06:23:17.457 INFO:tasks.workunit.client.0.vm04.stdout:6/905: write d2/d43/d2d/d30/d1f/d3c/d85/fd0 [1793665,39342] 0
2026-03-10T06:23:17.465 INFO:tasks.workunit.client.0.vm04.stdout:7/845: dwrite d4/df/d12/f5f [0,4194304] 0
2026-03-10T06:23:17.470 INFO:tasks.workunit.client.0.vm04.stdout:0/955: read d0/d1a/d20/f8c [1554357,100942] 0
2026-03-10T06:23:17.470 INFO:tasks.workunit.client.0.vm04.stdout:7/846: write d4/df/d12/f5f [619249,18750] 0
2026-03-10T06:23:17.475 INFO:tasks.workunit.client.0.vm04.stdout:2/860: dread d1/dae/d2c/d37/d40/f64 [0,4194304] 0
2026-03-10T06:23:17.494 INFO:tasks.workunit.client.0.vm04.stdout:9/953: dwrite d2/d3/d18/ddd/f5e [0,4194304] 0
2026-03-10T06:23:17.494 INFO:tasks.workunit.client.0.vm04.stdout:5/876: dread d4/d6/d37/fcc [0,4194304] 0
2026-03-10T06:23:17.495 INFO:tasks.workunit.client.0.vm04.stdout:8/902: creat df/d15/d29/da3/db8/dc1/dac/f120 x:0 0 0
2026-03-10T06:23:17.499 INFO:tasks.workunit.client.0.vm04.stdout:4/898: mknod d2/c128 0
2026-03-10T06:23:17.501 INFO:tasks.workunit.client.0.vm04.stdout:2/861: chown d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f65 61148 1
2026-03-10T06:23:17.502 INFO:tasks.workunit.client.0.vm04.stdout:0/956: dwrite d0/d5/d97/dc0/dd8/dff/d9c/dbf/ffd [0,4194304] 0
2026-03-10T06:23:17.509 INFO:tasks.workunit.client.0.vm04.stdout:0/957: fdatasync d0/d5/d97/dc0/dd8/dff/d9c/d134/f137 0
2026-03-10T06:23:17.533 INFO:tasks.workunit.client.0.vm04.stdout:6/906: mkdir d2/d43/d9b/d129 0
2026-03-10T06:23:17.542 INFO:tasks.workunit.client.0.vm04.stdout:6/907: dwrite d2/d43/d2d/d30/f93 [0,4194304] 0
2026-03-10T06:23:17.559 INFO:tasks.workunit.client.0.vm04.stdout:3/861: write d4/da/df/d11/d5a/d5b/ddf/d21/d2c/f85 [468062,66147] 0
2026-03-10T06:23:17.564 INFO:tasks.workunit.client.0.vm04.stdout:7/847: truncate d4/df/d12/d13/fc7 3036321 0
2026-03-10T06:23:17.567 INFO:tasks.workunit.client.0.vm04.stdout:7/848: dwrite d4/df/d12/d13/f10d [0,4194304] 0
2026-03-10T06:23:17.590 INFO:tasks.workunit.client.0.vm04.stdout:5/877: dread d4/d11/d7d/d38/d91/d55/f7a [0,4194304] 0
2026-03-10T06:23:17.603 INFO:tasks.workunit.client.0.vm04.stdout:9/954: write d2/d3/d18/de9/f139 [1294234,121374] 0
2026-03-10T06:23:17.606 INFO:tasks.workunit.client.0.vm04.stdout:4/899: mknod d2/d8/c129 0
2026-03-10T06:23:17.608 INFO:tasks.workunit.client.0.vm04.stdout:2/862: stat d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f93 0
2026-03-10T06:23:17.609 INFO:tasks.workunit.client.0.vm04.stdout:2/863: write d1/dae/d11/d14/d4e/fff [815808,115437] 0
2026-03-10T06:23:17.611 INFO:tasks.workunit.client.0.vm04.stdout:0/958: mknod d0/d1a/d4d/c147 0
2026-03-10T06:23:17.612 INFO:tasks.workunit.client.0.vm04.stdout:1/859: rename d0/f23 to d0/d8/d46/d7a/d95/dc5/f13c 0
2026-03-10T06:23:17.613 INFO:tasks.workunit.client.0.vm04.stdout:1/860: dread - d0/d8/d46/d7a/d95/dc5/f122 zero size
2026-03-10T06:23:17.617 INFO:tasks.workunit.client.0.vm04.stdout:3/862: chown d4/da/df/lf7 393294274 1
2026-03-10T06:23:17.622 INFO:tasks.workunit.client.0.vm04.stdout:5/878: rmdir d4/d11/d7d/d38/d91/d4c/def/ddc 39
2026-03-10T06:23:17.622 INFO:tasks.workunit.client.0.vm04.stdout:5/879: fdatasync d4/d6/d81/db6/f11c 0
2026-03-10T06:23:17.625 INFO:tasks.workunit.client.0.vm04.stdout:9/955: symlink d2/d3/d18/d39/d46/d55/l157 0
2026-03-10T06:23:17.628 INFO:tasks.workunit.client.0.vm04.stdout:9/956: dwrite d2/d3/d18/de9/f83 [0,4194304] 0
2026-03-10T06:23:17.629 INFO:tasks.workunit.client.0.vm04.stdout:9/957: dread - d2/de0/f152 zero size
2026-03-10T06:23:17.639 INFO:tasks.workunit.client.0.vm04.stdout:4/900: chown d2/d32/d5c/d76/dd7/d56/fa7 4 1
2026-03-10T06:23:17.639 INFO:tasks.workunit.client.0.vm04.stdout:4/901: fdatasync d2/d46/f5d 0
2026-03-10T06:23:17.645 INFO:tasks.workunit.client.0.vm04.stdout:2/864: truncate d1/f5 1060544 0
2026-03-10T06:23:17.647 INFO:tasks.workunit.client.0.vm04.stdout:0/959: rmdir d0/d1a/d20/df5/d79/d111 39
2026-03-10T06:23:17.653 INFO:tasks.workunit.client.0.vm04.stdout:7/849: link d4/df/d12/d13/d25/d8f/f11c d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/f12e 0
2026-03-10T06:23:17.663 INFO:tasks.workunit.client.0.vm04.stdout:5/880: mkdir d4/d11/d7d/dab/d106/d12d 0
2026-03-10T06:23:17.663 INFO:tasks.workunit.client.0.vm04.stdout:9/958: creat d2/d8/d53/d6e/d89/d153/d108/f158 x:0 0 0
2026-03-10T06:23:17.664 INFO:tasks.workunit.client.0.vm04.stdout:1/861: sync
2026-03-10T06:23:17.670 INFO:tasks.workunit.client.0.vm04.stdout:8/903: link df/d20/d25/d30/dc5/lcf df/d15/d29/d89/l121 0
2026-03-10T06:23:17.671 INFO:tasks.workunit.client.0.vm04.stdout:2/865: mknod d1/dae/d11/d14/d9f/ddb/d94/dbb/de8/c103 0
2026-03-10T06:23:17.676 INFO:tasks.workunit.client.0.vm04.stdout:6/908: truncate d2/d43/f31 695598 0
2026-03-10T06:23:17.687 INFO:tasks.workunit.client.0.vm04.stdout:3/863: creat d4/d6/d99/d10c/d118/f11f x:0 0 0
2026-03-10T06:23:17.690 INFO:tasks.workunit.client.0.vm04.stdout:7/850: mkdir d4/df/d12/d13/d12f 0
2026-03-10T06:23:17.695 INFO:tasks.workunit.client.0.vm04.stdout:3/864: sync
2026-03-10T06:23:17.695 INFO:tasks.workunit.client.0.vm04.stdout:3/865: chown d4/d6/dc/l66 374 1
2026-03-10T06:23:17.697 INFO:tasks.workunit.client.0.vm04.stdout:3/866: truncate d4/da/df/d11/d5a/d5b/ddf/d21/ff0 1642459 0
2026-03-10T06:23:17.703 INFO:tasks.workunit.client.0.vm04.stdout:1/862: mkdir d0/d112/d13d 0
2026-03-10T06:23:17.705 INFO:tasks.workunit.client.0.vm04.stdout:4/902: creat d2/d32/d94/d99/de3/f12a x:0 0 0
2026-03-10T06:23:17.706 INFO:tasks.workunit.client.0.vm04.stdout:8/904: mknod df/d15/d29/da3/db8/dc1/c122 0
2026-03-10T06:23:17.714 INFO:tasks.workunit.client.0.vm04.stdout:0/960: write d0/d1a/d20/df5/d47/d8a/d8d/fad [431481,88905] 0
2026-03-10T06:23:17.721 INFO:tasks.workunit.client.0.vm04.stdout:3/867: rename d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/fb0 to d4/d6/dc/f120 0
2026-03-10T06:23:17.725 INFO:tasks.workunit.client.0.vm04.stdout:1/863: symlink d0/d3/d80/l13e 0
2026-03-10T06:23:17.726 INFO:tasks.workunit.client.0.vm04.stdout:4/903: mknod d2/d32/d5c/d76/dd7/d2c/d6b/d108/c12b 0
2026-03-10T06:23:17.727 INFO:tasks.workunit.client.0.vm04.stdout:4/904: write d2/d32/d5c/d76/f95 [1578527,116230] 0
2026-03-10T06:23:17.740 INFO:tasks.workunit.client.0.vm04.stdout:8/905: dwrite df/d15/d2b/ff3 [0,4194304] 0
2026-03-10T06:23:17.751 INFO:tasks.workunit.client.0.vm04.stdout:6/909: fdatasync d2/d43/f31 0
2026-03-10T06:23:17.752 INFO:tasks.workunit.client.0.vm04.stdout:7/851: mknod d4/df/d12/d13/d25/d30/d40/d108/c130 0
2026-03-10T06:23:17.756 INFO:tasks.workunit.client.0.vm04.stdout:3/868: mknod d4/d6/d91/da1/dd5/c121 0
2026-03-10T06:23:17.758 INFO:tasks.workunit.client.0.vm04.stdout:4/905: creat d2/d32/d94/f12c x:0 0 0
2026-03-10T06:23:17.758 INFO:tasks.workunit.client.0.vm04.stdout:4/906: write d2/d46/f61 [489723,19688] 0
2026-03-10T06:23:17.761 INFO:tasks.workunit.client.0.vm04.stdout:2/866: creat d1/db/d69/d74/d87/dcf/d8f/d35/d54/f104 x:0 0 0
2026-03-10T06:23:17.761 INFO:tasks.workunit.client.0.vm04.stdout:8/906: creat df/d15/d2b/d81/d9a/dbe/df0/f123 x:0 0 0
2026-03-10T06:23:17.767 INFO:tasks.workunit.client.0.vm04.stdout:5/881: getdents d4/d11/d7d/d52 0
2026-03-10T06:23:17.767 INFO:tasks.workunit.client.0.vm04.stdout:3/869: rmdir d4/da/df/d11/d5a/d5b/ddf/d21/d2c 39
2026-03-10T06:23:17.770 INFO:tasks.workunit.client.0.vm04.stdout:5/882: dread - d4/d6/d80/de5/d114/f122 zero size
2026-03-10T06:23:17.771 INFO:tasks.workunit.client.0.vm04.stdout:5/883: stat d4/d11/c4f 0
2026-03-10T06:23:17.778 INFO:tasks.workunit.client.0.vm04.stdout:8/907: truncate df/d15/d2b/f60 1124945 0
2026-03-10T06:23:17.787 INFO:tasks.workunit.client.0.vm04.stdout:9/959: link d2/d3/d18/ce1 d2/d8/d3a/c159 0
2026-03-10T06:23:17.789 INFO:tasks.workunit.client.0.vm04.stdout:9/960: chown d2/d3/d18/d39/d46/d55/dc3/f142 135736 1
2026-03-10T06:23:17.798 INFO:tasks.workunit.client.0.vm04.stdout:0/961: rename d0/d1a/d20/dc2/ff4 to d0/d1a/d20/f148 0
2026-03-10T06:23:17.799 INFO:tasks.workunit.client.0.vm04.stdout:7/852: dread d4/df/d12/d21/fa4 [0,4194304] 0
2026-03-10T06:23:17.804 INFO:tasks.workunit.client.0.vm04.stdout:5/884: dread d4/d6/f93 [0,4194304] 0
2026-03-10T06:23:17.815 INFO:tasks.workunit.client.0.vm04.stdout:9/961: unlink d2/de0/d1d/l129 0
2026-03-10T06:23:17.817 INFO:tasks.workunit.client.0.vm04.stdout:1/864: rename d0/d3/l1e to d0/d8/d46/de4/l13f 0
2026-03-10T06:23:17.823 INFO:tasks.workunit.client.0.vm04.stdout:4/907: dwrite d2/d32/d94/d99/ddc/fea [0,4194304] 0
2026-03-10T06:23:17.825 INFO:tasks.workunit.client.0.vm04.stdout:4/908: fdatasync d2/d46/f111 0
2026-03-10T06:23:17.825 INFO:tasks.workunit.client.0.vm04.stdout:4/909: stat d2/d32/d5c/f10e 0
2026-03-10T06:23:17.845 INFO:tasks.workunit.client.0.vm04.stdout:2/867: dwrite d1/db/d69/d74/d87/dcf/d8f/ddc/fd1 [0,4194304] 0
2026-03-10T06:23:17.857 INFO:tasks.workunit.client.0.vm04.stdout:8/908: write df/d20/d25/d30/f51 [1632388,56556] 0
2026-03-10T06:23:17.880 INFO:tasks.workunit.client.0.vm04.stdout:9/962: write d2/d3/d18/d39/d11/f56 [4602165,102592] 0
2026-03-10T06:23:17.885 INFO:tasks.workunit.client.0.vm04.stdout:4/910: dwrite d2/d32/d10b/dc8/d100/fe7 [0,4194304] 0
2026-03-10T06:23:17.892 INFO:tasks.workunit.client.0.vm04.stdout:4/911: dwrite d2/d32/d5c/f6d [0,4194304] 0
2026-03-10T06:23:17.896 INFO:tasks.workunit.client.0.vm04.stdout:4/912: dwrite d2/d32/d5c/d76/f95 [0,4194304] 0
2026-03-10T06:23:17.900 INFO:tasks.workunit.client.0.vm04.stdout:8/909: truncate df/d20/d25/d30/d65/f82 1324991 0
2026-03-10T06:23:17.920 INFO:tasks.workunit.client.0.vm04.stdout:0/962: creat d0/d5/d97/dc0/dd8/dff/f149 x:0 0 0
2026-03-10T06:23:17.920 INFO:tasks.workunit.client.0.vm04.stdout:0/963: chown d0/d1a/l2a 8 1
2026-03-10T06:23:17.922 INFO:tasks.workunit.client.0.vm04.stdout:9/963: mkdir d2/d23/d94/d15a 0
2026-03-10T06:23:17.931 INFO:tasks.workunit.client.0.vm04.stdout:7/853: rmdir d4/df/d12/dd4/d12b 0
2026-03-10T06:23:17.941 INFO:tasks.workunit.client.0.vm04.stdout:5/885: creat d4/d11/d7d/d38/f12e x:0 0 0
2026-03-10T06:23:17.943 INFO:tasks.workunit.client.0.vm04.stdout:4/913: rmdir d2/d32/d94/d99/ddc 39
2026-03-10T06:23:17.949 INFO:tasks.workunit.client.0.vm04.stdout:2/868: mknod d1/dbf/c105 0
2026-03-10T06:23:17.974 INFO:tasks.workunit.client.0.vm04.stdout:6/910: rename d2/d37/d6e/fdd to d2/d8/f12a 0
2026-03-10T06:23:17.977 INFO:tasks.workunit.client.0.vm04.stdout:1/865: write d0/f64 [885578,113117] 0
2026-03-10T06:23:17.977 INFO:tasks.workunit.client.0.vm04.stdout:1/866: stat d0/d3/d41/d4b/d5b/f6f 0
2026-03-10T06:23:17.981 INFO:tasks.workunit.client.0.vm04.stdout:0/964: dwrite d0/d5/d97/dc0/fdb [4194304,4194304] 0
2026-03-10T06:23:18.001 INFO:tasks.workunit.client.0.vm04.stdout:5/886: fsync d4/d6/d37/fcc 0
2026-03-10T06:23:18.004 INFO:tasks.workunit.client.0.vm04.stdout:2/869: chown d1/dae/d2c/f58 0 1
2026-03-10T06:23:18.020 INFO:tasks.workunit.client.0.vm04.stdout:7/854: write d4/df/d12/d13/db3/d110/f67 [7016174,16253] 0
2026-03-10T06:23:18.024 INFO:tasks.workunit.client.0.vm04.stdout:9/964: creat d2/d3/d18/de9/d116/d11d/d12c/f15b x:0 0 0
2026-03-10T06:23:18.028 INFO:tasks.workunit.client.0.vm04.stdout:8/910: dwrite df/d15/d29/da3/db8/dc1/d97/f9c [0,4194304] 0
2026-03-10T06:23:18.032 INFO:tasks.workunit.client.0.vm04.stdout:8/911: dwrite df/d20/d25/d30/d65/f9f [0,4194304] 0
2026-03-10T06:23:18.042 INFO:tasks.workunit.client.0.vm04.stdout:5/887: mknod d4/d6/d80/de5/c12f 0
2026-03-10T06:23:18.045 INFO:tasks.workunit.client.0.vm04.stdout:4/914: creat d2/d32/d94/d99/d101/f12d x:0 0 0
2026-03-10T06:23:18.045 INFO:tasks.workunit.client.0.vm04.stdout:4/915: dread - d2/d32/d10b/dc8/f117 zero size
2026-03-10T06:23:18.048 INFO:tasks.workunit.client.0.vm04.stdout:2/870: symlink d1/dbf/l106 0
2026-03-10T06:23:18.055 INFO:tasks.workunit.client.0.vm04.stdout:3/870: rename d4/da/df/d11/l8d to d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/l122 0
2026-03-10T06:23:18.068 INFO:tasks.workunit.client.0.vm04.stdout:0/965: symlink d0/d5/d25/dd/d3a/l14a 0
2026-03-10T06:23:18.078 INFO:tasks.workunit.client.0.vm04.stdout:7/855: mknod d4/df/d12/d13/d25/d28/d3a/d129/c131 0
2026-03-10T06:23:18.081 INFO:tasks.workunit.client.0.vm04.stdout:6/911: write d2/d43/d2d/d30/d34/d76/d8a/fab [1689497,108413] 0
2026-03-10T06:23:18.091 INFO:tasks.workunit.client.0.vm04.stdout:6/912: dread d2/d43/d2d/d30/d34/d76/d8a/f9e [4194304,4194304] 0
2026-03-10T06:23:18.093 INFO:tasks.workunit.client.0.vm04.stdout:8/912: rmdir df/d15 39
2026-03-10T06:23:18.100 INFO:tasks.workunit.client.0.vm04.stdout:4/916: creat d2/d32/d10b/dc8/d100/f12e x:0 0 0
2026-03-10T06:23:18.103 INFO:tasks.workunit.client.0.vm04.stdout:4/917: dwrite d2/d32/d94/d99/de3/f12a [0,4194304] 0
2026-03-10T06:23:18.117 INFO:tasks.workunit.client.0.vm04.stdout:2/871: rename d1/dae/d2c/d37/d40/l66 to d1/db/d69/d74/d87/dcf/d8f/d35/l107 0
2026-03-10T06:23:18.125 INFO:tasks.workunit.client.0.vm04.stdout:1/867: creat d0/d8/d46/d7a/d95/f140 x:0 0 0
2026-03-10T06:23:18.138 INFO:tasks.workunit.client.0.vm04.stdout:8/913: truncate df/d15/d29/da3/db8/dc1/dac/f114 405385 0
2026-03-10T06:23:18.143 INFO:tasks.workunit.client.0.vm04.stdout:9/965: dwrite d2/d23/f93 [0,4194304] 0
2026-03-10T06:23:18.143 INFO:tasks.workunit.client.0.vm04.stdout:6/913: dwrite d2/d37/d6e/fa9 [0,4194304] 0
2026-03-10T06:23:18.161 INFO:tasks.workunit.client.0.vm04.stdout:5/888: link d4/ff d4/d11/d7d/dab/d106/df0/f130 0
2026-03-10T06:23:18.178 INFO:tasks.workunit.client.0.vm04.stdout:4/918: symlink d2/d32/d10b/dc8/l12f 0
2026-03-10T06:23:18.182 INFO:tasks.workunit.client.0.vm04.stdout:3/871: mknod d4/da/df/d11/c123 0
2026-03-10T06:23:18.187 INFO:tasks.workunit.client.0.vm04.stdout:0/966: mkdir d0/d1a/d20/df5/d79/d14b 0
2026-03-10T06:23:18.189 INFO:tasks.workunit.client.0.vm04.stdout:7/856: creat d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/df1/f132 x:0 0 0
2026-03-10T06:23:18.196 INFO:tasks.workunit.client.0.vm04.stdout:1/868: dread d0/f9a [0,4194304] 0
2026-03-10T06:23:18.198 INFO:tasks.workunit.client.0.vm04.stdout:6/914: symlink d2/d3a/d5e/d9a/l12b 0
2026-03-10T06:23:18.204 INFO:tasks.workunit.client.0.vm04.stdout:9/966: mkdir d2/de0/d1d/d15c 0
2026-03-10T06:23:18.205 INFO:tasks.workunit.client.0.vm04.stdout:6/915: dread d2/d43/d2d/d30/f60 [0,4194304] 0
2026-03-10T06:23:18.205 INFO:tasks.workunit.client.0.vm04.stdout:9/967: chown d2/d3/df4/l154 0 1
2026-03-10T06:23:18.207 INFO:tasks.workunit.client.0.vm04.stdout:5/889: rename d4/d3b/f6d to d4/d6/d80/d84/d99/f131 0
2026-03-10T06:23:18.210 INFO:tasks.workunit.client.0.vm04.stdout:5/890: read d4/d6/d37/f62 [1600719,5851] 0
2026-03-10T06:23:18.214 INFO:tasks.workunit.client.0.vm04.stdout:0/967: creat d0/d5/d97/d10a/d135/f14c x:0 0 0
2026-03-10T06:23:18.215 INFO:tasks.workunit.client.0.vm04.stdout:7/857: truncate d4/df/d12/d13/f4a 4383662 0
2026-03-10T06:23:18.220 INFO:tasks.workunit.client.0.vm04.stdout:7/858: dread d4/df/d12/d13/db3/d110/d9c/f9f [0,4194304] 0
2026-03-10T06:23:18.222 INFO:tasks.workunit.client.0.vm04.stdout:8/914: mknod df/d20/c124 0
2026-03-10T06:23:18.228 INFO:tasks.workunit.client.0.vm04.stdout:1/869: dread d0/f4 [0,4194304] 0
2026-03-10T06:23:18.230 INFO:tasks.workunit.client.0.vm04.stdout:3/872: write d4/da/df/d11/f9f [421734,90436] 0
2026-03-10T06:23:18.233 INFO:tasks.workunit.client.0.vm04.stdout:2/872: dwrite d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc/f102 [0,4194304] 0
2026-03-10T06:23:18.238 INFO:tasks.workunit.client.0.vm04.stdout:6/916: write d2/d43/fe5 [327602,107697] 0
2026-03-10T06:23:18.244 INFO:tasks.workunit.client.0.vm04.stdout:9/968: creat d2/d3/d18/de9/d116/d11d/d12c/f15d x:0 0 0
2026-03-10T06:23:18.246 INFO:tasks.workunit.client.0.vm04.stdout:0/968: fsync d0/d5/d25/dd/d3a/d56/f88 0
2026-03-10T06:23:18.247 INFO:tasks.workunit.client.0.vm04.stdout:8/915: creat df/d15/d29/df8/d102/dab/f125 x:0 0 0
2026-03-10T06:23:18.247 INFO:tasks.workunit.client.0.vm04.stdout:4/919: creat d2/d32/d94/d99/ddc/f130 x:0 0 0
2026-03-10T06:23:18.249 INFO:tasks.workunit.client.0.vm04.stdout:3/873: truncate d4/d6/d54/fa8 251645 0
2026-03-10T06:23:18.250 INFO:tasks.workunit.client.0.vm04.stdout:8/916: readlink df/d15/d29/da3/db8/dc1/d97/d67/le2 0
2026-03-10T06:23:18.251 INFO:tasks.workunit.client.0.vm04.stdout:7/859: fdatasync d4/df/d12/d21/f6b 0
2026-03-10T06:23:18.257 INFO:tasks.workunit.client.0.vm04.stdout:7/860: dwrite d4/df/d12/d13/db3/d110/d9c/db1/ff7 [0,4194304] 0
2026-03-10T06:23:18.261 INFO:tasks.workunit.client.0.vm04.stdout:4/920: mknod d2/d32/d94/d99/ddc/c131 0
2026-03-10T06:23:18.265 INFO:tasks.workunit.client.0.vm04.stdout:8/917: dwrite df/d20/d25/fe0 [0,4194304] 0
2026-03-10T06:23:18.274 INFO:tasks.workunit.client.0.vm04.stdout:3/874: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104/f114 [0,4194304] 0
2026-03-10T06:23:18.278 INFO:tasks.workunit.client.0.vm04.stdout:8/918: fsync df/d15/f43 0
2026-03-10T06:23:18.280 INFO:tasks.workunit.client.0.vm04.stdout:9/969: mkdir d2/d8/d3a/d15e 0
2026-03-10T06:23:18.280 INFO:tasks.workunit.client.0.vm04.stdout:9/970: chown d2/de0/d1d/d64/d73/d10d/l13a 0 1
2026-03-10T06:23:18.287 INFO:tasks.workunit.client.0.vm04.stdout:4/921: creat d2/d32/d5c/f132 x:0 0 0
2026-03-10T06:23:18.291 INFO:tasks.workunit.client.0.vm04.stdout:9/971: creat d2/d8/d53/d6e/d89/d140/f15f x:0 0 0
2026-03-10T06:23:18.291 INFO:tasks.workunit.client.0.vm04.stdout:3/875: rename d4/da/cd7 to d4/c124 0
2026-03-10T06:23:18.293 INFO:tasks.workunit.client.0.vm04.stdout:9/972: mknod d2/de0/d1d/d64/d73/c160 0
2026-03-10T06:23:18.298 INFO:tasks.workunit.client.0.vm04.stdout:4/922: rename d2/d32/d5c/d76/dd7/d2c/d9a/ded to d2/d32/d5c/d76/dd7/d31/d42/db9/def/d133 0
2026-03-10T06:23:18.303 INFO:tasks.workunit.client.0.vm04.stdout:2/873: sync
2026-03-10T06:23:18.304 INFO:tasks.workunit.client.0.vm04.stdout:0/969: sync
2026-03-10T06:23:18.312 INFO:tasks.workunit.client.0.vm04.stdout:4/923: dread d2/d32/d10b/f64 [0,4194304] 0
2026-03-10T06:23:18.325 INFO:tasks.workunit.client.0.vm04.stdout:1/870: write d0/d8/f21 [1282406,4838] 0
2026-03-10T06:23:18.336 INFO:tasks.workunit.client.0.vm04.stdout:5/891: write d4/d6/f23 [3521332,45384] 0
2026-03-10T06:23:18.338 INFO:tasks.workunit.client.0.vm04.stdout:6/917: write d2/d43/d2d/d30/d1f/d3c/d75/f92 [683539,19724] 0
2026-03-10T06:23:18.339 INFO:tasks.workunit.client.0.vm04.stdout:7/861: write d4/df/d12/d13/db3/fcf [231827,20041] 0
2026-03-10T06:23:18.340 INFO:tasks.workunit.client.0.vm04.stdout:8/919: write df/d20/f84 [221220,70368] 0
2026-03-10T06:23:18.343 INFO:tasks.workunit.client.0.vm04.stdout:8/920: dwrite df/d15/d29/f6f [0,4194304] 0
2026-03-10T06:23:18.349 INFO:tasks.workunit.client.0.vm04.stdout:8/921: dwrite df/d15/d29/da3/db8/dc1/d97/f9c [4194304,4194304] 0
2026-03-10T06:23:18.350 INFO:tasks.workunit.client.0.vm04.stdout:9/973: write d2/d3/d18/d39/d11/f71 [3704369,23745] 0
2026-03-10T06:23:18.352 INFO:tasks.workunit.client.0.vm04.stdout:9/974: dread - d2/d3/d18/d39/d46/d84/fd5 zero size
2026-03-10T06:23:18.352 INFO:tasks.workunit.client.0.vm04.stdout:9/975: fdatasync d2/d3/d18/de9/f139 0
2026-03-10T06:23:18.396 INFO:tasks.workunit.client.0.vm04.stdout:0/970: symlink d0/l14d 0
2026-03-10T06:23:18.396 INFO:tasks.workunit.client.0.vm04.stdout:0/971: chown d0/d5/fc5 7 1
2026-03-10T06:23:18.398 INFO:tasks.workunit.client.0.vm04.stdout:2/874: write d1/db/d69/d74/d87/dcf/d8f/f25 [2729324,51890] 0
2026-03-10T06:23:18.403 INFO:tasks.workunit.client.0.vm04.stdout:1/871: readlink d0/d3/d41/d4b/d5b/l96 0
2026-03-10T06:23:18.410 INFO:tasks.workunit.client.0.vm04.stdout:5/892: rename l1 to d4/d3b/da8/l132 0
2026-03-10T06:23:18.410 INFO:tasks.workunit.client.0.vm04.stdout:1/872: dread d0/d3/fb2 [0,4194304] 0
2026-03-10T06:23:18.413 INFO:tasks.workunit.client.0.vm04.stdout:6/918: symlink d2/d43/d2d/d30/d1f/d3c/d75/l12c 0
2026-03-10T06:23:18.414 INFO:tasks.workunit.client.0.vm04.stdout:6/919: stat d2/d43/d2d/d30/d1f/d3c/d85/dbf 0
2026-03-10T06:23:18.421 INFO:tasks.workunit.client.0.vm04.stdout:4/924: dread d2/d32/f7c [0,4194304] 0
2026-03-10T06:23:18.435 INFO:tasks.workunit.client.0.vm04.stdout:8/922: creat df/d15/d29/da3/db8/f126 x:0 0 0
2026-03-10T06:23:18.436 INFO:tasks.workunit.client.0.vm04.stdout:7/862: write d4/df/ffa [302430,34293] 0
2026-03-10T06:23:18.439 INFO:tasks.workunit.client.0.vm04.stdout:0/972: symlink d0/d1a/d20/df5/d47/d11e/l14e 0
2026-03-10T06:23:18.440 INFO:tasks.workunit.client.0.vm04.stdout:9/976: dread d2/de0/d1d/d64/f91 [0,4194304] 0
2026-03-10T06:23:18.445 INFO:tasks.workunit.client.0.vm04.stdout:0/973: dread d0/d5/f41 [0,4194304] 0
2026-03-10T06:23:18.446 INFO:tasks.workunit.client.0.vm04.stdout:2/875: symlink d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc/l108 0
2026-03-10T06:23:18.458 INFO:tasks.workunit.client.0.vm04.stdout:1/873: truncate d0/f59 47437 0
2026-03-10T06:23:18.462 INFO:tasks.workunit.client.0.vm04.stdout:6/920: fdatasync d2/d43/d2d/d30/d1f/ff5 0
2026-03-10T06:23:18.466 INFO:tasks.workunit.client.0.vm04.stdout:5/893: dwrite d4/d11/d7d/d38/d91/dda/fe4 [8388608,4194304] 0
2026-03-10T06:23:18.478 INFO:tasks.workunit.client.0.vm04.stdout:8/923: fsync df/d15/d29/da3/fbf 0
2026-03-10T06:23:18.482 INFO:tasks.workunit.client.0.vm04.stdout:4/925: dwrite d2/f4 [0,4194304] 0
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:7/863: mknod d4/df/d12/dd4/c133 0
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:4/926: fsync d2/d8/f9f 0
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:9/977: creat d2/de5/f161 x:0 0 0
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:9/978: chown d2/de0/l3d 28 1
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:9/979: chown d2/d3/d18/c30 480163 1
2026-03-10T06:23:18.491 INFO:tasks.workunit.client.0.vm04.stdout:4/927: dwrite d2/d32/d5c/d76/dd7/d56/f109 [0,4194304] 0
2026-03-10T06:23:18.496 INFO:tasks.workunit.client.0.vm04.stdout:1/874: creat d0/d8/d46/dcf/f141 x:0 0 0
2026-03-10T06:23:18.501 INFO:tasks.workunit.client.0.vm04.stdout:6/921: rmdir d2/d43/d2d/d30/d34/dae 39
2026-03-10T06:23:18.501 INFO:tasks.workunit.client.0.vm04.stdout:5/894: rename d4/d11/d7d/d38/l86 to d4/d6/d81/db6/l133 0
2026-03-10T06:23:18.509 INFO:tasks.workunit.client.0.vm04.stdout:8/924: unlink df/d20/d25/d30/d65/d8f/fb4 0
2026-03-10T06:23:18.522 INFO:tasks.workunit.client.0.vm04.stdout:2/876: mkdir d1/db/d69/d74/d87/dcf/d8f/d48/d67/db3/d109 0
2026-03-10T06:23:18.524 INFO:tasks.workunit.client.0.vm04.stdout:9/980: mkdir d2/d8/d22/daa/d162 0
2026-03-10T06:23:18.525 INFO:tasks.workunit.client.0.vm04.stdout:4/928: chown d2/c24 143522 1
2026-03-10T06:23:18.534 INFO:tasks.workunit.client.0.vm04.stdout:3/876: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/f125 x:0 0 0
2026-03-10T06:23:18.539 INFO:tasks.workunit.client.0.vm04.stdout:0/974: creat d0/d5/f14f x:0 0 0
2026-03-10T06:23:18.540 INFO:tasks.workunit.client.0.vm04.stdout:2/877:
unlink d1/dae/lb8 0 2026-03-10T06:23:18.541 INFO:tasks.workunit.client.0.vm04.stdout:2/878: dread - d1/db/d69/d74/d87/dcf/d8f/d48/d67/fac zero size 2026-03-10T06:23:18.543 INFO:tasks.workunit.client.0.vm04.stdout:9/981: creat d2/d8/d53/d6e/d12b/f163 x:0 0 0 2026-03-10T06:23:18.543 INFO:tasks.workunit.client.0.vm04.stdout:2/879: chown d1/dae/d2c/d37/d59/la9 332 1 2026-03-10T06:23:18.549 INFO:tasks.workunit.client.0.vm04.stdout:1/875: write d0/d3/f24 [1054357,99395] 0 2026-03-10T06:23:18.560 INFO:tasks.workunit.client.0.vm04.stdout:7/864: truncate d4/df/d12/d13/f1e 5439210 0 2026-03-10T06:23:18.560 INFO:tasks.workunit.client.0.vm04.stdout:5/895: creat d4/d11/d7d/dab/d106/d12d/f134 x:0 0 0 2026-03-10T06:23:18.561 INFO:tasks.workunit.client.0.vm04.stdout:7/865: chown d4/c24 25 1 2026-03-10T06:23:18.562 INFO:tasks.workunit.client.0.vm04.stdout:7/866: stat d4/df/d12/d13/d25/d28/fb7 0 2026-03-10T06:23:18.563 INFO:tasks.workunit.client.0.vm04.stdout:4/929: dwrite d2/d32/d5c/d98/fe4 [0,4194304] 0 2026-03-10T06:23:18.570 INFO:tasks.workunit.client.0.vm04.stdout:9/982: symlink d2/de0/d1d/l164 0 2026-03-10T06:23:18.571 INFO:tasks.workunit.client.0.vm04.stdout:9/983: stat d2/d3/d18/d39/d46/d84/fd5 0 2026-03-10T06:23:18.575 INFO:tasks.workunit.client.0.vm04.stdout:3/877: dwrite d4/d6/d91/fad [0,4194304] 0 2026-03-10T06:23:18.575 INFO:tasks.workunit.client.0.vm04.stdout:5/896: chown d4/d6/f20 14849216 1 2026-03-10T06:23:18.584 INFO:tasks.workunit.client.0.vm04.stdout:8/925: creat df/d20/f127 x:0 0 0 2026-03-10T06:23:18.585 INFO:tasks.workunit.client.0.vm04.stdout:8/926: read - df/d15/d29/da3/db8/dc1/d97/d67/f92 zero size 2026-03-10T06:23:18.588 INFO:tasks.workunit.client.0.vm04.stdout:4/930: rmdir d2/d32/d5c/d76/dd7/d31 39 2026-03-10T06:23:18.591 INFO:tasks.workunit.client.0.vm04.stdout:6/922: creat d2/d43/d2d/d30/d34/dae/f12d x:0 0 0 2026-03-10T06:23:18.591 INFO:tasks.workunit.client.0.vm04.stdout:5/897: symlink d4/d11/d7d/d38/d91/dda/l135 0 2026-03-10T06:23:18.592 
INFO:tasks.workunit.client.0.vm04.stdout:6/923: chown d2/d43/d2d/d30/d34/d76/d8a/f9e 8491 1 2026-03-10T06:23:18.599 INFO:tasks.workunit.client.0.vm04.stdout:7/867: symlink d4/df/d12/d34/d103/l134 0 2026-03-10T06:23:18.602 INFO:tasks.workunit.client.0.vm04.stdout:1/876: link d0/d3/d41/f9d d0/d8/d46/db3/dd2/d100/d119/f142 0 2026-03-10T06:23:18.602 INFO:tasks.workunit.client.0.vm04.stdout:2/880: creat d1/dae/d11/f10a x:0 0 0 2026-03-10T06:23:18.603 INFO:tasks.workunit.client.0.vm04.stdout:1/877: fdatasync d0/d3/d41/dcb/f12b 0 2026-03-10T06:23:18.603 INFO:tasks.workunit.client.0.vm04.stdout:4/931: mknod d2/d32/d5c/d76/dd7/d2c/d6b/c134 0 2026-03-10T06:23:18.609 INFO:tasks.workunit.client.0.vm04.stdout:9/984: dread d2/d8/d22/daa/ff9 [0,4194304] 0 2026-03-10T06:23:18.612 INFO:tasks.workunit.client.0.vm04.stdout:5/898: dwrite d4/d11/d7d/d38/d91/d4c/d98/dc0/dbe/ff1 [0,4194304] 0 2026-03-10T06:23:18.620 INFO:tasks.workunit.client.0.vm04.stdout:6/924: dwrite d2/d43/d2d/d30/dc0/f114 [0,4194304] 0 2026-03-10T06:23:18.623 INFO:tasks.workunit.client.0.vm04.stdout:6/925: fsync d2/d43/d2d/d30/d1f/d3c/dfa/f124 0 2026-03-10T06:23:18.623 INFO:tasks.workunit.client.0.vm04.stdout:0/975: getdents d0/d1a/d20/df5/d47/d8a/de3 0 2026-03-10T06:23:18.630 INFO:tasks.workunit.client.0.vm04.stdout:6/926: readlink d2/d37/d83/l88 0 2026-03-10T06:23:18.632 INFO:tasks.workunit.client.0.vm04.stdout:6/927: read d2/d43/d2d/d30/dc0/fcd [2265639,113048] 0 2026-03-10T06:23:18.633 INFO:tasks.workunit.client.0.vm04.stdout:4/932: creat d2/d32/d5c/de2/f135 x:0 0 0 2026-03-10T06:23:18.633 INFO:tasks.workunit.client.0.vm04.stdout:6/928: read - d2/d3a/f90 zero size 2026-03-10T06:23:18.636 INFO:tasks.workunit.client.0.vm04.stdout:2/881: truncate d1/db/d69/d74/d87/dcf/d8f/d48/f62 381488 0 2026-03-10T06:23:18.642 INFO:tasks.workunit.client.0.vm04.stdout:8/927: write df/d15/d2b/f7e [1328139,64903] 0 2026-03-10T06:23:18.642 INFO:tasks.workunit.client.0.vm04.stdout:8/928: stat df/d20/d25/d73/lb0 0 
2026-03-10T06:23:18.642 INFO:tasks.workunit.client.0.vm04.stdout:3/878: rename d4/d6/d91/fe5 to d4/da/df/d11/f126 0 2026-03-10T06:23:18.644 INFO:tasks.workunit.client.0.vm04.stdout:5/899: fsync d4/d11/d7d/fee 0 2026-03-10T06:23:18.648 INFO:tasks.workunit.client.0.vm04.stdout:8/929: unlink df/d20/d25/d73/fbd 0 2026-03-10T06:23:18.654 INFO:tasks.workunit.client.0.vm04.stdout:1/878: rename d0/d8/d46/d7a/d95/dc5/c116 to d0/d3/d80/c143 0 2026-03-10T06:23:18.656 INFO:tasks.workunit.client.0.vm04.stdout:1/879: dread d0/d8/d46/fd8 [0,4194304] 0 2026-03-10T06:23:18.658 INFO:tasks.workunit.client.0.vm04.stdout:7/868: dwrite d4/df/d12/d13/d25/d30/d40/f52 [0,4194304] 0 2026-03-10T06:23:18.670 INFO:tasks.workunit.client.0.vm04.stdout:9/985: write d2/d3/d18/d39/d46/f110 [1684839,106217] 0 2026-03-10T06:23:18.672 INFO:tasks.workunit.client.0.vm04.stdout:8/930: creat df/d20/d25/d30/dc5/f128 x:0 0 0 2026-03-10T06:23:18.674 INFO:tasks.workunit.client.0.vm04.stdout:5/900: read d4/f26 [583986,40546] 0 2026-03-10T06:23:18.679 INFO:tasks.workunit.client.0.vm04.stdout:0/976: rename d0/l14d to d0/d5/d97/d10a/l150 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:0/977: stat d0/d5/d97/dc0/dd8/dff/d59/f48 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:1/880: mknod d0/d3/d41/c144 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:2/882: symlink d1/dae/d11/d14/d9f/ddb/d94/de5/de9/l10b 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:3/879: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/d115/f127 x:0 0 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:3/880: read - d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/d115/f127 zero size 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:9/986: mkdir d2/d3/d18/de9/da2/d165 0 2026-03-10T06:23:18.693 INFO:tasks.workunit.client.0.vm04.stdout:8/931: creat df/d20/d25/d30/d55/de7/f129 x:0 0 0 2026-03-10T06:23:18.693 
INFO:tasks.workunit.client.0.vm04.stdout:6/929: sync 2026-03-10T06:23:18.708 INFO:tasks.workunit.client.0.vm04.stdout:8/932: dread df/d20/d25/f2a [0,4194304] 0 2026-03-10T06:23:18.711 INFO:tasks.workunit.client.0.vm04.stdout:5/901: rmdir d4 39 2026-03-10T06:23:18.720 INFO:tasks.workunit.client.0.vm04.stdout:7/869: dwrite d4/df/f84 [0,4194304] 0 2026-03-10T06:23:18.729 INFO:tasks.workunit.client.0.vm04.stdout:4/933: rename d2/dde to d2/d32/d5c/d76/dd7/d31/d42/d136 0 2026-03-10T06:23:18.729 INFO:tasks.workunit.client.0.vm04.stdout:0/978: creat d0/d1a/d20/df5/d47/ddd/f151 x:0 0 0 2026-03-10T06:23:18.730 INFO:tasks.workunit.client.0.vm04.stdout:0/979: chown d0/d5/d97/dc0/dd8/dff/d9c/dbf 484753341 1 2026-03-10T06:23:18.730 INFO:tasks.workunit.client.0.vm04.stdout:4/934: chown d2/d32/d5c/d76/dd7/da3/fcc 8326 1 2026-03-10T06:23:18.731 INFO:tasks.workunit.client.0.vm04.stdout:0/980: chown d0/d5/d97/dc0/cd6 163650295 1 2026-03-10T06:23:18.732 INFO:tasks.workunit.client.0.vm04.stdout:0/981: chown d0/d5/d25/dd/d3a/l14a 1769075 1 2026-03-10T06:23:18.734 INFO:tasks.workunit.client.0.vm04.stdout:4/935: dwrite d2/d8/f35 [0,4194304] 0 2026-03-10T06:23:18.739 INFO:tasks.workunit.client.0.vm04.stdout:2/883: mknod d1/dae/d2c/d37/c10c 0 2026-03-10T06:23:18.747 INFO:tasks.workunit.client.0.vm04.stdout:4/936: unlink d2/d32/d5c/d76/dd7/da3/fd5 0 2026-03-10T06:23:18.748 INFO:tasks.workunit.client.0.vm04.stdout:0/982: mknod d0/d1a/d20/df5/d47/d8a/c152 0 2026-03-10T06:23:18.752 INFO:tasks.workunit.client.0.vm04.stdout:3/881: mknod d4/da/c128 0 2026-03-10T06:23:18.760 INFO:tasks.workunit.client.0.vm04.stdout:2/884: symlink d1/db/d69/d74/d87/dcf/d8f/d48/d67/db3/l10d 0 2026-03-10T06:23:18.760 INFO:tasks.workunit.client.0.vm04.stdout:2/885: fsync d1/db/d69/d74/d87/dcf/d8f/ddc/fd1 0 2026-03-10T06:23:18.760 INFO:tasks.workunit.client.0.vm04.stdout:6/930: creat d2/d43/d9b/d129/f12e x:0 0 0 2026-03-10T06:23:18.762 INFO:tasks.workunit.client.0.vm04.stdout:5/902: truncate d4/d11/fb9 13244 0 
2026-03-10T06:23:18.762 INFO:tasks.workunit.client.0.vm04.stdout:6/931: dwrite d2/d37/d83/dc1/df9/f119 [0,4194304] 0 2026-03-10T06:23:18.767 INFO:tasks.workunit.client.0.vm04.stdout:4/937: sync 2026-03-10T06:23:18.778 INFO:tasks.workunit.client.0.vm04.stdout:5/903: dwrite d4/f19 [4194304,4194304] 0 2026-03-10T06:23:18.781 INFO:tasks.workunit.client.0.vm04.stdout:5/904: chown d4/d11/d7d/dab/fb8 482155 1 2026-03-10T06:23:18.792 INFO:tasks.workunit.client.0.vm04.stdout:3/882: creat d4/d6/d99/d119/f129 x:0 0 0 2026-03-10T06:23:18.792 INFO:tasks.workunit.client.0.vm04.stdout:3/883: stat d4/f49 0 2026-03-10T06:23:18.795 INFO:tasks.workunit.client.0.vm04.stdout:1/881: write d0/d3/d41/d4b/d5b/f6f [1042197,114886] 0 2026-03-10T06:23:18.796 INFO:tasks.workunit.client.0.vm04.stdout:1/882: read - d0/d3/d41/dcb/f135 zero size 2026-03-10T06:23:18.797 INFO:tasks.workunit.client.0.vm04.stdout:2/886: creat d1/dae/d11/d14/d9f/ddb/daf/f10e x:0 0 0 2026-03-10T06:23:18.803 INFO:tasks.workunit.client.0.vm04.stdout:7/870: creat d4/f135 x:0 0 0 2026-03-10T06:23:18.805 INFO:tasks.workunit.client.0.vm04.stdout:0/983: mknod d0/d1a/d20/df5/d47/ddd/d103/d139/d82/d143/c153 0 2026-03-10T06:23:18.812 INFO:tasks.workunit.client.0.vm04.stdout:9/987: getdents d2/d3/d18/d39/d46/d84 0 2026-03-10T06:23:18.812 INFO:tasks.workunit.client.0.vm04.stdout:9/988: chown d2/de0/da3/cb0 29973 1 2026-03-10T06:23:18.814 INFO:tasks.workunit.client.0.vm04.stdout:6/932: symlink d2/d43/d2d/d30/dc0/l12f 0 2026-03-10T06:23:18.826 INFO:tasks.workunit.client.0.vm04.stdout:8/933: truncate df/d15/d29/da3/db8/dc1/d97/f9c 2602695 0 2026-03-10T06:23:18.828 INFO:tasks.workunit.client.0.vm04.stdout:2/887: mknod d1/dae/d2c/d37/d59/c10f 0 2026-03-10T06:23:18.830 INFO:tasks.workunit.client.0.vm04.stdout:2/888: chown d1/db/d69/d74/d87 223211 1 2026-03-10T06:23:18.833 INFO:tasks.workunit.client.0.vm04.stdout:0/984: symlink d0/d1a/d20/df5/d47/ddd/d103/d139/l154 0 2026-03-10T06:23:18.834 INFO:tasks.workunit.client.0.vm04.stdout:0/985: 
chown d0/d5/d25/dd/d3a/d56/f84 0 1 2026-03-10T06:23:18.848 INFO:tasks.workunit.client.0.vm04.stdout:1/883: dwrite d0/d8/d46/fb7 [0,4194304] 0 2026-03-10T06:23:18.852 INFO:tasks.workunit.client.0.vm04.stdout:8/934: dread df/d15/d2b/f4a [0,4194304] 0 2026-03-10T06:23:18.854 INFO:tasks.workunit.client.0.vm04.stdout:5/905: getdents d4/d11/d7d/d38/d110 0 2026-03-10T06:23:18.862 INFO:tasks.workunit.client.0.vm04.stdout:9/989: dwrite d2/d3/fed [0,4194304] 0 2026-03-10T06:23:18.876 INFO:tasks.workunit.client.0.vm04.stdout:4/938: truncate d2/f12 4073593 0 2026-03-10T06:23:18.885 INFO:tasks.workunit.client.0.vm04.stdout:7/871: symlink d4/df/d12/d13/db3/d110/d9c/db1/l136 0 2026-03-10T06:23:18.885 INFO:tasks.workunit.client.0.vm04.stdout:0/986: truncate d0/f75 234147 0 2026-03-10T06:23:18.893 INFO:tasks.workunit.client.0.vm04.stdout:6/933: symlink d2/d3a/l130 0 2026-03-10T06:23:18.912 INFO:tasks.workunit.client.0.vm04.stdout:1/884: mkdir d0/d3/d41/d145 0 2026-03-10T06:23:18.917 INFO:tasks.workunit.client.0.vm04.stdout:5/906: truncate d4/d11/d7d/d38/d91/d55/fce 215100 0 2026-03-10T06:23:18.921 INFO:tasks.workunit.client.0.vm04.stdout:5/907: stat d4/d3b/f41 0 2026-03-10T06:23:18.964 INFO:tasks.workunit.client.0.vm04.stdout:4/939: creat d2/d32/d5c/d76/dd7/d2c/d6b/d108/dfe/f137 x:0 0 0 2026-03-10T06:23:18.973 INFO:tasks.workunit.client.0.vm04.stdout:2/889: symlink d1/db/d69/d74/d87/dcf/d8f/d48/d67/db3/d109/l110 0 2026-03-10T06:23:18.973 INFO:tasks.workunit.client.0.vm04.stdout:0/987: creat d0/d1a/db8/f155 x:0 0 0 2026-03-10T06:23:18.981 INFO:tasks.workunit.client.0.vm04.stdout:7/872: creat d4/df/d12/d13/d8b/f137 x:0 0 0 2026-03-10T06:23:18.981 INFO:tasks.workunit.client.0.vm04.stdout:6/934: symlink d2/d37/d83/dee/l131 0 2026-03-10T06:23:18.990 INFO:tasks.workunit.client.0.vm04.stdout:8/935: mknod df/d20/d25/d30/d55/de7/d103/d111/c12a 0 2026-03-10T06:23:18.990 INFO:tasks.workunit.client.0.vm04.stdout:9/990: fsync d2/d3/d18/d34/f47 0 2026-03-10T06:23:18.996 
INFO:tasks.workunit.client.0.vm04.stdout:8/936: dwrite df/d20/d25/fe0 [0,4194304] 0 2026-03-10T06:23:18.999 INFO:tasks.workunit.client.0.vm04.stdout:6/935: dwrite d2/d43/d2d/d30/d34/dae/f128 [0,4194304] 0 2026-03-10T06:23:19.003 INFO:tasks.workunit.client.0.vm04.stdout:4/940: dwrite d2/d46/fa8 [0,4194304] 0 2026-03-10T06:23:19.009 INFO:tasks.workunit.client.0.vm04.stdout:0/988: truncate d0/d1a/d20/dc2/fd3 18448 0 2026-03-10T06:23:19.009 INFO:tasks.workunit.client.0.vm04.stdout:7/873: symlink d4/df/d12/d13/d25/d30/d40/d50/df6/l138 0 2026-03-10T06:23:19.028 INFO:tasks.workunit.client.0.vm04.stdout:3/884: link d4/d6/d91/fad d4/da/df/d11/d5a/d5b/ddf/d21/d2c/f12a 0 2026-03-10T06:23:19.030 INFO:tasks.workunit.client.0.vm04.stdout:5/908: mkdir d4/d11/d7d/d38/d110/d136 0 2026-03-10T06:23:19.035 INFO:tasks.workunit.client.0.vm04.stdout:9/991: mkdir d2/de0/d166 0 2026-03-10T06:23:19.047 INFO:tasks.workunit.client.0.vm04.stdout:2/890: symlink d1/db/d69/d74/d87/dcf/d8f/l111 0 2026-03-10T06:23:19.062 INFO:tasks.workunit.client.0.vm04.stdout:1/885: dwrite d0/d3/d41/dc2/fe6 [0,4194304] 0 2026-03-10T06:23:19.067 INFO:tasks.workunit.client.0.vm04.stdout:2/891: dread d1/db/d69/d74/d87/dcf/d8f/ddc/f8e [0,4194304] 0 2026-03-10T06:23:19.088 INFO:tasks.workunit.client.0.vm04.stdout:6/936: dwrite d2/d43/d2d/d30/d1f/d3c/fb7 [0,4194304] 0 2026-03-10T06:23:19.100 INFO:tasks.workunit.client.0.vm04.stdout:7/874: link d4/df/d12/d13/l69 d4/d105/l139 0 2026-03-10T06:23:19.100 INFO:tasks.workunit.client.0.vm04.stdout:3/885: link d4/da/df/d11/l8c d4/da/df/d11/d5a/db3/d108/l12b 0 2026-03-10T06:23:19.101 INFO:tasks.workunit.client.0.vm04.stdout:5/909: link d4/d11/d7d/d38/d91/d4c/def/dd3/fe3 d4/d11/d7d/d38/d91/d4c/d98/f137 0 2026-03-10T06:23:19.110 INFO:tasks.workunit.client.0.vm04.stdout:9/992: link d2/d8/f66 d2/de0/d1d/d64/d73/d10d/f167 0 2026-03-10T06:23:19.114 INFO:tasks.workunit.client.0.vm04.stdout:2/892: creat d1/dae/d2c/d37/d40/dfa/f112 x:0 0 0 2026-03-10T06:23:19.114 
INFO:tasks.workunit.client.0.vm04.stdout:2/893: stat d1/db/d69/d74/ld5 0 2026-03-10T06:23:19.117 INFO:tasks.workunit.client.0.vm04.stdout:8/937: truncate df/d15/d29/da3/db8/dc1/d97/d67/fa4 1277895 0 2026-03-10T06:23:19.119 INFO:tasks.workunit.client.0.vm04.stdout:4/941: truncate d2/f4 1243718 0 2026-03-10T06:23:19.121 INFO:tasks.workunit.client.0.vm04.stdout:0/989: link d0/d1a/cc1 d0/d5/d25/dd/d5c/c156 0 2026-03-10T06:23:19.123 INFO:tasks.workunit.client.0.vm04.stdout:1/886: creat d0/d3/f146 x:0 0 0 2026-03-10T06:23:19.127 INFO:tasks.workunit.client.0.vm04.stdout:7/875: mknod d4/d105/c13a 0 2026-03-10T06:23:19.136 INFO:tasks.workunit.client.0.vm04.stdout:6/937: truncate d2/d37/d6e/f95 3178185 0 2026-03-10T06:23:19.140 INFO:tasks.workunit.client.0.vm04.stdout:3/886: dread d4/da/df/d11/d5a/d5b/ddf/f4b [0,4194304] 0 2026-03-10T06:23:19.145 INFO:tasks.workunit.client.0.vm04.stdout:0/990: sync 2026-03-10T06:23:19.155 INFO:tasks.workunit.client.0.vm04.stdout:9/993: write d2/de0/d1d/d64/d122/ff7 [389968,109250] 0 2026-03-10T06:23:19.159 INFO:tasks.workunit.client.0.vm04.stdout:9/994: sync 2026-03-10T06:23:19.160 INFO:tasks.workunit.client.0.vm04.stdout:9/995: sync 2026-03-10T06:23:19.161 INFO:tasks.workunit.client.0.vm04.stdout:8/938: chown df/d15/d2b/d81/d9a/dbe/df0/c100 7578395 1 2026-03-10T06:23:19.168 INFO:tasks.workunit.client.0.vm04.stdout:1/887: stat d0/d8/d46/de4/l13f 0 2026-03-10T06:23:19.168 INFO:tasks.workunit.client.0.vm04.stdout:7/876: chown d4/df/d12/d13/d25/d28/d3a/d100/d106/c116 91584157 1 2026-03-10T06:23:19.168 INFO:tasks.workunit.client.0.vm04.stdout:5/910: mknod d4/d11/d7d/d38/d110/d136/c138 0 2026-03-10T06:23:19.169 INFO:tasks.workunit.client.0.vm04.stdout:6/938: rmdir d2/d43/d9b 39 2026-03-10T06:23:19.184 INFO:tasks.workunit.client.0.vm04.stdout:2/894: link d1/dae/d11/f16 d1/db/d69/f113 0 2026-03-10T06:23:19.187 INFO:tasks.workunit.client.0.vm04.stdout:9/996: symlink d2/d3/d18/d39/d46/d55/dc3/l168 0 2026-03-10T06:23:19.188 
INFO:tasks.workunit.client.0.vm04.stdout:9/997: stat d2/d8/d22/daa/l11b 0 2026-03-10T06:23:19.192 INFO:tasks.workunit.client.0.vm04.stdout:9/998: dwrite d2/d3/f43 [0,4194304] 0 2026-03-10T06:23:19.197 INFO:tasks.workunit.client.0.vm04.stdout:8/939: mknod df/d15/d29/da3/db8/c12b 0 2026-03-10T06:23:19.200 INFO:tasks.workunit.client.0.vm04.stdout:1/888: rename d0/d3/d80/ceb to d0/d8/d46/d7a/d95/df3/c147 0 2026-03-10T06:23:19.202 INFO:tasks.workunit.client.0.vm04.stdout:5/911: fsync d4/f35 0 2026-03-10T06:23:19.203 INFO:tasks.workunit.client.0.vm04.stdout:1/889: dwrite d0/f64 [0,4194304] 0 2026-03-10T06:23:19.212 INFO:tasks.workunit.client.0.vm04.stdout:7/877: truncate d4/df/d12/d13/d25/d30/ff4 1060161 0 2026-03-10T06:23:19.227 INFO:tasks.workunit.client.0.vm04.stdout:5/912: dread d4/d3b/f71 [0,4194304] 0 2026-03-10T06:23:19.230 INFO:tasks.workunit.client.0.vm04.stdout:1/890: dread d0/d8/d46/d7a/fbc [0,4194304] 0 2026-03-10T06:23:19.235 INFO:tasks.workunit.client.0.vm04.stdout:7/878: chown d4/df/d12/d13/f4a 1213 1 2026-03-10T06:23:19.237 INFO:tasks.workunit.client.0.vm04.stdout:3/887: creat d4/da/df/d11/d5a/d5b/ddf/f12c x:0 0 0 2026-03-10T06:23:19.245 INFO:tasks.workunit.client.0.vm04.stdout:6/939: write d2/f10 [4655604,59317] 0 2026-03-10T06:23:19.246 INFO:tasks.workunit.client.0.vm04.stdout:2/895: write d1/dae/d11/f7e [3958002,19901] 0 2026-03-10T06:23:19.248 INFO:tasks.workunit.client.0.vm04.stdout:0/991: dwrite d0/d1a/f66 [0,4194304] 0 2026-03-10T06:23:19.257 INFO:tasks.workunit.client.0.vm04.stdout:0/992: dwrite d0/d1a/d20/dc2/d12d/f115 [0,4194304] 0 2026-03-10T06:23:19.265 INFO:tasks.workunit.client.0.vm04.stdout:0/993: dwrite d0/d1a/db8/f155 [0,4194304] 0 2026-03-10T06:23:19.275 INFO:tasks.workunit.client.0.vm04.stdout:9/999: write d2/d3/d18/d39/d11/da5/df5/ffc [2527271,76619] 0 2026-03-10T06:23:19.277 INFO:tasks.workunit.client.0.vm04.stdout:4/942: getdents d2/d32/d10b/d93 0 2026-03-10T06:23:19.278 INFO:tasks.workunit.client.0.vm04.stdout:4/943: dread - 
d2/d32/d10b/dc8/f117 zero size 2026-03-10T06:23:19.283 INFO:tasks.workunit.client.0.vm04.stdout:8/940: dwrite df/d15/d29/fca [0,4194304] 0 2026-03-10T06:23:19.292 INFO:tasks.workunit.client.0.vm04.stdout:5/913: mknod d4/d11/d7d/d38/d91/d4c/d98/c139 0 2026-03-10T06:23:19.292 INFO:tasks.workunit.client.0.vm04.stdout:3/888: creat d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/f12d x:0 0 0 2026-03-10T06:23:19.292 INFO:tasks.workunit.client.0.vm04.stdout:3/889: read - d4/da/df/d11/d5a/d5b/ddf/d21/d32/f125 zero size 2026-03-10T06:23:19.317 INFO:tasks.workunit.client.0.vm04.stdout:4/944: unlink d2/d32/d5c/d76/dd7/f9d 0 2026-03-10T06:23:19.331 INFO:tasks.workunit.client.0.vm04.stdout:7/879: symlink d4/df/d12/d13/d25/d28/d3a/db0/l13b 0 2026-03-10T06:23:19.336 INFO:tasks.workunit.client.0.vm04.stdout:5/914: read d4/d11/d7d/d38/d91/d4c/def/dd3/fe3 [2916971,21167] 0 2026-03-10T06:23:19.347 INFO:tasks.workunit.client.0.vm04.stdout:3/890: mkdir d4/d6/d91/d12e 0 2026-03-10T06:23:19.348 INFO:tasks.workunit.client.0.vm04.stdout:5/915: dread d4/d11/d7d/d38/d91/d55/f9e [0,4194304] 0 2026-03-10T06:23:19.350 INFO:tasks.workunit.client.0.vm04.stdout:1/891: write d0/d3/d41/d99/def/ff8 [1705761,41067] 0 2026-03-10T06:23:19.357 INFO:tasks.workunit.client.0.vm04.stdout:8/941: dwrite df/d20/f5e [0,4194304] 0 2026-03-10T06:23:19.486 INFO:tasks.workunit.client.0.vm04.stdout:0/994: truncate d0/d1a/f66 21482 0 2026-03-10T06:23:19.490 INFO:tasks.workunit.client.0.vm04.stdout:7/880: symlink d4/df/d12/d13/d25/d30/d40/l13c 0 2026-03-10T06:23:19.492 INFO:tasks.workunit.client.0.vm04.stdout:5/916: mknod d4/d11/d7d/d38/d110/d136/c13a 0 2026-03-10T06:23:19.493 INFO:tasks.workunit.client.0.vm04.stdout:5/917: readlink d4/d11/d7d/d38/d91/lf7 0 2026-03-10T06:23:19.493 INFO:tasks.workunit.client.0.vm04.stdout:6/940: creat d2/d3a/f132 x:0 0 0 2026-03-10T06:23:19.494 INFO:tasks.workunit.client.0.vm04.stdout:6/941: write d2/d43/d2d/d30/d34/f122 [3754553,101302] 0 2026-03-10T06:23:19.496 
INFO:tasks.workunit.client.0.vm04.stdout:8/942: creat df/d20/d25/d30/d65/d8f/f12c x:0 0 0 2026-03-10T06:23:19.504 INFO:tasks.workunit.client.0.vm04.stdout:8/943: dread df/d20/d25/d73/f98 [0,4194304] 0 2026-03-10T06:23:19.504 INFO:tasks.workunit.client.0.vm04.stdout:1/892: dwrite d0/d3/d41/dc2/ffc [0,4194304] 0 2026-03-10T06:23:19.508 INFO:tasks.workunit.client.0.vm04.stdout:3/891: symlink d4/da/df/d11/d5a/d5b/ddf/l12f 0 2026-03-10T06:23:19.511 INFO:tasks.workunit.client.0.vm04.stdout:2/896: rename d1/dae/d11/d14/d9f/ddb/d94/dbb/l101 to d1/dae/d2c/l114 0 2026-03-10T06:23:19.524 INFO:tasks.workunit.client.0.vm04.stdout:8/944: mkdir df/d20/d25/d30/d55/de7/d12d 0 2026-03-10T06:23:19.526 INFO:tasks.workunit.client.0.vm04.stdout:4/945: creat d2/d32/d5c/d76/dd7/f138 x:0 0 0 2026-03-10T06:23:19.527 INFO:tasks.workunit.client.0.vm04.stdout:4/946: write d2/d32/d94/d99/ddc/fea [4871948,100991] 0 2026-03-10T06:23:19.528 INFO:tasks.workunit.client.0.vm04.stdout:4/947: dread - d2/d32/d5c/d76/dd7/da3/f11e zero size 2026-03-10T06:23:19.535 INFO:tasks.workunit.client.0.vm04.stdout:7/881: mkdir d4/df/d12/d13/d25/d28/d3a/d100/d13d 0 2026-03-10T06:23:19.536 INFO:tasks.workunit.client.0.vm04.stdout:7/882: chown d4/df/d12/d13/d25/d28 29919 1 2026-03-10T06:23:19.537 INFO:tasks.workunit.client.0.vm04.stdout:7/883: fdatasync d4/df/d12/d13/d25/d30/d40/d50/f11e 0 2026-03-10T06:23:19.537 INFO:tasks.workunit.client.0.vm04.stdout:7/884: chown d4/df/d12/d13/d25/d30/c125 16058063 1 2026-03-10T06:23:19.539 INFO:tasks.workunit.client.0.vm04.stdout:0/995: rename d0/d1a/d20/df5/d47/ddd/d103/d112 to d0/d5/d25/dd/d3a/d56/d157 0 2026-03-10T06:23:19.541 INFO:tasks.workunit.client.0.vm04.stdout:2/897: chown d1/db/d69/d74/d87/cc9 41485 1 2026-03-10T06:23:19.541 INFO:tasks.workunit.client.0.vm04.stdout:5/918: symlink d4/d11/d7d/d38/l13b 0 2026-03-10T06:23:19.542 INFO:tasks.workunit.client.0.vm04.stdout:6/942: mknod d2/d43/d9b/c133 0 2026-03-10T06:23:19.544 INFO:tasks.workunit.client.0.vm04.stdout:8/945: 
creat df/d15/d29/da3/f12e x:0 0 0 2026-03-10T06:23:19.545 INFO:tasks.workunit.client.0.vm04.stdout:4/948: creat d2/d32/d5c/d76/f139 x:0 0 0 2026-03-10T06:23:19.554 INFO:tasks.workunit.client.0.vm04.stdout:3/892: dwrite d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d7f/fe9 [0,4194304] 0 2026-03-10T06:23:19.555 INFO:tasks.workunit.client.0.vm04.stdout:3/893: write d4/da/df/d11/d5a/d5b/ddf/f12c [300461,103206] 0 2026-03-10T06:23:19.556 INFO:tasks.workunit.client.0.vm04.stdout:3/894: truncate d4/deb/f11e 506573 0 2026-03-10T06:23:19.567 INFO:tasks.workunit.client.0.vm04.stdout:7/885: creat d4/df/d12/d34/dbd/f13e x:0 0 0 2026-03-10T06:23:19.580 INFO:tasks.workunit.client.0.vm04.stdout:8/946: dread df/d15/d29/f3e [0,4194304] 0 2026-03-10T06:23:19.581 INFO:tasks.workunit.client.0.vm04.stdout:8/947: chown df/d15/d29/df8/d102/dab/fdf 0 1 2026-03-10T06:23:19.583 INFO:tasks.workunit.client.0.vm04.stdout:3/895: mknod d4/d6/d91/c130 0 2026-03-10T06:23:19.588 INFO:tasks.workunit.client.0.vm04.stdout:2/898: dwrite d1/dae/d2c/d37/f52 [0,4194304] 0 2026-03-10T06:23:19.590 INFO:tasks.workunit.client.0.vm04.stdout:0/996: mkdir d0/d1a/d20/df5/d158 0 2026-03-10T06:23:19.591 INFO:tasks.workunit.client.0.vm04.stdout:0/997: write d0/d5/d25/dd/d3a/d56/fa7 [157219,43526] 0 2026-03-10T06:23:19.593 INFO:tasks.workunit.client.0.vm04.stdout:5/919: write d4/d6/f87 [1639112,120526] 0 2026-03-10T06:23:19.594 INFO:tasks.workunit.client.0.vm04.stdout:5/920: write d4/d6/f87 [2469094,31735] 0 2026-03-10T06:23:19.597 INFO:tasks.workunit.client.0.vm04.stdout:1/893: link d0/d8/c12 d0/d8/d46/db3/c148 0 2026-03-10T06:23:19.600 INFO:tasks.workunit.client.0.vm04.stdout:4/949: symlink d2/d32/d5c/d76/dd7/l13a 0 2026-03-10T06:23:19.609 INFO:tasks.workunit.client.0.vm04.stdout:2/899: rename d1/dae/d2c/d37/d59/l8c to d1/db/d69/dcd/l115 0 2026-03-10T06:23:19.610 INFO:tasks.workunit.client.0.vm04.stdout:0/998: read d0/d1a/f66 [21238,69191] 0 2026-03-10T06:23:19.616 INFO:tasks.workunit.client.0.vm04.stdout:6/943: creat 
d2/d43/d2d/f134 x:0 0 0 2026-03-10T06:23:19.616 INFO:tasks.workunit.client.0.vm04.stdout:4/950: read d2/d32/d5c/d76/dd7/d2c/f2e [58527,50499] 0 2026-03-10T06:23:19.616 INFO:tasks.workunit.client.0.vm04.stdout:8/948: mknod df/d15/d29/c12f 0 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: pgmap v38: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 34 MiB/s rd, 84 MiB/s wr, 206 op/s 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:19 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.618 INFO:tasks.workunit.client.0.vm04.stdout:2/900: creat d1/dae/d2c/d37/d40/dfa/f116 x:0 0 0 2026-03-10T06:23:19.628 INFO:tasks.workunit.client.0.vm04.stdout:3/896: write d4/da/df/d11/d5a/d5b/ddf/f23 [3898040,61798] 0 2026-03-10T06:23:19.629 INFO:tasks.workunit.client.0.vm04.stdout:7/886: truncate d4/f5 2757590 0 2026-03-10T06:23:19.633 INFO:tasks.workunit.client.0.vm04.stdout:5/921: mknod d4/d6/c13c 0 2026-03-10T06:23:19.634 INFO:tasks.workunit.client.0.vm04.stdout:0/999: read d0/d1a/d20/df5/d47/f7b [1140652,88585] 0 2026-03-10T06:23:19.635 INFO:tasks.workunit.client.0.vm04.stdout:1/894: mknod d0/d8/d46/d7a/d95/c149 0 2026-03-10T06:23:19.635 INFO:tasks.workunit.client.0.vm04.stdout:1/895: fdatasync d0/d3/f24 0 2026-03-10T06:23:19.639 INFO:tasks.workunit.client.0.vm04.stdout:2/901: truncate d1/dae/f24 3524932 0 2026-03-10T06:23:19.647 INFO:tasks.workunit.client.0.vm04.stdout:4/951: dread d2/d32/d5c/f4b [0,4194304] 0 2026-03-10T06:23:19.649 INFO:tasks.workunit.client.0.vm04.stdout:8/949: dread df/d20/d25/d30/d65/f94 [0,4194304] 0 2026-03-10T06:23:19.659 INFO:tasks.workunit.client.0.vm04.stdout:3/897: write d4/da/df/d11/d5a/d5b/fd9 [412497,15236] 0 2026-03-10T06:23:19.660 INFO:tasks.workunit.client.0.vm04.stdout:7/887: creat d4/df/d12/d13/db3/d110/d9c/f13f x:0 0 0 2026-03-10T06:23:19.661 INFO:tasks.workunit.client.0.vm04.stdout:6/944: truncate d2/d8/f11 3178101 0 2026-03-10T06:23:19.667 
INFO:tasks.workunit.client.0.vm04.stdout:3/898: rmdir d4/da/df/d11/d5a/d5b/ddf/d21 39 2026-03-10T06:23:19.679 INFO:tasks.workunit.client.0.vm04.stdout:1/896: getdents d0/d3/d41/dc2/d13a 0 2026-03-10T06:23:19.679 INFO:tasks.workunit.client.0.vm04.stdout:7/888: unlink d4/df/d12/d13/f1e 0 2026-03-10T06:23:19.679 INFO:tasks.workunit.client.0.vm04.stdout:3/899: chown d4/da/la6 10902107 1 2026-03-10T06:23:19.680 INFO:tasks.workunit.client.0.vm04.stdout:4/952: rename d2/c24 to d2/d32/d10b/c13b 0 2026-03-10T06:23:19.680 INFO:tasks.workunit.client.0.vm04.stdout:7/889: stat d4/df/d12/d13/d25/d28/d3a/d58 0 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: pgmap v38: 65 pgs: 65 active+clean; 3.0 GiB data, 9.9 GiB used, 110 GiB / 120 GiB avail; 34 MiB/s rd, 84 MiB/s wr, 206 op/s 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard 
get-alertmanager-api-host"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:19 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:19.681 INFO:tasks.workunit.client.0.vm04.stdout:3/900: dwrite d4/da/df/d11/d5a/d5b/ddf/f12c [0,4194304] 0 2026-03-10T06:23:19.681 INFO:tasks.workunit.client.0.vm04.stdout:2/902: sync 2026-03-10T06:23:19.691 INFO:tasks.workunit.client.0.vm04.stdout:1/897: dread - d0/d3/d80/f114 zero size 2026-03-10T06:23:19.694 INFO:tasks.workunit.client.0.vm04.stdout:4/953: rename d2/d32/d5c/d76/dd7/d2c/d6b/d108/c12b to d2/d32/d10b/dc8/c13c 0 2026-03-10T06:23:19.694 INFO:tasks.workunit.client.0.vm04.stdout:7/890: creat d4/df/d12/d13/db3/d110/d9c/db1/dc4/f140 x:0 0 0 2026-03-10T06:23:19.695 INFO:tasks.workunit.client.0.vm04.stdout:2/903: mknod d1/dae/d11/d14/d9f/ddb/c117 0 2026-03-10T06:23:19.699 INFO:tasks.workunit.client.0.vm04.stdout:5/922: write d4/d11/d7d/fa6 [2240452,91782] 0 2026-03-10T06:23:19.708 INFO:tasks.workunit.client.0.vm04.stdout:8/950: dwrite df/d20/d25/faf [0,4194304] 0 2026-03-10T06:23:19.709 INFO:tasks.workunit.client.0.vm04.stdout:6/945: dwrite d2/d43/d2d/d30/d1f/d3c/d75/f59 [0,4194304] 0 2026-03-10T06:23:19.710 INFO:tasks.workunit.client.0.vm04.stdout:8/951: stat df/d15/d29/da3/c108 0 2026-03-10T06:23:19.713 INFO:tasks.workunit.client.0.vm04.stdout:1/898: creat d0/d3/d41/d99/f14a x:0 0 0 2026-03-10T06:23:19.735 INFO:tasks.workunit.client.0.vm04.stdout:3/901: 
symlink d4/d6/d92/l131 0 2026-03-10T06:23:19.735 INFO:tasks.workunit.client.0.vm04.stdout:4/954: creat d2/d32/d5c/d76/dd7/d31/d42/db9/def/d133/f13d x:0 0 0 2026-03-10T06:23:19.736 INFO:tasks.workunit.client.0.vm04.stdout:3/902: chown d4/d6/d91/d12e 2296 1 2026-03-10T06:23:19.743 INFO:tasks.workunit.client.0.vm04.stdout:3/903: dwrite d4/deb/f111 [0,4194304] 0 2026-03-10T06:23:19.747 INFO:tasks.workunit.client.0.vm04.stdout:2/904: dread d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/f93 [0,4194304] 0 2026-03-10T06:23:19.747 INFO:tasks.workunit.client.0.vm04.stdout:8/952: rename df/d15/d29/da3/db8/lc7 to df/d20/d25/d30/d55/l130 0 2026-03-10T06:23:19.747 INFO:tasks.workunit.client.0.vm04.stdout:1/899: fdatasync d0/d3/d41/f75 0 2026-03-10T06:23:19.761 INFO:tasks.workunit.client.0.vm04.stdout:5/923: fdatasync d4/d11/d7d/d38/d91/d4c/d9d/ffd 0 2026-03-10T06:23:19.761 INFO:tasks.workunit.client.0.vm04.stdout:2/905: unlink d1/dae/d2c/d37/ffd 0 2026-03-10T06:23:19.764 INFO:tasks.workunit.client.0.vm04.stdout:8/953: mknod df/d20/d25/d73/c131 0 2026-03-10T06:23:19.766 INFO:tasks.workunit.client.0.vm04.stdout:7/891: rmdir d4/df/d12/d13/db3/d110/d102/d10b 0 2026-03-10T06:23:19.766 INFO:tasks.workunit.client.0.vm04.stdout:1/900: dread - d0/d3/d80/ff5 zero size 2026-03-10T06:23:19.767 INFO:tasks.workunit.client.0.vm04.stdout:8/954: readlink df/d15/d29/d89/lb7 0 2026-03-10T06:23:19.771 INFO:tasks.workunit.client.0.vm04.stdout:1/901: dread - d0/d3/d41/d99/def/f128 zero size 2026-03-10T06:23:19.779 INFO:tasks.workunit.client.0.vm04.stdout:7/892: dread d4/df/d12/d13/d25/d30/d40/d50/f5b [0,4194304] 0 2026-03-10T06:23:19.791 INFO:tasks.workunit.client.0.vm04.stdout:6/946: write d2/d3a/f57 [730059,48946] 0 2026-03-10T06:23:19.792 INFO:tasks.workunit.client.0.vm04.stdout:5/924: dread - d4/d11/d7d/dab/f105 zero size 2026-03-10T06:23:19.793 INFO:tasks.workunit.client.0.vm04.stdout:7/893: sync 2026-03-10T06:23:19.794 INFO:tasks.workunit.client.0.vm04.stdout:6/947: rename d2/d43/d2d/d30/d1f/d3c to 
d2/d43/d2d/d30/d1f/d3c/d75/d135 22 2026-03-10T06:23:19.802 INFO:tasks.workunit.client.0.vm04.stdout:3/904: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d132 0 2026-03-10T06:23:19.803 INFO:tasks.workunit.client.0.vm04.stdout:4/955: write d2/d32/d5c/d76/dd7/d31/f66 [3824878,88316] 0 2026-03-10T06:23:19.804 INFO:tasks.workunit.client.0.vm04.stdout:1/902: creat d0/d112/d13d/f14b x:0 0 0 2026-03-10T06:23:19.813 INFO:tasks.workunit.client.0.vm04.stdout:2/906: dread d1/dae/d11/d14/ffe [0,4194304] 0 2026-03-10T06:23:19.816 INFO:tasks.workunit.client.0.vm04.stdout:6/948: creat d2/d97/f136 x:0 0 0 2026-03-10T06:23:19.818 INFO:tasks.workunit.client.0.vm04.stdout:6/949: write d2/d43/d2d/d30/f4a [2344188,82162] 0 2026-03-10T06:23:19.823 INFO:tasks.workunit.client.0.vm04.stdout:1/903: dread d0/d3/d80/f91 [4194304,4194304] 0 2026-03-10T06:23:19.827 INFO:tasks.workunit.client.0.vm04.stdout:5/925: truncate d4/d11/d7d/dae/df3/f11e 989356 0 2026-03-10T06:23:19.828 INFO:tasks.workunit.client.0.vm04.stdout:7/894: write d4/df/d12/d13/d25/d30/d40/d50/f62 [1611519,128757] 0 2026-03-10T06:23:19.834 INFO:tasks.workunit.client.0.vm04.stdout:8/955: dwrite df/d20/d25/d30/d55/f8d [4194304,4194304] 0 2026-03-10T06:23:19.836 INFO:tasks.workunit.client.0.vm04.stdout:7/895: chown d4/df/d12/d13/d25/d30/d40/d50/df6/d114 910796 1 2026-03-10T06:23:19.850 INFO:tasks.workunit.client.0.vm04.stdout:4/956: dwrite d2/d32/d5c/d76/dd7/d31/d42/db9/fb0 [0,4194304] 0 2026-03-10T06:23:19.854 INFO:tasks.workunit.client.0.vm04.stdout:7/896: creat d4/df/d12/d13/db3/ded/f141 x:0 0 0 2026-03-10T06:23:19.855 INFO:tasks.workunit.client.0.vm04.stdout:5/926: fdatasync d4/d11/d7d/d38/d91/d4c/f88 0 2026-03-10T06:23:19.863 INFO:tasks.workunit.client.0.vm04.stdout:4/957: symlink d2/d32/d5c/d76/dd7/d31/d42/db9/def/l13e 0 2026-03-10T06:23:19.866 INFO:tasks.workunit.client.0.vm04.stdout:8/956: dwrite df/f66 [0,4194304] 0 2026-03-10T06:23:19.870 INFO:tasks.workunit.client.0.vm04.stdout:3/905: truncate 
d4/da/df/d11/d5a/d5b/ddf/d21/d32/d8e/ffe 642969 0 2026-03-10T06:23:19.872 INFO:tasks.workunit.client.0.vm04.stdout:1/904: write d0/d3/f50 [22759,102853] 0 2026-03-10T06:23:19.873 INFO:tasks.workunit.client.0.vm04.stdout:2/907: write d1/dae/f24 [3916118,60206] 0 2026-03-10T06:23:19.873 INFO:tasks.workunit.client.0.vm04.stdout:6/950: truncate d2/d43/d2d/d30/d1f/d3c/d75/f92 2240960 0 2026-03-10T06:23:19.877 INFO:tasks.workunit.client.0.vm04.stdout:3/906: mkdir d4/da/df/d11/d5a/d5b/ddf/d89/d133 0 2026-03-10T06:23:19.879 INFO:tasks.workunit.client.0.vm04.stdout:1/905: fdatasync d0/d8/fab 0 2026-03-10T06:23:19.882 INFO:tasks.workunit.client.0.vm04.stdout:8/957: read df/d20/f22 [377179,95460] 0 2026-03-10T06:23:19.888 INFO:tasks.workunit.client.0.vm04.stdout:7/897: rmdir d4/df/d12/d13/db3/d110/d102 0 2026-03-10T06:23:19.891 INFO:tasks.workunit.client.0.vm04.stdout:7/898: dwrite d4/df/d12/d13/db3/fcf [0,4194304] 0 2026-03-10T06:23:19.894 INFO:tasks.workunit.client.0.vm04.stdout:1/906: symlink d0/d3/d41/dc2/l14c 0 2026-03-10T06:23:19.896 INFO:tasks.workunit.client.0.vm04.stdout:1/907: write d0/d8/d46/d7a/d95/f140 [938549,26800] 0 2026-03-10T06:23:19.897 INFO:tasks.workunit.client.0.vm04.stdout:1/908: write d0/d3/f4e [513349,71273] 0 2026-03-10T06:23:19.904 INFO:tasks.workunit.client.0.vm04.stdout:6/951: creat d2/d3a/f137 x:0 0 0 2026-03-10T06:23:19.910 INFO:tasks.workunit.client.0.vm04.stdout:1/909: symlink d0/d8/d46/d7a/d95/l14d 0 2026-03-10T06:23:19.910 INFO:tasks.workunit.client.0.vm04.stdout:1/910: chown d0/d3/d41/c68 219421379 1 2026-03-10T06:23:19.919 INFO:tasks.workunit.client.0.vm04.stdout:5/927: dwrite d4/f35 [0,4194304] 0 2026-03-10T06:23:19.919 INFO:tasks.workunit.client.0.vm04.stdout:4/958: dwrite d2/d32/d5c/d76/dd7/d2c/d6b/d108/fc7 [0,4194304] 0 2026-03-10T06:23:19.920 INFO:tasks.workunit.client.0.vm04.stdout:5/928: read d4/f19 [5458354,48379] 0 2026-03-10T06:23:19.928 INFO:tasks.workunit.client.0.vm04.stdout:2/908: write d1/db/d69/d74/d87/dcf/fc1 
[857893,39794] 0 2026-03-10T06:23:19.929 INFO:tasks.workunit.client.0.vm04.stdout:8/958: getdents df/d15/d29/df8/d102/d105 0 2026-03-10T06:23:19.929 INFO:tasks.workunit.client.0.vm04.stdout:8/959: truncate df/d20/d25/f10b 912476 0 2026-03-10T06:23:19.930 INFO:tasks.workunit.client.0.vm04.stdout:8/960: write df/d15/d2b/ff3 [1673976,3906] 0 2026-03-10T06:23:19.931 INFO:tasks.workunit.client.0.vm04.stdout:8/961: fdatasync df/d15/d29/da3/db8/dc1/f90 0 2026-03-10T06:23:19.940 INFO:tasks.workunit.client.0.vm04.stdout:7/899: write d4/df/d12/d34/f46 [2352151,127850] 0 2026-03-10T06:23:19.940 INFO:tasks.workunit.client.0.vm04.stdout:6/952: write d2/d37/d6e/de6/f100 [557601,10433] 0 2026-03-10T06:23:19.945 INFO:tasks.workunit.client.0.vm04.stdout:3/907: link d4/da/df/d11/f48 d4/da/f134 0 2026-03-10T06:23:19.962 INFO:tasks.workunit.client.0.vm04.stdout:2/909: chown d1/dae/ld0 227 1 2026-03-10T06:23:19.967 INFO:tasks.workunit.client.0.vm04.stdout:2/910: dwrite d1/dae/d2c/d37/d40/dfa/f112 [0,4194304] 0 2026-03-10T06:23:19.981 INFO:tasks.workunit.client.0.vm04.stdout:8/962: rename df/d20/d25/d30/lfb to df/d20/d25/d73/l132 0 2026-03-10T06:23:19.982 INFO:tasks.workunit.client.0.vm04.stdout:1/911: mknod d0/d8/d46/db3/dd2/d100/d119/c14e 0 2026-03-10T06:23:19.989 INFO:tasks.workunit.client.0.vm04.stdout:4/959: mknod d2/d32/d5c/c13f 0 2026-03-10T06:23:19.990 INFO:tasks.workunit.client.0.vm04.stdout:4/960: write d2/d32/d10b/d93/f9c [351519,36289] 0 2026-03-10T06:23:19.991 INFO:tasks.workunit.client.0.vm04.stdout:4/961: chown d2/d32/d5c/d76/dd7/d2c/d6b/f96 236289884 1 2026-03-10T06:23:19.999 INFO:tasks.workunit.client.0.vm04.stdout:7/900: write d4/df/d12/d13/f10f [33088,8912] 0 2026-03-10T06:23:20.000 INFO:tasks.workunit.client.0.vm04.stdout:7/901: write d4/df/d12/d34/dbd/f13e [441926,38707] 0 2026-03-10T06:23:20.004 INFO:tasks.workunit.client.0.vm04.stdout:5/929: rename d4/d11/d7d/d38/d91/d4c/def/dd3 to d4/d11/d7d/d38/d91/d4c/def/ddc/d13d 0 2026-03-10T06:23:20.007 
INFO:tasks.workunit.client.0.vm04.stdout:1/912: symlink d0/d8/d46/d7a/l14f 0 2026-03-10T06:23:20.008 INFO:tasks.workunit.client.0.vm04.stdout:6/953: mknod d2/d43/d2d/d30/d1f/d11c/c138 0 2026-03-10T06:23:20.012 INFO:tasks.workunit.client.0.vm04.stdout:3/908: fdatasync d4/da/df/d11/f48 0 2026-03-10T06:23:20.022 INFO:tasks.workunit.client.0.vm04.stdout:6/954: dread d2/d3a/f57 [0,4194304] 0 2026-03-10T06:23:20.023 INFO:tasks.workunit.client.0.vm04.stdout:7/902: dread d4/df/d12/d13/d8b/fa5 [0,4194304] 0 2026-03-10T06:23:20.023 INFO:tasks.workunit.client.0.vm04.stdout:6/955: rename d2/d43 to d2/d43/d9b/d129/d139 22 2026-03-10T06:23:20.026 INFO:tasks.workunit.client.0.vm04.stdout:6/956: chown d2/d37/l41 2 1 2026-03-10T06:23:20.031 INFO:tasks.workunit.client.0.vm04.stdout:8/963: write df/d15/d29/da3/fbf [4448720,32994] 0 2026-03-10T06:23:20.035 INFO:tasks.workunit.client.0.vm04.stdout:8/964: dwrite df/d15/d29/fca [0,4194304] 0 2026-03-10T06:23:20.050 INFO:tasks.workunit.client.0.vm04.stdout:2/911: creat d1/dae/d11/f118 x:0 0 0 2026-03-10T06:23:20.052 INFO:tasks.workunit.client.0.vm04.stdout:5/930: fdatasync d4/d11/fb9 0 2026-03-10T06:23:20.055 INFO:tasks.workunit.client.0.vm04.stdout:4/962: creat d2/d32/f140 x:0 0 0 2026-03-10T06:23:20.055 INFO:tasks.workunit.client.0.vm04.stdout:3/909: fdatasync d4/da/df/fc1 0 2026-03-10T06:23:20.061 INFO:tasks.workunit.client.0.vm04.stdout:5/931: fsync d4/d6/d80/d84/d99/fe9 0 2026-03-10T06:23:20.071 INFO:tasks.workunit.client.0.vm04.stdout:4/963: mkdir d2/d32/d5c/d76/dd7/d2c/d6b/d108/dfe/d141 0 2026-03-10T06:23:20.072 INFO:tasks.workunit.client.0.vm04.stdout:7/903: write d4/df/d12/dd4/f11d [900508,32208] 0 2026-03-10T06:23:20.079 INFO:tasks.workunit.client.0.vm04.stdout:3/910: mknod d4/d6/d38/c135 0 2026-03-10T06:23:20.082 INFO:tasks.workunit.client.0.vm04.stdout:4/964: dread d2/fce [0,4194304] 0 2026-03-10T06:23:20.083 INFO:tasks.workunit.client.0.vm04.stdout:4/965: dread d2/d46/f111 [0,4194304] 0 2026-03-10T06:23:20.083 
INFO:tasks.workunit.client.0.vm04.stdout:4/966: fsync d2/d32/d5c/f132 0 2026-03-10T06:23:20.085 INFO:tasks.workunit.client.0.vm04.stdout:2/912: mkdir d1/db/d69/d74/d87/dcf/d8f/d119 0 2026-03-10T06:23:20.086 INFO:tasks.workunit.client.0.vm04.stdout:8/965: write df/d15/d29/da3/db8/dc1/d97/d67/f92 [264411,49164] 0 2026-03-10T06:23:20.090 INFO:tasks.workunit.client.0.vm04.stdout:1/913: getdents d0/d8/d46 0 2026-03-10T06:23:20.093 INFO:tasks.workunit.client.0.vm04.stdout:1/914: dwrite d0/d8/d46/d7a/d95/df3/f138 [0,4194304] 0 2026-03-10T06:23:20.102 INFO:tasks.workunit.client.0.vm04.stdout:3/911: readlink d4/da/df/d11/l8c 0 2026-03-10T06:23:20.102 INFO:tasks.workunit.client.0.vm04.stdout:4/967: fdatasync d2/d32/d5c/d76/dd7/d2c/d6b/f96 0 2026-03-10T06:23:20.102 INFO:tasks.workunit.client.0.vm04.stdout:6/957: getdents d2/d37/d83 0 2026-03-10T06:23:20.103 INFO:tasks.workunit.client.0.vm04.stdout:3/912: chown d4/d6/d99/d10c/d118 2489155 1 2026-03-10T06:23:20.104 INFO:tasks.workunit.client.0.vm04.stdout:8/966: write df/d15/d2b/f4a [2906659,28473] 0 2026-03-10T06:23:20.107 INFO:tasks.workunit.client.0.vm04.stdout:1/915: creat d0/d8/d46/de4/f150 x:0 0 0 2026-03-10T06:23:20.107 INFO:tasks.workunit.client.0.vm04.stdout:4/968: dwrite d2/d46/fa8 [4194304,4194304] 0 2026-03-10T06:23:20.108 INFO:tasks.workunit.client.0.vm04.stdout:7/904: sync 2026-03-10T06:23:20.112 INFO:tasks.workunit.client.0.vm04.stdout:6/958: truncate d2/d43/d2d/d30/f32 612820 0 2026-03-10T06:23:20.118 INFO:tasks.workunit.client.0.vm04.stdout:1/916: truncate d0/d3/d41/dcb/f139 776171 0 2026-03-10T06:23:20.118 INFO:tasks.workunit.client.0.vm04.stdout:7/905: dwrite d4/df/d12/d13/db3/d110/d9c/db1/ff7 [0,4194304] 0 2026-03-10T06:23:20.120 INFO:tasks.workunit.client.0.vm04.stdout:5/932: rename d4/d11/d7d/d52/f96 to d4/d11/d7d/f13e 0 2026-03-10T06:23:20.120 INFO:tasks.workunit.client.0.vm04.stdout:5/933: stat d4/d11/d7d/d38/d91/d4c/d98/dc0/f123 0 2026-03-10T06:23:20.126 INFO:tasks.workunit.client.0.vm04.stdout:4/969: 
truncate d2/d46/f87 1203672 0 2026-03-10T06:23:20.129 INFO:tasks.workunit.client.0.vm04.stdout:3/913: mknod d4/da/df/d11/d5a/d5b/ddf/d21/d2c/c136 0 2026-03-10T06:23:20.130 INFO:tasks.workunit.client.0.vm04.stdout:2/913: creat d1/f11a x:0 0 0 2026-03-10T06:23:20.133 INFO:tasks.workunit.client.0.vm04.stdout:4/970: write d2/d32/d94/d99/d101/f12d [805299,106792] 0 2026-03-10T06:23:20.135 INFO:tasks.workunit.client.0.vm04.stdout:2/914: dread - d1/dae/d11/d14/d9f/ddb/d94/dbb/ff8 zero size 2026-03-10T06:23:20.148 INFO:tasks.workunit.client.0.vm04.stdout:7/906: unlink d4/fa 0 2026-03-10T06:23:20.157 INFO:tasks.workunit.client.0.vm04.stdout:4/971: rmdir d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107 39 2026-03-10T06:23:20.157 INFO:tasks.workunit.client.0.vm04.stdout:4/972: stat d2/d32/d5c/de2/d122 0 2026-03-10T06:23:20.158 INFO:tasks.workunit.client.0.vm04.stdout:4/973: readlink d2/d8/l63 0 2026-03-10T06:23:20.159 INFO:tasks.workunit.client.0.vm04.stdout:2/915: creat d1/dae/d2c/d37/d40/f11b x:0 0 0 2026-03-10T06:23:20.161 INFO:tasks.workunit.client.0.vm04.stdout:7/907: sync 2026-03-10T06:23:20.161 INFO:tasks.workunit.client.0.vm04.stdout:7/908: chown d4/df/d12/d13/d25/d30/d40/c96 127255 1 2026-03-10T06:23:20.162 INFO:tasks.workunit.client.0.vm04.stdout:7/909: chown d4/df/d12/d13/d25/d28/d3a/db0/l13b 436 1 2026-03-10T06:23:20.163 INFO:tasks.workunit.client.0.vm04.stdout:8/967: write df/d20/d25/d30/d65/f80 [1202635,32816] 0 2026-03-10T06:23:20.167 INFO:tasks.workunit.client.0.vm04.stdout:1/917: write d0/d3/d41/d4b/d5b/f5c [2008486,63988] 0 2026-03-10T06:23:20.169 INFO:tasks.workunit.client.0.vm04.stdout:6/959: dwrite d2/d3a/d9c/f102 [0,4194304] 0 2026-03-10T06:23:20.171 INFO:tasks.workunit.client.0.vm04.stdout:6/960: chown d2/d43/d2d/d30/dc0/l12f 3828 1 2026-03-10T06:23:20.171 INFO:tasks.workunit.client.0.vm04.stdout:6/961: chown d2/d43/d2d/d7c/f8b 51250620 1 2026-03-10T06:23:20.173 INFO:tasks.workunit.client.0.vm04.stdout:7/910: dread d4/df/d12/d13/db3/ded/f101 [0,4194304] 0 
2026-03-10T06:23:20.173 INFO:tasks.workunit.client.0.vm04.stdout:7/911: chown d4/df/d12/d34 1236118270 1 2026-03-10T06:23:20.179 INFO:tasks.workunit.client.0.vm04.stdout:2/916: symlink d1/db/d69/d74/d87/l11c 0 2026-03-10T06:23:20.180 INFO:tasks.workunit.client.0.vm04.stdout:4/974: creat d2/d32/d5c/d76/dd7/d31/d42/db9/f142 x:0 0 0 2026-03-10T06:23:20.185 INFO:tasks.workunit.client.0.vm04.stdout:5/934: creat d4/d11/d7d/f13f x:0 0 0 2026-03-10T06:23:20.190 INFO:tasks.workunit.client.0.vm04.stdout:5/935: chown d4/d6/f93 0 1 2026-03-10T06:23:20.190 INFO:tasks.workunit.client.0.vm04.stdout:5/936: chown d4/d6/d81/db6/f11c 486289 1 2026-03-10T06:23:20.259 INFO:tasks.workunit.client.0.vm04.stdout:7/912: write d4/df/d12/d34/d63/f9a [276269,33721] 0 2026-03-10T06:23:20.294 INFO:tasks.workunit.client.0.vm04.stdout:4/975: dread d2/f12 [0,4194304] 0 2026-03-10T06:23:20.295 INFO:tasks.workunit.client.0.vm04.stdout:2/917: mkdir d1/dae/d11/d14/d9f/ddb/d94/dbb/de8/d11d 0 2026-03-10T06:23:20.306 INFO:tasks.workunit.client.0.vm04.stdout:3/914: getdents d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104 0 2026-03-10T06:23:20.306 INFO:tasks.workunit.client.0.vm04.stdout:3/915: write d4/deb/f111 [112222,54123] 0 2026-03-10T06:23:20.310 INFO:tasks.workunit.client.0.vm04.stdout:4/976: mknod d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107/c143 0 2026-03-10T06:23:20.311 INFO:tasks.workunit.client.0.vm04.stdout:5/937: creat d4/d11/d7d/d38/d91/d4c/def/ddc/d108/f140 x:0 0 0 2026-03-10T06:23:20.407 INFO:tasks.workunit.client.0.vm04.stdout:3/916: mknod d4/d6/d99/c137 0 2026-03-10T06:23:20.414 INFO:tasks.workunit.client.0.vm04.stdout:5/938: write d4/d6/d81/fc3 [4144987,118544] 0 2026-03-10T06:23:20.416 INFO:tasks.workunit.client.0.vm04.stdout:3/917: mknod d4/da/df/d11/d5a/db3/c138 0 2026-03-10T06:23:20.423 INFO:tasks.workunit.client.0.vm04.stdout:3/918: truncate d4/da/df/d11/d5a/d5b/f98 4670278 0 2026-03-10T06:23:20.431 INFO:tasks.workunit.client.0.vm04.stdout:3/919: mknod d4/da/df/d11/d50/c139 0 
2026-03-10T06:23:20.435 INFO:tasks.workunit.client.0.vm04.stdout:3/920: fsync d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/f107 0 2026-03-10T06:23:20.436 INFO:tasks.workunit.client.0.vm04.stdout:3/921: chown d4/da/df/d11/d5a/d5b/ddf/la4 171 1 2026-03-10T06:23:20.443 INFO:tasks.workunit.client.0.vm04.stdout:3/922: dread d4/da/df/d11/d5a/db3/fbb [0,4194304] 0 2026-03-10T06:23:20.446 INFO:tasks.workunit.client.0.vm04.stdout:7/913: mknod d4/df/d12/d13/d25/dcb/c142 0 2026-03-10T06:23:20.446 INFO:tasks.workunit.client.0.vm04.stdout:7/914: stat d4/df/d12/d34/d63/c98 0 2026-03-10T06:23:20.449 INFO:tasks.workunit.client.0.vm04.stdout:3/923: read d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/fd6 [217340,119792] 0 2026-03-10T06:23:20.454 INFO:tasks.workunit.client.0.vm04.stdout:7/915: mknod d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/c143 0 2026-03-10T06:23:20.456 INFO:tasks.workunit.client.0.vm04.stdout:4/977: symlink d2/l144 0 2026-03-10T06:23:20.463 INFO:tasks.workunit.client.0.vm04.stdout:3/924: creat d4/d6/d99/d10c/f13a x:0 0 0 2026-03-10T06:23:20.464 INFO:tasks.workunit.client.0.vm04.stdout:3/925: write d4/da/df/d11/d5a/d5b/ddf/f12c [1609511,81692] 0 2026-03-10T06:23:20.465 INFO:tasks.workunit.client.0.vm04.stdout:3/926: write d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/f107 [93438,113437] 0 2026-03-10T06:23:20.471 INFO:tasks.workunit.client.0.vm04.stdout:4/978: mknod d2/d32/d5c/de2/d102/c145 0 2026-03-10T06:23:20.478 INFO:tasks.workunit.client.0.vm04.stdout:4/979: creat d2/d32/d5c/de2/d102/f146 x:0 0 0 2026-03-10T06:23:20.492 INFO:tasks.workunit.client.0.vm04.stdout:8/968: rename df/d15/d29/da3/db8/dc1/d97/d67/f92 to df/f133 0 2026-03-10T06:23:20.493 INFO:tasks.workunit.client.0.vm04.stdout:1/918: rename d0/d3/d80 to d0/d3/d80/d151 22 2026-03-10T06:23:20.499 INFO:tasks.workunit.client.0.vm04.stdout:1/919: chown d0/d3/d41/d4b/lde 167694 1 2026-03-10T06:23:20.500 INFO:tasks.workunit.client.0.vm04.stdout:1/920: write d0/d3/d41/dcb/f135 [134004,21474] 0 2026-03-10T06:23:20.503 
INFO:tasks.workunit.client.0.vm04.stdout:1/921: dread d0/d8/d46/f82 [0,4194304] 0 2026-03-10T06:23:20.503 INFO:tasks.workunit.client.0.vm04.stdout:1/922: fsync d0/d3/d41/dcb/f12b 0 2026-03-10T06:23:20.506 INFO:tasks.workunit.client.0.vm04.stdout:8/969: creat df/d15/d29/f134 x:0 0 0 2026-03-10T06:23:20.507 INFO:tasks.workunit.client.0.vm04.stdout:8/970: truncate df/d15/d29/df8/d102/dab/f125 750956 0 2026-03-10T06:23:20.512 INFO:tasks.workunit.client.0.vm04.stdout:8/971: rmdir df/d20/d25/d30/dc5 39 2026-03-10T06:23:20.513 INFO:tasks.workunit.client.0.vm04.stdout:2/918: rename d1/db/d69/d74/d87/dcf/d8f/f25 to d1/dae/d11/f11e 0 2026-03-10T06:23:20.515 INFO:tasks.workunit.client.0.vm04.stdout:8/972: creat df/d20/d25/d30/d65/f135 x:0 0 0 2026-03-10T06:23:20.518 INFO:tasks.workunit.client.0.vm04.stdout:7/916: mknod d4/df/d12/d13/d25/d28/d3a/c144 0 2026-03-10T06:23:20.523 INFO:tasks.workunit.client.0.vm04.stdout:8/973: dread df/d20/d25/f44 [0,4194304] 0 2026-03-10T06:23:20.525 INFO:tasks.workunit.client.0.vm04.stdout:2/919: link d1/dae/d11/d14/d9f/ddb/daf/db0/cbc d1/dae/d2c/c11f 0 2026-03-10T06:23:20.528 INFO:tasks.workunit.client.0.vm04.stdout:8/974: rmdir df/d15/d29/da3/db8/dc1/dac 39 2026-03-10T06:23:20.536 INFO:tasks.workunit.client.0.vm04.stdout:2/920: fsync d1/dae/d11/d14/ffe 0 2026-03-10T06:23:20.540 INFO:tasks.workunit.client.0.vm04.stdout:8/975: creat df/d15/d29/df8/d102/dab/f136 x:0 0 0 2026-03-10T06:23:20.544 INFO:tasks.workunit.client.0.vm04.stdout:2/921: dread d1/dae/d11/d14/d9f/ddb/f7a [0,4194304] 0 2026-03-10T06:23:20.545 INFO:tasks.workunit.client.0.vm04.stdout:8/976: mknod df/d15/d29/da3/c137 0 2026-03-10T06:23:20.554 INFO:tasks.workunit.client.0.vm04.stdout:2/922: dread d1/f5 [0,4194304] 0 2026-03-10T06:23:20.555 INFO:tasks.workunit.client.0.vm04.stdout:2/923: stat d1/db/d69/d74/d87/dcf/d8f/d35/d54/fe4 0 2026-03-10T06:23:20.556 INFO:tasks.workunit.client.0.vm04.stdout:2/924: fsync d1/db/d69/d74/d87/dcf/fc1 0 2026-03-10T06:23:20.569 
INFO:tasks.workunit.client.0.vm04.stdout:6/962: unlink d2/cb 0 2026-03-10T06:23:20.570 INFO:tasks.workunit.client.0.vm04.stdout:6/963: creat d2/d43/d2d/d30/d34/d108/f13a x:0 0 0 2026-03-10T06:23:20.574 INFO:tasks.workunit.client.0.vm04.stdout:5/939: dwrite d4/f26 [0,4194304] 0 2026-03-10T06:23:20.577 INFO:tasks.workunit.client.0.vm04.stdout:6/964: dread d2/d43/d2d/d30/d1f/d3c/d75/f92 [0,4194304] 0 2026-03-10T06:23:20.577 INFO:tasks.workunit.client.0.vm04.stdout:6/965: stat d2/d43/d2d/d30/d1f/l80 0 2026-03-10T06:23:20.596 INFO:tasks.workunit.client.0.vm04.stdout:6/966: creat d2/f13b x:0 0 0 2026-03-10T06:23:20.596 INFO:tasks.workunit.client.0.vm04.stdout:7/917: getdents d4/df/d12/d13/d25/dcb 0 2026-03-10T06:23:20.599 INFO:tasks.workunit.client.0.vm04.stdout:6/967: dwrite d2/d43/d2d/d30/d34/d76/d8a/f9e [0,4194304] 0 2026-03-10T06:23:20.601 INFO:tasks.workunit.client.0.vm04.stdout:6/968: fdatasync d2/d43/d2d/f134 0 2026-03-10T06:23:20.601 INFO:tasks.workunit.client.0.vm04.stdout:6/969: chown d2/d43/d2d/d30/d1f/d3c/fb7 873 1 2026-03-10T06:23:20.605 INFO:tasks.workunit.client.0.vm04.stdout:6/970: dwrite d2/d3a/f137 [0,4194304] 0 2026-03-10T06:23:20.605 INFO:tasks.workunit.client.0.vm04.stdout:8/977: sync 2026-03-10T06:23:20.607 INFO:tasks.workunit.client.0.vm04.stdout:8/978: stat df/d20/f22 0 2026-03-10T06:23:20.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:20 vm06.local ceph-mon[58974]: Upgrade: Updating grafana.vm04 2026-03-10T06:23:20.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:20 vm06.local ceph-mon[58974]: Deploying daemon grafana.vm04 on vm04 2026-03-10T06:23:20.623 INFO:tasks.workunit.client.0.vm04.stdout:7/918: mknod d4/df/d12/d13/d25/dcb/dd2/c145 0 2026-03-10T06:23:20.630 INFO:tasks.workunit.client.0.vm04.stdout:3/927: write d4/da/df/d11/d5a/d5b/ddf/f2e [916883,85199] 0 2026-03-10T06:23:20.632 INFO:tasks.workunit.client.0.vm04.stdout:7/919: rmdir d4/d105 39 2026-03-10T06:23:20.632 INFO:tasks.workunit.client.0.vm04.stdout:7/920: fsync 
d4/df/f84 0 2026-03-10T06:23:20.633 INFO:tasks.workunit.client.0.vm04.stdout:4/980: write d2/d32/d5c/d76/dd7/d2c/d9a/fb6 [943552,83373] 0 2026-03-10T06:23:20.634 INFO:tasks.workunit.client.0.vm04.stdout:6/971: creat d2/d8/f13c x:0 0 0 2026-03-10T06:23:20.639 INFO:tasks.workunit.client.0.vm04.stdout:3/928: truncate d4/da/f134 3858622 0 2026-03-10T06:23:20.640 INFO:tasks.workunit.client.0.vm04.stdout:1/923: write d0/d8/d46/fd8 [483057,115088] 0 2026-03-10T06:23:20.641 INFO:tasks.workunit.client.0.vm04.stdout:3/929: truncate d4/da/df/d11/d5a/d5b/dff/f117 961452 0 2026-03-10T06:23:20.648 INFO:tasks.workunit.client.0.vm04.stdout:7/921: creat d4/df/d12/d34/d103/f146 x:0 0 0 2026-03-10T06:23:20.650 INFO:tasks.workunit.client.0.vm04.stdout:6/972: chown d2/d43/d2d/d7c/ce2 103243 1 2026-03-10T06:23:20.651 INFO:tasks.workunit.client.0.vm04.stdout:6/973: stat d2/d43/d2d/d30/d1f/le3 0 2026-03-10T06:23:20.652 INFO:tasks.workunit.client.0.vm04.stdout:7/922: dwrite d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/df1/f132 [0,4194304] 0 2026-03-10T06:23:20.652 INFO:tasks.workunit.client.0.vm04.stdout:6/974: truncate d2/d3a/ded/f11d 419383 0 2026-03-10T06:23:20.660 INFO:tasks.workunit.client.0.vm04.stdout:1/924: fdatasync d0/d3/fb2 0 2026-03-10T06:23:20.664 INFO:tasks.workunit.client.0.vm04.stdout:8/979: getdents df/d15/d29/df8/d102/dab 0 2026-03-10T06:23:20.665 INFO:tasks.workunit.client.0.vm04.stdout:8/980: truncate df/d15/d29/df8/d102/dab/f125 1553538 0 2026-03-10T06:23:20.665 INFO:tasks.workunit.client.0.vm04.stdout:8/981: write df/d20/d25/d30/f51 [3151684,15959] 0 2026-03-10T06:23:20.666 INFO:tasks.workunit.client.0.vm04.stdout:8/982: fsync df/d20/d25/fd6 0 2026-03-10T06:23:20.667 INFO:tasks.workunit.client.0.vm04.stdout:4/981: creat d2/d32/d5c/d76/dd7/d2c/d6b/d108/dfe/d141/f147 x:0 0 0 2026-03-10T06:23:20.672 INFO:tasks.workunit.client.0.vm04.stdout:6/975: rename d2/d8/l5b to d2/d43/d2d/d30/d1f/d3c/d85/dbf/dd3/l13d 0 2026-03-10T06:23:20.677 
INFO:tasks.workunit.client.0.vm04.stdout:6/976: write d2/d3a/ded/f11d [1281234,72074] 0 2026-03-10T06:23:20.677 INFO:tasks.workunit.client.0.vm04.stdout:2/925: write d1/dae/d11/d14/d9f/ddb/d94/f97 [4270441,44837] 0 2026-03-10T06:23:20.677 INFO:tasks.workunit.client.0.vm04.stdout:6/977: chown d2/l36 7442 1 2026-03-10T06:23:20.677 INFO:tasks.workunit.client.0.vm04.stdout:2/926: dread - d1/dae/d11/d14/d9f/ddb/d94/dbb/ff8 zero size 2026-03-10T06:23:20.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:20 vm04.local ceph-mon[51058]: Upgrade: Updating grafana.vm04 2026-03-10T06:23:20.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:20 vm04.local ceph-mon[51058]: Deploying daemon grafana.vm04 on vm04 2026-03-10T06:23:20.678 INFO:tasks.workunit.client.0.vm04.stdout:8/983: sync 2026-03-10T06:23:20.683 INFO:tasks.workunit.client.0.vm04.stdout:2/927: dread d1/dae/d11/f16 [0,4194304] 0 2026-03-10T06:23:20.693 INFO:tasks.workunit.client.0.vm04.stdout:7/923: creat d4/df/d12/d13/d25/d30/d40/d50/f147 x:0 0 0 2026-03-10T06:23:20.695 INFO:tasks.workunit.client.0.vm04.stdout:5/940: write d4/d6/f47 [2013329,106638] 0 2026-03-10T06:23:20.703 INFO:tasks.workunit.client.0.vm04.stdout:8/984: fsync df/d15/d29/d89/fd7 0 2026-03-10T06:23:20.706 INFO:tasks.workunit.client.0.vm04.stdout:4/982: mkdir d2/d32/d5c/d76/dd7/d2c/d6b/d108/d11c/d148 0 2026-03-10T06:23:20.707 INFO:tasks.workunit.client.0.vm04.stdout:4/983: write d2/d32/d5c/d76/dd7/d31/d42/db9/fb0 [1487338,11832] 0 2026-03-10T06:23:20.713 INFO:tasks.workunit.client.0.vm04.stdout:3/930: write d4/da/df/d11/d50/dc8/fcb [213905,115122] 0 2026-03-10T06:23:20.715 INFO:tasks.workunit.client.0.vm04.stdout:6/978: rename d2/d8 to d2/d43/d9b/d13e 0 2026-03-10T06:23:20.717 INFO:tasks.workunit.client.0.vm04.stdout:1/925: dwrite d0/d3/fd4 [0,4194304] 0 2026-03-10T06:23:20.721 INFO:tasks.workunit.client.0.vm04.stdout:4/984: mknod d2/d32/d5c/d76/dd7/d31/d42/db9/def/d133/c149 0 2026-03-10T06:23:20.722 
INFO:tasks.workunit.client.0.vm04.stdout:7/924: mkdir d4/df/d12/d13/d12f/d148 0 2026-03-10T06:23:20.723 INFO:tasks.workunit.client.0.vm04.stdout:5/941: mkdir d4/d6/d80/d84/d99/de6/d141 0 2026-03-10T06:23:20.725 INFO:tasks.workunit.client.0.vm04.stdout:1/926: sync 2026-03-10T06:23:20.726 INFO:tasks.workunit.client.0.vm04.stdout:4/985: sync 2026-03-10T06:23:20.726 INFO:tasks.workunit.client.0.vm04.stdout:7/925: sync 2026-03-10T06:23:20.727 INFO:tasks.workunit.client.0.vm04.stdout:4/986: write d2/d32/d5c/d76/f10d [4281579,53028] 0 2026-03-10T06:23:20.731 INFO:tasks.workunit.client.0.vm04.stdout:1/927: symlink d0/d112/l152 0 2026-03-10T06:23:20.732 INFO:tasks.workunit.client.0.vm04.stdout:7/926: mknod d4/df/d12/d13/db3/d110/d9c/c149 0 2026-03-10T06:23:20.735 INFO:tasks.workunit.client.0.vm04.stdout:6/979: link d2/d3a/d5e/ffe d2/d43/d2d/d30/d34/d76/d7e/ddc/f13f 0 2026-03-10T06:23:20.736 INFO:tasks.workunit.client.0.vm04.stdout:2/928: getdents d1 0 2026-03-10T06:23:20.738 INFO:tasks.workunit.client.0.vm04.stdout:5/942: symlink d4/d11/d7d/d38/d91/d4c/def/d12a/l142 0 2026-03-10T06:23:20.740 INFO:tasks.workunit.client.0.vm04.stdout:7/927: dread d4/df/f107 [0,4194304] 0 2026-03-10T06:23:20.742 INFO:tasks.workunit.client.0.vm04.stdout:8/985: rename df/d20/d25/d30 to df/d20/d138 0 2026-03-10T06:23:20.745 INFO:tasks.workunit.client.0.vm04.stdout:3/931: write d4/f2d [1168679,36781] 0 2026-03-10T06:23:20.747 INFO:tasks.workunit.client.0.vm04.stdout:2/929: dread - d1/db/d69/d74/d87/dcf/d8f/d35/fde zero size 2026-03-10T06:23:20.752 INFO:tasks.workunit.client.0.vm04.stdout:7/928: mkdir d4/df/d12/d13/d25/d30/d14a 0 2026-03-10T06:23:20.753 INFO:tasks.workunit.client.0.vm04.stdout:4/987: write d2/d8/f35 [4629922,62438] 0 2026-03-10T06:23:20.760 INFO:tasks.workunit.client.0.vm04.stdout:8/986: readlink df/d20/d138/d65/d8f/l11a 0 2026-03-10T06:23:20.761 INFO:tasks.workunit.client.0.vm04.stdout:6/980: unlink d2/d43/d2d/d30/d34/cc2 0 2026-03-10T06:23:20.764 
INFO:tasks.workunit.client.0.vm04.stdout:3/932: mkdir d4/d6/dc/d13b 0 2026-03-10T06:23:20.765 INFO:tasks.workunit.client.0.vm04.stdout:3/933: chown d4/fe3 30406936 1 2026-03-10T06:23:20.766 INFO:tasks.workunit.client.0.vm04.stdout:2/930: symlink d1/db/d69/d74/d87/dcf/d8f/ddc/l120 0 2026-03-10T06:23:20.766 INFO:tasks.workunit.client.0.vm04.stdout:5/943: mkdir d4/d6/d143 0 2026-03-10T06:23:20.767 INFO:tasks.workunit.client.0.vm04.stdout:5/944: chown d4/d11/d7d/d52/c64 218461 1 2026-03-10T06:23:20.770 INFO:tasks.workunit.client.0.vm04.stdout:6/981: creat d2/d43/d2d/d7c/daa/f140 x:0 0 0 2026-03-10T06:23:20.772 INFO:tasks.workunit.client.0.vm04.stdout:2/931: unlink d1/dae/d2c/f58 0 2026-03-10T06:23:20.773 INFO:tasks.workunit.client.0.vm04.stdout:2/932: dread d1/f5 [0,4194304] 0 2026-03-10T06:23:20.777 INFO:tasks.workunit.client.0.vm04.stdout:4/988: dwrite d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107/f118 [4194304,4194304] 0 2026-03-10T06:23:20.778 INFO:tasks.workunit.client.0.vm04.stdout:7/929: creat d4/df/d12/d13/d25/d28/d3a/d100/d13d/f14b x:0 0 0 2026-03-10T06:23:20.779 INFO:tasks.workunit.client.0.vm04.stdout:4/989: chown d2/d32/d5c/d76/dd7/d2c/d6b/f96 595 1 2026-03-10T06:23:20.787 INFO:tasks.workunit.client.0.vm04.stdout:6/982: mknod d2/d37/d6e/c141 0 2026-03-10T06:23:20.787 INFO:tasks.workunit.client.0.vm04.stdout:3/934: mknod d4/d6/d92/def/c13c 0 2026-03-10T06:23:20.788 INFO:tasks.workunit.client.0.vm04.stdout:2/933: fsync d1/db/d69/d74/d87/dcf/d8f/d48/d67/f92 0 2026-03-10T06:23:20.792 INFO:tasks.workunit.client.0.vm04.stdout:1/928: rename d0/d3/d41/d4b/c131 to d0/d8/d46/d7a/d95/c153 0 2026-03-10T06:23:20.793 INFO:tasks.workunit.client.0.vm04.stdout:7/930: symlink d4/df/d12/d13/d8b/l14c 0 2026-03-10T06:23:20.796 INFO:tasks.workunit.client.0.vm04.stdout:5/945: dread d4/d11/d7d/f36 [4194304,4194304] 0 2026-03-10T06:23:20.797 INFO:tasks.workunit.client.0.vm04.stdout:6/983: fsync d2/d43/d2d/d30/f7f 0 2026-03-10T06:23:20.798 INFO:tasks.workunit.client.0.vm04.stdout:6/984: 
readlink d2/l2a 0 2026-03-10T06:23:20.798 INFO:tasks.workunit.client.0.vm04.stdout:5/946: truncate d4/d6/d81/f11b 853043 0 2026-03-10T06:23:20.799 INFO:tasks.workunit.client.0.vm04.stdout:3/935: mknod d4/da/df/d11/d5a/db3/d108/c13d 0 2026-03-10T06:23:20.800 INFO:tasks.workunit.client.0.vm04.stdout:2/934: mkdir d1/dae/d2c/d37/d40/dfa/d121 0 2026-03-10T06:23:20.800 INFO:tasks.workunit.client.0.vm04.stdout:8/987: getdents df/d20/d138/d55/de7/d103 0 2026-03-10T06:23:20.802 INFO:tasks.workunit.client.0.vm04.stdout:1/929: creat d0/d8/d46/db3/dd2/d100/d119/f154 x:0 0 0 2026-03-10T06:23:20.805 INFO:tasks.workunit.client.0.vm04.stdout:2/935: write d1/db/d69/f113 [2656286,2273] 0 2026-03-10T06:23:20.806 INFO:tasks.workunit.client.0.vm04.stdout:2/936: fsync d1/dae/d11/d14/d4e/f9d 0 2026-03-10T06:23:20.819 INFO:tasks.workunit.client.0.vm04.stdout:8/988: dwrite df/d20/d138/d65/d8f/fc9 [0,4194304] 0 2026-03-10T06:23:20.825 INFO:tasks.workunit.client.0.vm04.stdout:5/947: write d4/d11/d7d/d38/d91/d4c/f95 [544637,69023] 0 2026-03-10T06:23:20.834 INFO:tasks.workunit.client.0.vm04.stdout:4/990: rename d2/d32/d5c/d76/dd7/d31/d42/db9/def to d2/d14a 0 2026-03-10T06:23:20.835 INFO:tasks.workunit.client.0.vm04.stdout:4/991: read d2/d32/d10b/dc8/d100/ffc [3636051,83331] 0 2026-03-10T06:23:20.838 INFO:tasks.workunit.client.0.vm04.stdout:5/948: mknod d4/d11/d7d/d38/d110/d136/c144 0 2026-03-10T06:23:20.840 INFO:tasks.workunit.client.0.vm04.stdout:3/936: rename d4/da/df/d11/cb1 to d4/d6/dc/c13e 0 2026-03-10T06:23:20.841 INFO:tasks.workunit.client.0.vm04.stdout:1/930: creat d0/f155 x:0 0 0 2026-03-10T06:23:20.842 INFO:tasks.workunit.client.0.vm04.stdout:4/992: creat d2/d32/d5c/d76/dd7/da3/f14b x:0 0 0 2026-03-10T06:23:20.843 INFO:tasks.workunit.client.0.vm04.stdout:7/931: getdents d4 0 2026-03-10T06:23:20.847 INFO:tasks.workunit.client.0.vm04.stdout:7/932: dwrite d4/df/d12/dd4/f7c [4194304,4194304] 0 2026-03-10T06:23:20.848 INFO:tasks.workunit.client.0.vm04.stdout:6/985: link d2/d43/d2d/d30/c5d 
d2/d3a/d5e/ddf/c142 0 2026-03-10T06:23:20.848 INFO:tasks.workunit.client.0.vm04.stdout:2/937: dwrite d1/dae/d11/fc3 [0,4194304] 0 2026-03-10T06:23:20.862 INFO:tasks.workunit.client.0.vm04.stdout:5/949: symlink d4/d11/d7d/d38/d91/d4c/def/d12a/l145 0 2026-03-10T06:23:20.864 INFO:tasks.workunit.client.0.vm04.stdout:8/989: write df/f46 [2614721,20642] 0 2026-03-10T06:23:20.864 INFO:tasks.workunit.client.0.vm04.stdout:3/937: dread d4/d6/d54/fa8 [0,4194304] 0 2026-03-10T06:23:20.876 INFO:tasks.workunit.client.0.vm04.stdout:5/950: truncate d4/f19 8476107 0 2026-03-10T06:23:20.879 INFO:tasks.workunit.client.0.vm04.stdout:6/986: creat d2/d43/d2d/d30/d1f/d3c/d85/dbf/dd3/f143 x:0 0 0 2026-03-10T06:23:20.879 INFO:tasks.workunit.client.0.vm04.stdout:6/987: chown d2/d43/d2d/d30/d34/d76/d8a/f9e 11 1 2026-03-10T06:23:20.883 INFO:tasks.workunit.client.0.vm04.stdout:3/938: mkdir d4/da/df/d11/d5a/d5b/ddf/dbd/dd0/d13f 0 2026-03-10T06:23:20.886 INFO:tasks.workunit.client.0.vm04.stdout:8/990: symlink df/d15/d29/da3/db8/dc1/d97/l139 0 2026-03-10T06:23:20.892 INFO:tasks.workunit.client.0.vm04.stdout:6/988: creat d2/d43/d2d/d30/d1f/d3c/d85/dbf/f144 x:0 0 0 2026-03-10T06:23:20.894 INFO:tasks.workunit.client.0.vm04.stdout:3/939: readlink d4/d6/d91/lc5 0 2026-03-10T06:23:20.894 INFO:tasks.workunit.client.0.vm04.stdout:3/940: read d4/deb/f11e [196969,66459] 0 2026-03-10T06:23:20.898 INFO:tasks.workunit.client.0.vm04.stdout:1/931: creat d0/d8/d46/d7a/d95/f156 x:0 0 0 2026-03-10T06:23:20.899 INFO:tasks.workunit.client.0.vm04.stdout:1/932: write d0/d8/d46/d7a/d95/dc5/f122 [884527,3442] 0 2026-03-10T06:23:20.900 INFO:tasks.workunit.client.0.vm04.stdout:1/933: fsync d0/d3/d41/dcb/f139 0 2026-03-10T06:23:20.903 INFO:tasks.workunit.client.0.vm04.stdout:1/934: dwrite d0/d3/d41/d4b/fd3 [0,4194304] 0 2026-03-10T06:23:20.914 INFO:tasks.workunit.client.0.vm04.stdout:5/951: write d4/d11/d7d/fec [309827,97098] 0 2026-03-10T06:23:20.916 INFO:tasks.workunit.client.0.vm04.stdout:2/938: dwrite 
d1/db/d69/d74/d87/dcf/d8f/d35/d54/f9a [0,4194304] 0 2026-03-10T06:23:20.918 INFO:tasks.workunit.client.0.vm04.stdout:8/991: rmdir df/d15/d2b/d81/d9a/dbe 39 2026-03-10T06:23:20.928 INFO:tasks.workunit.client.0.vm04.stdout:1/935: rename d0/f9a to d0/d8/d46/db3/f157 0 2026-03-10T06:23:20.929 INFO:tasks.workunit.client.0.vm04.stdout:4/993: getdents d2/d14a/d133 0 2026-03-10T06:23:20.931 INFO:tasks.workunit.client.0.vm04.stdout:5/952: creat d4/d3b/f146 x:0 0 0 2026-03-10T06:23:20.933 INFO:tasks.workunit.client.0.vm04.stdout:7/933: link d4/df/d12/d21/l81 d4/df/d12/d13/d8b/l14d 0 2026-03-10T06:23:20.933 INFO:tasks.workunit.client.0.vm04.stdout:7/934: chown d4/d105/c13a 7379 1 2026-03-10T06:23:20.939 INFO:tasks.workunit.client.0.vm04.stdout:2/939: dread d1/db/d69/f77 [0,4194304] 0 2026-03-10T06:23:20.955 INFO:tasks.workunit.client.0.vm04.stdout:1/936: unlink d0/d8/d46/d7a/d95/dc5/d121/f137 0 2026-03-10T06:23:20.956 INFO:tasks.workunit.client.0.vm04.stdout:1/937: truncate d0/d3/f146 925900 0 2026-03-10T06:23:20.956 INFO:tasks.workunit.client.0.vm04.stdout:3/941: write d4/da/df/d11/d5a/d5b/ddf/f1d [3127699,33491] 0 2026-03-10T06:23:20.958 INFO:tasks.workunit.client.0.vm04.stdout:1/938: fsync d0/d8/d46/db3/dd2/d100/d119/f154 0 2026-03-10T06:23:20.973 INFO:tasks.workunit.client.0.vm04.stdout:4/994: mkdir d2/d32/d5c/d76/dd7/d31/d42/d14c 0 2026-03-10T06:23:20.976 INFO:tasks.workunit.client.0.vm04.stdout:6/989: creat d2/d43/d2d/d30/f145 x:0 0 0 2026-03-10T06:23:20.976 INFO:tasks.workunit.client.0.vm04.stdout:4/995: dread d2/d32/d5c/d76/dd7/d2c/d6b/dd1/d107/f118 [4194304,4194304] 0 2026-03-10T06:23:20.979 INFO:tasks.workunit.client.0.vm04.stdout:1/939: creat d0/d3/d41/dcb/f158 x:0 0 0 2026-03-10T06:23:20.982 INFO:tasks.workunit.client.0.vm04.stdout:4/996: dwrite d2/d32/d5c/d76/dd7/f138 [0,4194304] 0 2026-03-10T06:23:20.984 INFO:tasks.workunit.client.0.vm04.stdout:8/992: creat df/d15/d2b/d81/f13a x:0 0 0 2026-03-10T06:23:20.985 INFO:tasks.workunit.client.0.vm04.stdout:6/990: write 
d2/d43/d2d/d30/d34/d108/ffc [75267,49963] 0 2026-03-10T06:23:20.986 INFO:tasks.workunit.client.0.vm04.stdout:6/991: write d2/d3a/f137 [5191927,93823] 0 2026-03-10T06:23:20.988 INFO:tasks.workunit.client.0.vm04.stdout:3/942: sync 2026-03-10T06:23:20.999 INFO:tasks.workunit.client.0.vm04.stdout:5/953: write d4/d11/d7d/f30 [2466052,25156] 0 2026-03-10T06:23:21.004 INFO:tasks.workunit.client.0.vm04.stdout:5/954: dwrite d4/f26 [0,4194304] 0 2026-03-10T06:23:21.005 INFO:tasks.workunit.client.0.vm04.stdout:7/935: dwrite d4/df/d12/d13/d25/d28/fc0 [0,4194304] 0 2026-03-10T06:23:21.007 INFO:tasks.workunit.client.0.vm04.stdout:5/955: chown d4/d11/d7d/dab 15 1 2026-03-10T06:23:21.018 INFO:tasks.workunit.client.0.vm04.stdout:1/940: dwrite d0/d8/d46/f82 [0,4194304] 0 2026-03-10T06:23:21.023 INFO:tasks.workunit.client.0.vm04.stdout:1/941: dread d0/d3/d41/dc2/ffc [0,4194304] 0 2026-03-10T06:23:21.027 INFO:tasks.workunit.client.0.vm04.stdout:1/942: chown d0/d8/d46/db3/dd2/d100/d119 97046482 1 2026-03-10T06:23:21.035 INFO:tasks.workunit.client.0.vm04.stdout:5/956: symlink d4/d6/d81/db6/l147 0 2026-03-10T06:23:21.043 INFO:tasks.workunit.client.0.vm04.stdout:1/943: symlink d0/d3/d41/dc2/l159 0 2026-03-10T06:23:21.046 INFO:tasks.workunit.client.0.vm04.stdout:1/944: dwrite d0/d3/d41/dcb/f12b [0,4194304] 0 2026-03-10T06:23:21.052 INFO:tasks.workunit.client.0.vm04.stdout:4/997: unlink d2/d32/d10b/ca6 0 2026-03-10T06:23:21.055 INFO:tasks.workunit.client.0.vm04.stdout:2/940: getdents d1/dae/d11/d14/d9f/ddb 0 2026-03-10T06:23:21.060 INFO:tasks.workunit.client.0.vm04.stdout:5/957: dread d4/d6/f47 [0,4194304] 0 2026-03-10T06:23:21.062 INFO:tasks.workunit.client.0.vm04.stdout:3/943: dwrite d4/d6/d38/fb8 [0,4194304] 0 2026-03-10T06:23:21.075 INFO:tasks.workunit.client.0.vm04.stdout:4/998: rename d2/d32/d10b/dc8/l12f to d2/d32/d5c/d76/dd7/d2c/d6b/l14d 0 2026-03-10T06:23:21.086 INFO:tasks.workunit.client.0.vm04.stdout:8/993: link df/d20/f42 df/d20/df6/f13b 0 2026-03-10T06:23:21.089 
INFO:tasks.workunit.client.0.vm04.stdout:8/994: dwrite df/d20/d25/f39 [0,4194304] 0 2026-03-10T06:23:21.091 INFO:tasks.workunit.client.0.vm04.stdout:6/992: write d2/d43/d2d/d30/d34/d76/d7e/ddc/fec [905223,93956] 0 2026-03-10T06:23:21.092 INFO:tasks.workunit.client.0.vm04.stdout:6/993: chown d2/d43/d2d/d30/f39 28717 1 2026-03-10T06:23:21.093 INFO:tasks.workunit.client.0.vm04.stdout:5/958: rmdir d4/d11/d7d/d38/d91/d4c/d98 39 2026-03-10T06:23:21.093 INFO:tasks.workunit.client.0.vm04.stdout:5/959: chown d4/f69 2165 1 2026-03-10T06:23:21.099 INFO:tasks.workunit.client.0.vm04.stdout:1/945: creat d0/d8/d46/db3/d125/f15a x:0 0 0 2026-03-10T06:23:21.099 INFO:tasks.workunit.client.0.vm04.stdout:1/946: chown d0/d3/d41/dcb/c124 45222627 1 2026-03-10T06:23:21.100 INFO:tasks.workunit.client.0.vm04.stdout:1/947: read - d0/d3/d80/ff5 zero size 2026-03-10T06:23:21.101 INFO:tasks.workunit.client.0.vm04.stdout:3/944: dread d4/da/df/d11/d5a/d5b/ddf/d21/f3a [0,4194304] 0 2026-03-10T06:23:21.102 INFO:tasks.workunit.client.0.vm04.stdout:4/999: symlink d2/d46/l14e 0 2026-03-10T06:23:21.106 INFO:tasks.workunit.client.0.vm04.stdout:2/941: symlink d1/db/d69/d74/l122 0 2026-03-10T06:23:21.109 INFO:tasks.workunit.client.0.vm04.stdout:6/994: mkdir d2/d3a/d5e/db5/dd4/d146 0 2026-03-10T06:23:21.112 INFO:tasks.workunit.client.0.vm04.stdout:6/995: dwrite d2/d43/d2d/d30/d1f/d3c/dfa/f106 [0,4194304] 0 2026-03-10T06:23:21.117 INFO:tasks.workunit.client.0.vm04.stdout:5/960: creat d4/d11/d7d/d38/d91/d4c/def/ddc/d13d/f148 x:0 0 0 2026-03-10T06:23:21.117 INFO:tasks.workunit.client.0.vm04.stdout:6/996: dread d2/d43/d2d/d30/f2b [0,4194304] 0 2026-03-10T06:23:21.119 INFO:tasks.workunit.client.0.vm04.stdout:3/945: rmdir d4/da/df/d11/d5a/d5b/ddf/d21 39 2026-03-10T06:23:21.119 INFO:tasks.workunit.client.0.vm04.stdout:7/936: write d4/df/d12/d13/d25/f2f [3827249,79281] 0 2026-03-10T06:23:21.120 INFO:tasks.workunit.client.0.vm04.stdout:3/946: stat d4/d6/d99/d119/fed 0 2026-03-10T06:23:21.120 
INFO:tasks.workunit.client.0.vm04.stdout:7/937: read - d4/df/f128 zero size 2026-03-10T06:23:21.121 INFO:tasks.workunit.client.0.vm04.stdout:7/938: write d4/df/d12/d34/d63/f9a [661651,100141] 0 2026-03-10T06:23:21.122 INFO:tasks.workunit.client.0.vm04.stdout:7/939: chown d4/df/d12/d13/d25/d28/ca9 26 1 2026-03-10T06:23:21.135 INFO:tasks.workunit.client.0.vm04.stdout:8/995: dwrite df/d20/d138/d55/fe9 [0,4194304] 0 2026-03-10T06:23:21.138 INFO:tasks.workunit.client.0.vm04.stdout:1/948: write d0/d8/f76 [3838639,56106] 0 2026-03-10T06:23:21.138 INFO:tasks.workunit.client.0.vm04.stdout:5/961: creat d4/d11/d7d/dae/f149 x:0 0 0 2026-03-10T06:23:21.143 INFO:tasks.workunit.client.0.vm04.stdout:1/949: dwrite d0/d112/d13d/f14b [0,4194304] 0 2026-03-10T06:23:21.146 INFO:tasks.workunit.client.0.vm04.stdout:1/950: fdatasync d0/f83 0 2026-03-10T06:23:21.150 INFO:tasks.workunit.client.0.vm04.stdout:2/942: unlink d1/dae/d11/cb1 0 2026-03-10T06:23:21.162 INFO:tasks.workunit.client.0.vm04.stdout:8/996: rename df/f46 to df/d15/d29/da3/db8/f13c 0 2026-03-10T06:23:21.163 INFO:tasks.workunit.client.0.vm04.stdout:3/947: dread d4/dba/fda [0,4194304] 0 2026-03-10T06:23:21.171 INFO:tasks.workunit.client.0.vm04.stdout:2/943: rmdir d1/dae/d11/d14/d9f/ddb/d94/dbb/de8 39 2026-03-10T06:23:21.173 INFO:tasks.workunit.client.0.vm04.stdout:7/940: truncate d4/df/f8a 944913 0 2026-03-10T06:23:21.179 INFO:tasks.workunit.client.0.vm04.stdout:1/951: symlink d0/l15b 0 2026-03-10T06:23:21.183 INFO:tasks.workunit.client.0.vm04.stdout:7/941: mkdir d4/df/d12/d13/d25/d30/d40/d14e 0 2026-03-10T06:23:21.185 INFO:tasks.workunit.client.0.vm04.stdout:6/997: truncate d2/d3a/d5e/db5/f101 176319 0 2026-03-10T06:23:21.186 INFO:tasks.workunit.client.0.vm04.stdout:6/998: chown d2/d43/d2d/d30/d34/d108/lbb 10326 1 2026-03-10T06:23:21.188 INFO:tasks.workunit.client.0.vm04.stdout:8/997: truncate df/d15/d29/da3/db8/fdc 4027042 0 2026-03-10T06:23:21.190 INFO:tasks.workunit.client.0.vm04.stdout:5/962: creat 
d4/d11/d7d/d38/d91/d4c/f14a x:0 0 0 2026-03-10T06:23:21.202 INFO:tasks.workunit.client.0.vm04.stdout:5/963: dread d4/d11/d7d/d38/d91/d4c/f88 [0,4194304] 0 2026-03-10T06:23:21.203 INFO:tasks.workunit.client.0.vm04.stdout:3/948: write d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/f67 [195476,53754] 0 2026-03-10T06:23:21.204 INFO:tasks.workunit.client.0.vm04.stdout:3/949: dread - d4/da/df/fc1 zero size 2026-03-10T06:23:21.207 INFO:tasks.workunit.client.0.vm04.stdout:1/952: mkdir d0/d8/d46/db3/d125/d15c 0 2026-03-10T06:23:21.211 INFO:tasks.workunit.client.0.vm04.stdout:2/944: symlink d1/db/l123 0 2026-03-10T06:23:21.213 INFO:tasks.workunit.client.0.vm04.stdout:7/942: creat d4/df/d12/d13/d25/d28/d3a/d129/f14f x:0 0 0 2026-03-10T06:23:21.221 INFO:tasks.workunit.client.0.vm04.stdout:7/943: dread d4/df/d12/d34/f46 [0,4194304] 0 2026-03-10T06:23:21.230 INFO:tasks.workunit.client.0.vm04.stdout:6/999: rmdir d2/d43/d2d/d30/d34/d76/d8a 39 2026-03-10T06:23:21.231 INFO:tasks.workunit.client.0.vm04.stdout:8/998: mkdir df/d20/d138/d55/d13d 0 2026-03-10T06:23:21.231 INFO:tasks.workunit.client.0.vm04.stdout:5/964: creat d4/d11/d7d/d38/d91/d4c/def/d12a/f14b x:0 0 0 2026-03-10T06:23:21.232 INFO:tasks.workunit.client.0.vm04.stdout:1/953: sync 2026-03-10T06:23:21.233 INFO:tasks.workunit.client.0.vm04.stdout:1/954: write d0/d8/d46/d7a/d95/f140 [503130,42313] 0 2026-03-10T06:23:21.233 INFO:tasks.workunit.client.0.vm04.stdout:1/955: chown d0/d3/d80 38577 1 2026-03-10T06:23:21.239 INFO:tasks.workunit.client.0.vm04.stdout:3/950: symlink d4/da/df/d11/d5a/d5b/ddf/d21/d2c/d11a/l140 0 2026-03-10T06:23:21.241 INFO:tasks.workunit.client.0.vm04.stdout:8/999: fsync df/d15/d29/d89/fd7 0 2026-03-10T06:23:21.252 INFO:tasks.workunit.client.0.vm04.stdout:2/945: dwrite d1/dae/d2c/d37/d40/fc4 [0,4194304] 0 2026-03-10T06:23:21.253 INFO:tasks.workunit.client.0.vm04.stdout:7/944: read d4/df/d12/d13/d25/d28/f7d [292878,105214] 0 2026-03-10T06:23:21.263 INFO:tasks.workunit.client.0.vm04.stdout:3/951: mknod 
d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104/c141 0 2026-03-10T06:23:21.264 INFO:tasks.workunit.client.0.vm04.stdout:7/945: dread d4/df/d12/d13/d25/d28/d3a/d129/f12a [0,4194304] 0 2026-03-10T06:23:21.269 INFO:tasks.workunit.client.0.vm04.stdout:1/956: mkdir d0/d3/d80/d12f/d15d 0 2026-03-10T06:23:21.279 INFO:tasks.workunit.client.0.vm04.stdout:2/946: dwrite d1/dae/d11/f11e [0,4194304] 0 2026-03-10T06:23:21.297 INFO:tasks.workunit.client.0.vm04.stdout:1/957: dread - d0/d8/d46/d7a/d95/dc5/dcc/f104 zero size 2026-03-10T06:23:21.300 INFO:tasks.workunit.client.0.vm04.stdout:2/947: truncate d1/dae/fe2 652746 0 2026-03-10T06:23:21.300 INFO:tasks.workunit.client.0.vm04.stdout:7/946: dread d4/fbf [0,4194304] 0 2026-03-10T06:23:21.304 INFO:tasks.workunit.client.0.vm04.stdout:1/958: creat d0/d3/d41/d99/f15e x:0 0 0 2026-03-10T06:23:21.304 INFO:tasks.workunit.client.0.vm04.stdout:5/965: link d4/d11/d7d/d38/d91/d4c/d98/c139 d4/c14c 0 2026-03-10T06:23:21.305 INFO:tasks.workunit.client.0.vm04.stdout:2/948: stat d1/db/d69/d74/d87/dcf/d8f/d35/d54/f9a 0 2026-03-10T06:23:21.305 INFO:tasks.workunit.client.0.vm04.stdout:1/959: chown d0/d8/d46/d7a/d95/dc5 4015 1 2026-03-10T06:23:21.312 INFO:tasks.workunit.client.0.vm04.stdout:7/947: mknod d4/df/d12/d13/d25/d8f/c150 0 2026-03-10T06:23:21.316 INFO:tasks.workunit.client.0.vm04.stdout:1/960: fdatasync d0/d3/d41/d4b/d5b/fb6 0 2026-03-10T06:23:21.317 INFO:tasks.workunit.client.0.vm04.stdout:3/952: truncate d4/da/df/d11/d5a/d5b/ddf/f2b 901190 0 2026-03-10T06:23:21.318 INFO:tasks.workunit.client.0.vm04.stdout:7/948: read d4/df/d12/d13/d25/dcb/fd6 [2579897,17009] 0 2026-03-10T06:23:21.336 INFO:tasks.workunit.client.0.vm04.stdout:2/949: truncate d1/dae/d11/d14/d9f/fab 1025652 0 2026-03-10T06:23:21.350 INFO:tasks.workunit.client.0.vm04.stdout:2/950: dread d1/dae/d2c/f33 [0,4194304] 0 2026-03-10T06:23:21.363 INFO:tasks.workunit.client.0.vm04.stdout:5/966: getdents d4/d11/d7d/dab/d106/df0 0 2026-03-10T06:23:21.368 
INFO:tasks.workunit.client.0.vm04.stdout:2/951: unlink d1/l17 0 2026-03-10T06:23:21.371 INFO:tasks.workunit.client.0.vm04.stdout:1/961: link d0/c1c d0/d3/d41/dcb/c15f 0 2026-03-10T06:23:21.380 INFO:tasks.workunit.client.0.vm04.stdout:7/949: creat d4/df/d12/d13/d25/d28/f151 x:0 0 0 2026-03-10T06:23:21.381 INFO:tasks.workunit.client.0.vm04.stdout:3/953: write d4/d6/d91/da1/f10d [214832,49599] 0 2026-03-10T06:23:21.384 INFO:tasks.workunit.client.0.vm04.stdout:1/962: dread d0/d8/f76 [0,4194304] 0 2026-03-10T06:23:21.386 INFO:tasks.workunit.client.0.vm04.stdout:5/967: chown d4/d11/d7d/d38/d91/d4c/d98/dc0/c56 162 1 2026-03-10T06:23:21.388 INFO:tasks.workunit.client.0.vm04.stdout:3/954: rmdir d4/da/df/d11/d5a/db3 39 2026-03-10T06:23:21.389 INFO:tasks.workunit.client.0.vm04.stdout:3/955: dread - d4/d6/d99/d10c/f13a zero size 2026-03-10T06:23:21.391 INFO:tasks.workunit.client.0.vm04.stdout:7/950: dwrite d4/df/d12/d34/d63/f9a [0,4194304] 0 2026-03-10T06:23:21.401 INFO:tasks.workunit.client.0.vm04.stdout:5/968: truncate d4/d6/d81/db6/f111 1006967 0 2026-03-10T06:23:21.403 INFO:tasks.workunit.client.0.vm04.stdout:7/951: read d4/f5 [2499718,35754] 0 2026-03-10T06:23:21.407 INFO:tasks.workunit.client.0.vm04.stdout:1/963: link d0/d8/d46/d7a/d95/dc5/cf2 d0/d8/d46/db3/d125/c160 0 2026-03-10T06:23:21.421 INFO:tasks.workunit.client.0.vm04.stdout:3/956: write d4/da/df/d11/d5a/d5b/ddf/f4b [955429,117949] 0 2026-03-10T06:23:21.425 INFO:tasks.workunit.client.0.vm04.stdout:5/969: dwrite d4/d6/f8 [0,4194304] 0 2026-03-10T06:23:21.431 INFO:tasks.workunit.client.0.vm04.stdout:2/952: creat d1/dae/d11/d14/d9f/ddb/d94/dbb/f124 x:0 0 0 2026-03-10T06:23:21.433 INFO:tasks.workunit.client.0.vm04.stdout:1/964: symlink d0/d8/d46/d7a/d95/l161 0 2026-03-10T06:23:21.434 INFO:tasks.workunit.client.0.vm04.stdout:3/957: write d4/deb/f102 [533870,13381] 0 2026-03-10T06:23:21.434 INFO:tasks.workunit.client.0.vm04.stdout:7/952: dread d4/df/d12/d13/db3/d110/f10c [0,4194304] 0 2026-03-10T06:23:21.436 
INFO:tasks.workunit.client.0.vm04.stdout:7/953: chown d4/df/d12/d13/db3/d110/d9c/db1/dde/ddf/df1/f132 229071 1 2026-03-10T06:23:21.466 INFO:tasks.workunit.client.0.vm04.stdout:5/970: creat d4/d11/d7d/d38/d91/d4c/d98/dc0/dde/f14d x:0 0 0 2026-03-10T06:23:21.466 INFO:tasks.workunit.client.0.vm04.stdout:1/965: dread - d0/d8/d46/db3/fc6 zero size 2026-03-10T06:23:21.469 INFO:tasks.workunit.client.0.vm04.stdout:1/966: chown d0/d8/f105 532336 1 2026-03-10T06:23:21.470 INFO:tasks.workunit.client.0.vm04.stdout:3/958: mknod d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/d64/c142 0 2026-03-10T06:23:21.471 INFO:tasks.workunit.client.0.vm04.stdout:3/959: chown d4/d6/dc 65621 1 2026-03-10T06:23:21.473 INFO:tasks.workunit.client.0.vm04.stdout:7/954: write d4/df/d12/d13/d25/d28/d3a/d58/fb6 [8557007,67193] 0 2026-03-10T06:23:21.481 INFO:tasks.workunit.client.0.vm04.stdout:1/967: dwrite d0/d8/d46/fb7 [4194304,4194304] 0 2026-03-10T06:23:21.490 INFO:tasks.workunit.client.0.vm04.stdout:3/960: mknod d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d8f/d104/c143 0 2026-03-10T06:23:21.490 INFO:tasks.workunit.client.0.vm04.stdout:1/968: stat d0/d8/d46/dcf/f141 0 2026-03-10T06:23:21.490 INFO:tasks.workunit.client.0.vm04.stdout:3/961: fdatasync d4/d6/dc/f22 0 2026-03-10T06:23:21.495 INFO:tasks.workunit.client.0.vm04.stdout:5/971: dwrite d4/d11/d7d/d38/d91/d55/db1/fb4 [0,4194304] 0 2026-03-10T06:23:21.500 INFO:tasks.workunit.client.0.vm04.stdout:2/953: rename d1/db/d69/d74/ld5 to d1/db/l125 0 2026-03-10T06:23:21.516 INFO:tasks.workunit.client.0.vm04.stdout:2/954: dread d1/db/d69/d74/d87/dcf/d8f/d48/d67/fbd [0,4194304] 0 2026-03-10T06:23:21.516 INFO:tasks.workunit.client.0.vm04.stdout:1/969: dwrite d0/d3/d41/dc2/ffc [0,4194304] 0 2026-03-10T06:23:21.523 INFO:tasks.workunit.client.0.vm04.stdout:3/962: unlink d4/da/df/d11/c26 0 2026-03-10T06:23:21.541 INFO:tasks.workunit.client.0.vm04.stdout:3/963: fdatasync d4/da/df/d11/ff5 0 2026-03-10T06:23:21.543 INFO:tasks.workunit.client.0.vm04.stdout:5/972: link 
d4/d11/d7d/dab/f105 d4/d11/d7d/dab/d106/d12d/f14e 0 2026-03-10T06:23:21.545 INFO:tasks.workunit.client.0.vm04.stdout:1/970: creat d0/d3/d41/dc2/d13a/f162 x:0 0 0 2026-03-10T06:23:21.548 INFO:tasks.workunit.client.0.vm04.stdout:7/955: sync 2026-03-10T06:23:21.554 INFO:tasks.workunit.client.0.vm04.stdout:2/955: unlink d1/dae/d11/d14/d9f/ddb/daf/cbe 0 2026-03-10T06:23:21.563 INFO:tasks.workunit.client.0.vm04.stdout:3/964: dread d4/da/df/d11/f57 [0,4194304] 0 2026-03-10T06:23:21.569 INFO:tasks.workunit.client.0.vm04.stdout:1/971: rename d0/d8/d46/db3/ff7 to d0/d112/d13d/f163 0 2026-03-10T06:23:21.569 INFO:tasks.workunit.client.0.vm04.stdout:5/973: write d4/d11/d7d/d38/d91/dda/fe4 [7425965,51372] 0 2026-03-10T06:23:21.575 INFO:tasks.workunit.client.0.vm04.stdout:2/956: truncate d1/db/d69/d74/d87/dcf/d8f/d48/d67/f92 315856 0 2026-03-10T06:23:21.580 INFO:tasks.workunit.client.0.vm04.stdout:2/957: dwrite d1/dae/d11/f118 [0,4194304] 0 2026-03-10T06:23:21.586 INFO:tasks.workunit.client.0.vm04.stdout:7/956: rename d4/df/d12/d13/d25/d28/d3a/d58 to d4/df/d12/d34/dbd/d152 0 2026-03-10T06:23:21.586 INFO:tasks.workunit.client.0.vm04.stdout:2/958: readlink d1/dae/d11/d14/l3f 0 2026-03-10T06:23:21.590 INFO:tasks.workunit.client.0.vm04.stdout:1/972: creat d0/d3/d80/d12f/f164 x:0 0 0 2026-03-10T06:23:21.595 INFO:tasks.workunit.client.0.vm04.stdout:3/965: symlink d4/da/df/d11/d5a/d5b/ddf/d21/d2c/l144 0 2026-03-10T06:23:21.597 INFO:tasks.workunit.client.0.vm04.stdout:3/966: chown d4/da/df/lf7 30 1 2026-03-10T06:23:21.600 INFO:tasks.workunit.client.0.vm04.stdout:2/959: dread d1/dae/d11/f11e [4194304,4194304] 0 2026-03-10T06:23:21.611 INFO:tasks.workunit.client.0.vm04.stdout:7/957: mkdir d4/df/d12/d13/db3/d110/d9c/db1/dde/d153 0 2026-03-10T06:23:21.611 INFO:tasks.workunit.client.0.vm04.stdout:5/974: unlink d4/d11/d7d/d38/d91/d4c/lc8 0 2026-03-10T06:23:21.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:21 vm06.local ceph-mon[58974]: pgmap v39: 65 pgs: 65 active+clean; 3.0 GiB 
data, 10 GiB used, 110 GiB / 120 GiB avail; 49 MiB/s rd, 129 MiB/s wr, 308 op/s 2026-03-10T06:23:21.619 INFO:tasks.workunit.client.0.vm04.stdout:2/960: mkdir d1/db/d69/d74/d87/dcf/d8f/ddc/d126 0 2026-03-10T06:23:21.619 INFO:tasks.workunit.client.0.vm04.stdout:3/967: sync 2026-03-10T06:23:21.629 INFO:tasks.workunit.client.0.vm04.stdout:2/961: dwrite d1/db/d69/d74/d87/dcf/d8f/ddc/fd1 [0,4194304] 0 2026-03-10T06:23:21.636 INFO:tasks.workunit.client.0.vm04.stdout:7/958: write d4/fa7 [5216404,10332] 0 2026-03-10T06:23:21.637 INFO:tasks.workunit.client.0.vm04.stdout:1/973: write d0/d3/d80/ff5 [387342,97581] 0 2026-03-10T06:23:21.639 INFO:tasks.workunit.client.0.vm04.stdout:3/968: write d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/f73 [809900,126173] 0 2026-03-10T06:23:21.652 INFO:tasks.workunit.client.0.vm04.stdout:5/975: truncate d4/d6/d80/d84/fe2 429980 0 2026-03-10T06:23:21.658 INFO:tasks.workunit.client.0.vm04.stdout:2/962: creat d1/db/d69/d74/d87/dcf/f127 x:0 0 0 2026-03-10T06:23:21.667 INFO:tasks.workunit.client.0.vm04.stdout:1/974: mkdir d0/d112/d165 0 2026-03-10T06:23:21.669 INFO:tasks.workunit.client.0.vm04.stdout:7/959: mknod d4/df/d12/d13/d25/d30/d40/d50/df6/d114/c154 0 2026-03-10T06:23:21.669 INFO:tasks.workunit.client.0.vm04.stdout:1/975: stat d0/d3/d41/dcb/f158 0 2026-03-10T06:23:21.670 INFO:tasks.workunit.client.0.vm04.stdout:3/969: truncate d4/da/df/d11/f57 510613 0 2026-03-10T06:23:21.673 INFO:tasks.workunit.client.0.vm04.stdout:2/963: unlink d1/dae/ld0 0 2026-03-10T06:23:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:21 vm04.local ceph-mon[51058]: pgmap v39: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 49 MiB/s rd, 129 MiB/s wr, 308 op/s 2026-03-10T06:23:21.681 INFO:tasks.workunit.client.0.vm04.stdout:7/960: symlink d4/df/d12/d13/d12f/l155 0 2026-03-10T06:23:21.688 INFO:tasks.workunit.client.0.vm04.stdout:1/976: mkdir d0/d8/d46/de4/dec/d166 0 2026-03-10T06:23:21.688 INFO:tasks.workunit.client.0.vm04.stdout:3/970: 
creat d4/d6/d54/f145 x:0 0 0 2026-03-10T06:23:21.689 INFO:tasks.workunit.client.0.vm04.stdout:1/977: chown d0/d3/d41/d99/fda 221046 1 2026-03-10T06:23:21.697 INFO:tasks.workunit.client.0.vm04.stdout:7/961: rmdir d4/df/d12/d13/d25/d30 39 2026-03-10T06:23:21.698 INFO:tasks.workunit.client.0.vm04.stdout:2/964: unlink d1/dae/d2c/d37/d59/f8b 0 2026-03-10T06:23:21.698 INFO:tasks.workunit.client.0.vm04.stdout:7/962: chown d4/df/d12/d13/fb5 539 1 2026-03-10T06:23:21.701 INFO:tasks.workunit.client.0.vm04.stdout:1/978: symlink d0/d8/d46/db3/d125/l167 0 2026-03-10T06:23:21.702 INFO:tasks.workunit.client.0.vm04.stdout:1/979: stat d0/d3/d41/fa3 0 2026-03-10T06:23:21.711 INFO:tasks.workunit.client.0.vm04.stdout:5/976: creat d4/d11/d7d/d38/d91/f14f x:0 0 0 2026-03-10T06:23:21.718 INFO:tasks.workunit.client.0.vm04.stdout:7/963: mkdir d4/df/d12/d13/d25/d30/d40/d156 0 2026-03-10T06:23:21.732 INFO:tasks.workunit.client.0.vm04.stdout:2/965: truncate d1/dae/d11/f118 3341243 0 2026-03-10T06:23:21.736 INFO:tasks.workunit.client.0.vm04.stdout:1/980: symlink d0/d8/d46/de4/dec/d166/l168 0 2026-03-10T06:23:21.738 INFO:tasks.workunit.client.0.vm04.stdout:7/964: creat d4/df/d12/d34/d103/f157 x:0 0 0 2026-03-10T06:23:21.744 INFO:tasks.workunit.client.0.vm04.stdout:5/977: truncate d4/d11/d7d/f13e 12009562 0 2026-03-10T06:23:21.745 INFO:tasks.workunit.client.0.vm04.stdout:5/978: chown d4/d6/d80/cf8 12 1 2026-03-10T06:23:21.745 INFO:tasks.workunit.client.0.vm04.stdout:2/966: mkdir d1/db/d69/dcd/d128 0 2026-03-10T06:23:21.746 INFO:tasks.workunit.client.0.vm04.stdout:7/965: mknod d4/df/d12/d13/d25/d8f/c158 0 2026-03-10T06:23:21.747 INFO:tasks.workunit.client.0.vm04.stdout:1/981: dread d0/d8/d46/fd8 [0,4194304] 0 2026-03-10T06:23:21.761 INFO:tasks.workunit.client.0.vm04.stdout:3/971: dread d4/da/df/d11/d5a/d5b/fa3 [0,4194304] 0 2026-03-10T06:23:21.773 INFO:tasks.workunit.client.0.vm04.stdout:3/972: write f1 [9082089,5985] 0 2026-03-10T06:23:21.782 INFO:tasks.workunit.client.0.vm04.stdout:5/979: 
getdents d4/d11/d7d 0 2026-03-10T06:23:21.783 INFO:tasks.workunit.client.0.vm04.stdout:7/966: getdents d4/df/d12/d13/db3/d110/d9c 0 2026-03-10T06:23:21.784 INFO:tasks.workunit.client.0.vm04.stdout:7/967: dread - d4/df/d12/d21/f6b zero size 2026-03-10T06:23:21.785 INFO:tasks.workunit.client.0.vm04.stdout:3/973: dread - d4/da/df/d11/d5a/d5b/ddf/dbd/ff6 zero size 2026-03-10T06:23:21.786 INFO:tasks.workunit.client.0.vm04.stdout:1/982: link d0/d3/f33 d0/d3/d41/d4b/d5b/f169 0 2026-03-10T06:23:21.793 INFO:tasks.workunit.client.0.vm04.stdout:2/967: unlink d1/dae/d11/d14/d9f/ddb/d94/dbb/de8/c103 0 2026-03-10T06:23:21.793 INFO:tasks.workunit.client.0.vm04.stdout:5/980: mkdir d4/d11/d7d/d52/d150 0 2026-03-10T06:23:21.797 INFO:tasks.workunit.client.0.vm04.stdout:7/968: rename d4/df/d12/d13/d25/dcb/fd6 to d4/df/d12/d13/d25/d28/d3a/d100/d13d/f159 0 2026-03-10T06:23:21.804 INFO:tasks.workunit.client.0.vm04.stdout:2/968: write d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc/f102 [2451310,49612] 0 2026-03-10T06:23:21.809 INFO:tasks.workunit.client.0.vm04.stdout:1/983: creat d0/d112/d165/f16a x:0 0 0 2026-03-10T06:23:21.811 INFO:tasks.workunit.client.0.vm04.stdout:7/969: creat d4/df/d12/d13/f15a x:0 0 0 2026-03-10T06:23:21.813 INFO:tasks.workunit.client.0.vm04.stdout:3/974: dwrite d4/da/df/d11/d5a/d5b/f98 [0,4194304] 0 2026-03-10T06:23:21.823 INFO:tasks.workunit.client.0.vm04.stdout:1/984: creat d0/d3/d41/f16b x:0 0 0 2026-03-10T06:23:21.825 INFO:tasks.workunit.client.0.vm04.stdout:5/981: dwrite d4/d11/d7d/d38/d91/d55/f7a [0,4194304] 0 2026-03-10T06:23:21.825 INFO:tasks.workunit.client.0.vm04.stdout:7/970: creat d4/df/d12/d13/d25/d28/d3a/d100/d106/f15b x:0 0 0 2026-03-10T06:23:21.829 INFO:tasks.workunit.client.0.vm04.stdout:3/975: symlink d4/da/df/d11/d5a/d5b/l146 0 2026-03-10T06:23:21.835 INFO:tasks.workunit.client.0.vm04.stdout:2/969: symlink d1/dbf/l129 0 2026-03-10T06:23:21.843 INFO:tasks.workunit.client.0.vm04.stdout:1/985: symlink d0/d8/d46/d7a/d95/l16c 0 2026-03-10T06:23:21.849 
INFO:tasks.workunit.client.0.vm04.stdout:7/971: getdents d4/df/d12/d13/db3/d110/d9c/db1/dc4/d104 0 2026-03-10T06:23:21.850 INFO:tasks.workunit.client.0.vm04.stdout:3/976: fsync d4/da/f134 0 2026-03-10T06:23:21.856 INFO:tasks.workunit.client.0.vm04.stdout:5/982: link d4/d11/d7d/d38/d91/l126 d4/d11/l151 0 2026-03-10T06:23:21.865 INFO:tasks.workunit.client.0.vm04.stdout:2/970: mkdir d1/dae/dd6/d12a 0 2026-03-10T06:23:21.867 INFO:tasks.workunit.client.0.vm04.stdout:5/983: dread d4/d11/d7d/d38/d91/d55/db1/fb4 [0,4194304] 0 2026-03-10T06:23:21.869 INFO:tasks.workunit.client.0.vm04.stdout:1/986: dread d0/d8/f11 [0,4194304] 0 2026-03-10T06:23:21.869 INFO:tasks.workunit.client.0.vm04.stdout:1/987: chown d0/d3/f11b 8 1 2026-03-10T06:23:21.875 INFO:tasks.workunit.client.0.vm04.stdout:3/977: write d4/da/df/d11/f57 [712562,79035] 0 2026-03-10T06:23:21.883 INFO:tasks.workunit.client.0.vm04.stdout:1/988: dread d0/d8/d46/db3/dd2/d100/f113 [0,4194304] 0 2026-03-10T06:23:21.886 INFO:tasks.workunit.client.0.vm04.stdout:3/978: dwrite d4/da/df/d11/d5a/d5b/dff/f117 [0,4194304] 0 2026-03-10T06:23:21.886 INFO:tasks.workunit.client.0.vm04.stdout:2/971: readlink d1/db/lad 0 2026-03-10T06:23:21.893 INFO:tasks.workunit.client.0.vm04.stdout:5/984: symlink d4/d11/d7d/l152 0 2026-03-10T06:23:21.898 INFO:tasks.workunit.client.0.vm04.stdout:5/985: read d4/f35 [664806,47770] 0 2026-03-10T06:23:21.901 INFO:tasks.workunit.client.0.vm04.stdout:1/989: dwrite d0/d8/d46/d7a/f84 [0,4194304] 0 2026-03-10T06:23:21.908 INFO:tasks.workunit.client.0.vm04.stdout:2/972: dwrite d1/dae/d11/f10a [0,4194304] 0 2026-03-10T06:23:21.929 INFO:tasks.workunit.client.0.vm04.stdout:7/972: creat d4/df/d12/d13/db3/d110/d9c/db1/f15c x:0 0 0 2026-03-10T06:23:21.929 INFO:tasks.workunit.client.0.vm04.stdout:3/979: dread d4/da/df/d11/d5a/d5b/ddf/f12c [0,4194304] 0 2026-03-10T06:23:21.951 INFO:tasks.workunit.client.0.vm04.stdout:1/990: creat d0/d8/d46/d7a/d95/dc5/dcc/f16d x:0 0 0 2026-03-10T06:23:21.961 
INFO:tasks.workunit.client.0.vm04.stdout:2/973: creat d1/dae/d11/d14/d9f/ddb/f12b x:0 0 0 2026-03-10T06:23:21.966 INFO:tasks.workunit.client.0.vm04.stdout:7/973: chown d4/df/d12/d13/d25/ca0 27590 1 2026-03-10T06:23:21.974 INFO:tasks.workunit.client.0.vm04.stdout:7/974: dwrite d4/df/d12/d13/db3/d110/d9c/db1/ff7 [0,4194304] 0 2026-03-10T06:23:21.994 INFO:tasks.workunit.client.0.vm04.stdout:3/980: fdatasync d4/da/df/d11/d5a/d5b/ddf/dbd/ff6 0 2026-03-10T06:23:21.995 INFO:tasks.workunit.client.0.vm04.stdout:3/981: write d4/da/df/d11/d5a/d5b/fd9 [1049508,47426] 0 2026-03-10T06:23:21.995 INFO:tasks.workunit.client.0.vm04.stdout:3/982: chown d4/da/df/d11/d50/c139 1311377 1 2026-03-10T06:23:22.008 INFO:tasks.workunit.client.0.vm04.stdout:2/974: dread d1/dae/d2c/d37/f52 [0,4194304] 0 2026-03-10T06:23:22.010 INFO:tasks.workunit.client.0.vm04.stdout:7/975: symlink d4/df/d12/d13/d25/d30/d40/l15d 0 2026-03-10T06:23:22.011 INFO:tasks.workunit.client.0.vm04.stdout:2/975: chown d1/db/d69/d74/d87/dcf/d8f/d35/d54/f104 11 1 2026-03-10T06:23:22.040 INFO:tasks.workunit.client.0.vm04.stdout:7/976: readlink d4/df/d12/d21/l81 0 2026-03-10T06:23:22.041 INFO:tasks.workunit.client.0.vm04.stdout:3/983: mkdir d4/da/df/d11/d5a/d5b/ddf/d147 0 2026-03-10T06:23:22.044 INFO:tasks.workunit.client.0.vm04.stdout:5/986: link d4/d11/d7d/c53 d4/d6/d80/de5/c153 0 2026-03-10T06:23:22.045 INFO:tasks.workunit.client.0.vm04.stdout:1/991: dwrite d0/d3/d41/fa3 [4194304,4194304] 0 2026-03-10T06:23:22.056 INFO:tasks.workunit.client.0.vm04.stdout:7/977: creat d4/df/d12/d13/d12f/d148/f15e x:0 0 0 2026-03-10T06:23:22.056 INFO:tasks.workunit.client.0.vm04.stdout:3/984: fsync d4/da/df/d11/d5a/d5b/ddf/d21/f3a 0 2026-03-10T06:23:22.057 INFO:tasks.workunit.client.0.vm04.stdout:7/978: fsync d4/df/f128 0 2026-03-10T06:23:22.059 INFO:tasks.workunit.client.0.vm04.stdout:5/987: truncate d4/d6/fcf 566347 0 2026-03-10T06:23:22.059 INFO:tasks.workunit.client.0.vm04.stdout:2/976: creat d1/dae/d11/d14/d9f/ddb/d94/f12c x:0 0 0 
2026-03-10T06:23:22.063 INFO:tasks.workunit.client.0.vm04.stdout:7/979: unlink d4/df/d12/d34/d63/f78 0 2026-03-10T06:23:22.080 INFO:tasks.workunit.client.0.vm04.stdout:7/980: symlink d4/l15f 0 2026-03-10T06:23:22.081 INFO:tasks.workunit.client.0.vm04.stdout:7/981: readlink d4/df/d12/d13/db3/d110/d9c/db1/lb8 0 2026-03-10T06:23:22.093 INFO:tasks.workunit.client.0.vm04.stdout:5/988: dread d4/d6/d37/f7e [0,4194304] 0 2026-03-10T06:23:22.094 INFO:tasks.workunit.client.0.vm04.stdout:1/992: link d0/d3/d41/dcb/f158 d0/d3/f16e 0 2026-03-10T06:23:22.095 INFO:tasks.workunit.client.0.vm04.stdout:7/982: fdatasync d4/df/d12/dd4/f126 0 2026-03-10T06:23:22.095 INFO:tasks.workunit.client.0.vm04.stdout:2/977: dread d1/db/d69/d74/d87/dcf/d8f/d35/d54/dfc/f102 [0,4194304] 0 2026-03-10T06:23:22.097 INFO:tasks.workunit.client.0.vm04.stdout:3/985: link d4/da/df/d11/d50/c139 d4/d6/d92/def/c148 0 2026-03-10T06:23:22.098 INFO:tasks.workunit.client.0.vm04.stdout:2/978: read - d1/db/d69/d74/d87/dcf/d8f/d35/d54/f104 zero size 2026-03-10T06:23:22.099 INFO:tasks.workunit.client.0.vm04.stdout:3/986: chown d4/da/df/d11/d5a/d5b/ddf/l12f 148769314 1 2026-03-10T06:23:22.102 INFO:tasks.workunit.client.0.vm04.stdout:1/993: creat d0/d8/d46/de4/dec/d166/f16f x:0 0 0 2026-03-10T06:23:22.105 INFO:tasks.workunit.client.0.vm04.stdout:7/983: mkdir d4/df/d12/d13/d12f/d148/d160 0 2026-03-10T06:23:22.108 INFO:tasks.workunit.client.0.vm04.stdout:2/979: symlink d1/dae/d2c/d37/d40/dfa/l12d 0 2026-03-10T06:23:22.109 INFO:tasks.workunit.client.0.vm04.stdout:3/987: fdatasync d4/da/f134 0 2026-03-10T06:23:22.114 INFO:tasks.workunit.client.0.vm04.stdout:5/989: fsync d4/d6/d50/f61 0 2026-03-10T06:23:22.114 INFO:tasks.workunit.client.0.vm04.stdout:1/994: fsync d0/d8/f105 0 2026-03-10T06:23:22.114 INFO:tasks.workunit.client.0.vm04.stdout:7/984: rename d4/df/d12/d13/db3/d110/d9c/db1 to d4/df/d12/d34/dbd/d161 0 2026-03-10T06:23:22.116 INFO:tasks.workunit.client.0.vm04.stdout:7/985: dread - d4/df/d12/d13/fa6 zero size 
2026-03-10T06:23:22.123 INFO:tasks.workunit.client.0.vm04.stdout:1/995: dread - d0/d8/d46/db3/dd2/f11c zero size 2026-03-10T06:23:22.132 INFO:tasks.workunit.client.0.vm04.stdout:3/988: mkdir d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d149 0 2026-03-10T06:23:22.133 INFO:tasks.workunit.client.0.vm04.stdout:1/996: mknod d0/d112/d13d/c170 0 2026-03-10T06:23:22.135 INFO:tasks.workunit.client.0.vm04.stdout:2/980: link d1/dae/d11/d14/d9f/ddb/d94/de5/de9/l10b d1/db/d69/d74/d87/l12e 0 2026-03-10T06:23:22.135 INFO:tasks.workunit.client.0.vm04.stdout:7/986: symlink d4/df/d12/d13/d25/d30/d40/d14e/l162 0 2026-03-10T06:23:22.139 INFO:tasks.workunit.client.0.vm04.stdout:1/997: mknod d0/d3/d41/d4b/d5b/c171 0 2026-03-10T06:23:22.144 INFO:tasks.workunit.client.0.vm04.stdout:5/990: rename d4/d11/d7d/d38/d91/d55/d72/f113 to d4/d11/d7d/d38/f154 0 2026-03-10T06:23:22.155 INFO:tasks.workunit.client.0.vm04.stdout:3/989: getdents d4/da/df/d11/d5a/d5b/ddf/d21/d32/d4e/d149 0 2026-03-10T06:23:22.156 INFO:tasks.workunit.client.0.vm04.stdout:1/998: chown d0/d8/d46/db3/d125/c160 244475385 1 2026-03-10T06:23:22.157 INFO:tasks.workunit.client.0.vm04.stdout:5/991: symlink d4/d6/d50/l155 0 2026-03-10T06:23:22.166 INFO:tasks.workunit.client.0.vm04.stdout:2/981: truncate d1/dae/d11/d14/f1d 1358102 0 2026-03-10T06:23:22.171 INFO:tasks.workunit.client.0.vm04.stdout:7/987: link d4/df/d12/d13/l124 d4/df/d12/d13/d25/d8f/l163 0 2026-03-10T06:23:22.172 INFO:tasks.workunit.client.0.vm04.stdout:3/990: symlink d4/da/df/d11/d5a/d5b/ddf/d89/d133/l14a 0 2026-03-10T06:23:22.177 INFO:tasks.workunit.client.0.vm04.stdout:7/988: dread d4/df/d12/d13/d8b/fa5 [0,4194304] 0 2026-03-10T06:23:22.182 INFO:tasks.workunit.client.0.vm04.stdout:3/991: dread - d4/da/df/d11/d5a/d5b/ddf/d21/d2c/f6b zero size 2026-03-10T06:23:22.194 INFO:tasks.workunit.client.0.vm04.stdout:3/992: dread d4/da/df/d11/d5a/d5b/dff/ff8 [0,4194304] 0 2026-03-10T06:23:22.200 INFO:tasks.workunit.client.0.vm04.stdout:5/992: rename d4/d6/f93 to 
d4/d11/d7d/d38/d91/f156 0 2026-03-10T06:23:22.201 INFO:tasks.workunit.client.0.vm04.stdout:1/999: getdents d0/d8/d46/de4/dec 0 2026-03-10T06:23:22.216 INFO:tasks.workunit.client.0.vm04.stdout:3/993: dwrite d4/da/df/d11/d5a/d5b/ddf/f45 [0,4194304] 0 2026-03-10T06:23:22.218 INFO:tasks.workunit.client.0.vm04.stdout:3/994: chown d4/d6/d99/f80 28 1 2026-03-10T06:23:22.228 INFO:tasks.workunit.client.0.vm04.stdout:5/993: fsync d4/d11/d7d/d38/d91/f74 0 2026-03-10T06:23:22.241 INFO:tasks.workunit.client.0.vm04.stdout:3/995: mknod d4/d6/dc/c14b 0 2026-03-10T06:23:22.241 INFO:tasks.workunit.client.0.vm04.stdout:5/994: symlink d4/d6/d80/de5/d114/l157 0 2026-03-10T06:23:22.246 INFO:tasks.workunit.client.0.vm04.stdout:7/989: getdents d4 0 2026-03-10T06:23:22.254 INFO:tasks.workunit.client.0.vm04.stdout:2/982: rename d1/db/f36 to d1/f12f 0 2026-03-10T06:23:22.260 INFO:tasks.workunit.client.0.vm04.stdout:5/995: unlink d4/d6/le8 0 2026-03-10T06:23:22.263 INFO:tasks.workunit.client.0.vm04.stdout:3/996: dwrite d4/da/df/d11/d5a/f8b [0,4194304] 0 2026-03-10T06:23:22.269 INFO:tasks.workunit.client.0.vm04.stdout:7/990: dread d4/df/d12/dd4/fe1 [0,4194304] 0 2026-03-10T06:23:22.269 INFO:tasks.workunit.client.0.vm04.stdout:7/991: dread - d4/f135 zero size 2026-03-10T06:23:22.297 INFO:tasks.workunit.client.0.vm04.stdout:3/997: rename d4/da/df/d11/d50/l65 to d4/da/df/d11/d5a/d5b/ddf/d21/d32/d39/l14c 0 2026-03-10T06:23:22.301 INFO:tasks.workunit.client.0.vm04.stdout:7/992: readlink d4/df/d12/d13/d25/l2b 0 2026-03-10T06:23:22.302 INFO:tasks.workunit.client.0.vm04.stdout:5/996: dwrite d4/d6/d37/f10b [0,4194304] 0 2026-03-10T06:23:22.302 INFO:tasks.workunit.client.0.vm04.stdout:2/983: rename d1/dae/d2c/d37/d40/f11b to d1/dbf/f130 0 2026-03-10T06:23:22.327 INFO:tasks.workunit.client.0.vm04.stdout:5/997: write d4/d11/d7d/d38/d91/d55/db1/fb4 [4609157,20565] 0 2026-03-10T06:23:22.328 INFO:tasks.workunit.client.0.vm04.stdout:7/993: mknod d4/df/d12/d13/d12f/d148/c164 0 2026-03-10T06:23:22.386 
INFO:tasks.workunit.client.0.vm04.stdout:2/984: dwrite d1/dae/d11/d14/f45 [4194304,4194304] 0 2026-03-10T06:23:22.418 INFO:tasks.workunit.client.0.vm04.stdout:7/994: truncate d4/df/d12/d13/d25/d28/d3a/f73 1055591 0 2026-03-10T06:23:22.422 INFO:tasks.workunit.client.0.vm04.stdout:3/998: sync 2026-03-10T06:23:22.424 INFO:tasks.workunit.client.0.vm04.stdout:2/985: rename d1/dae/d2c/d37/f52 to d1/dae/d2c/d37/f131 0 2026-03-10T06:23:22.444 INFO:tasks.workunit.client.0.vm04.stdout:7/995: mkdir d4/df/d12/d13/d25/d28/d3a/d100/d13d/d165 0 2026-03-10T06:23:22.447 INFO:tasks.workunit.client.0.vm04.stdout:5/998: creat d4/d11/d7d/d38/f158 x:0 0 0 2026-03-10T06:23:22.449 INFO:tasks.workunit.client.0.vm04.stdout:2/986: mknod d1/dae/d11/d14/d9f/ddb/df3/c132 0 2026-03-10T06:23:22.451 INFO:tasks.workunit.client.0.vm04.stdout:3/999: link d4/d6/dc/l4d d4/d6/d99/d119/l14d 0 2026-03-10T06:23:22.468 INFO:tasks.workunit.client.0.vm04.stdout:2/987: sync 2026-03-10T06:23:22.469 INFO:tasks.workunit.client.0.vm04.stdout:7/996: sync 2026-03-10T06:23:22.469 INFO:tasks.workunit.client.0.vm04.stdout:5/999: sync 2026-03-10T06:23:22.479 INFO:tasks.workunit.client.0.vm04.stdout:2/988: unlink d1/dae/d11/l7b 0 2026-03-10T06:23:22.481 INFO:tasks.workunit.client.0.vm04.stdout:7/997: creat d4/df/d12/d13/d25/d28/d3a/f166 x:0 0 0 2026-03-10T06:23:22.485 INFO:tasks.workunit.client.0.vm04.stdout:7/998: mkdir d4/df/d12/d13/d25/d28/d3a/d129/d167 0 2026-03-10T06:23:22.487 INFO:tasks.workunit.client.0.vm04.stdout:7/999: dread - d4/df/d12/d13/d25/d28/d3a/d100/d13d/f14b zero size 2026-03-10T06:23:22.490 INFO:tasks.workunit.client.0.vm04.stdout:2/989: symlink d1/db/d69/d74/d87/dcf/d8f/d48/d67/db3/l133 0 2026-03-10T06:23:22.501 INFO:tasks.workunit.client.0.vm04.stdout:2/990: symlink d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/l134 0 2026-03-10T06:23:22.504 INFO:tasks.workunit.client.0.vm04.stdout:2/991: chown d1/dae/d11/fc3 1024 1 2026-03-10T06:23:22.516 INFO:tasks.workunit.client.0.vm04.stdout:2/992: dread d1/db/d9b/fa3 
[0,4194304] 0 2026-03-10T06:23:22.520 INFO:tasks.workunit.client.0.vm04.stdout:2/993: write d1/db/d69/d74/d87/dcf/d8f/d35/d54/f9a [1318661,87699] 0 2026-03-10T06:23:22.524 INFO:tasks.workunit.client.0.vm04.stdout:2/994: dread - d1/dae/d2c/d37/d40/dfa/f116 zero size 2026-03-10T06:23:22.531 INFO:tasks.workunit.client.0.vm04.stdout:2/995: chown d1/db/d69/d74/d87/dcf/d8f/d35/d54/d5d/ff5 9657 1 2026-03-10T06:23:22.537 INFO:tasks.workunit.client.0.vm04.stdout:2/996: mknod d1/dae/d2c/d37/c135 0 2026-03-10T06:23:22.540 INFO:tasks.workunit.client.0.vm04.stdout:2/997: write d1/dae/d11/d14/d4e/fff [546380,13514] 0 2026-03-10T06:23:22.547 INFO:tasks.workunit.client.0.vm04.stdout:2/998: symlink d1/dae/d2c/d37/l136 0 2026-03-10T06:23:22.552 INFO:tasks.workunit.client.0.vm04.stdout:2/999: creat d1/dae/d11/d14/d9f/ddb/d94/f137 x:0 0 0 2026-03-10T06:23:22.573 INFO:tasks.workunit.client.0.vm04.stderr:+ rm -rf -- ./tmp.2mByBn89Ip 2026-03-10T06:23:23.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:23 vm06.local ceph-mon[58974]: pgmap v40: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 33 MiB/s rd, 84 MiB/s wr, 205 op/s 2026-03-10T06:23:23.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:23 vm04.local ceph-mon[51058]: pgmap v40: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 33 MiB/s rd, 84 MiB/s wr, 205 op/s 2026-03-10T06:23:24.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:24 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:24.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:24 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:25 vm06.local ceph-mon[58974]: pgmap 
v41: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 33 MiB/s rd, 84 MiB/s wr, 205 op/s 2026-03-10T06:23:25.930 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:25 vm04.local ceph-mon[51058]: pgmap v41: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 33 MiB/s rd, 84 MiB/s wr, 205 op/s 2026-03-10T06:23:28.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:27 vm06.local ceph-mon[58974]: pgmap v42: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 46 MiB/s rd, 106 MiB/s wr, 302 op/s 2026-03-10T06:23:28.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:27 vm04.local ceph-mon[51058]: pgmap v42: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 46 MiB/s rd, 106 MiB/s wr, 302 op/s 2026-03-10T06:23:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:28 vm06.local ceph-mon[58974]: pgmap v43: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 29 MiB/s rd, 68 MiB/s wr, 199 op/s 2026-03-10T06:23:29.150 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:28 vm04.local ceph-mon[51058]: pgmap v43: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 29 MiB/s rd, 68 MiB/s wr, 199 op/s 2026-03-10T06:23:30.796 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T06:23:30.796 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-10T06:23:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:31 vm04.local ceph-mon[51058]: pgmap v44: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 29 MiB/s rd, 68 MiB/s wr, 255 op/s 2026-03-10T06:23:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:31 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:31 vm04.local ceph-mon[51058]: 
from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:31 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:31.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:31 vm06.local ceph-mon[58974]: pgmap v44: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 29 MiB/s rd, 68 MiB/s wr, 255 op/s 2026-03-10T06:23:31.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:31 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:31.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:31 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:31.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:31 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:33.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:33 vm04.local ceph-mon[51058]: pgmap v45: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 152 op/s 2026-03-10T06:23:33.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:33.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:33 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:33 vm06.local ceph-mon[58974]: pgmap v45: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 152 op/s 2026-03-10T06:23:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 
2026-03-10T06:23:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:33 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:34.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:34 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:34.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:34 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:34.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:34 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:34.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:34 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: pgmap v46: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 152 op/s 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T06:23:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: Upgrade: Finalizing container_image settings 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local 
ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:23:35.868 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 
2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 
2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: Upgrade: Complete! 2026-03-10T06:23:35.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:35 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: pgmap v46: 65 pgs: 65 active+clean; 2.2 GiB data, 8.1 GiB used, 112 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 152 op/s 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local 
ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: 
from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: Upgrade: Finalizing container_image settings 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 
192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: 
from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:23:35.929 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: 
dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: 
dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: Upgrade: Complete! 
2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:35.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:35 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:37 vm06.local ceph-mon[58974]: pgmap v47: 65 pgs: 65 active+clean; 1.7 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 202 op/s 2026-03-10T06:23:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:37 vm04.local ceph-mon[51058]: pgmap v47: 65 pgs: 65 active+clean; 1.7 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 14 MiB/s rd, 23 MiB/s wr, 202 op/s 2026-03-10T06:23:39.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:38 vm06.local ceph-mon[58974]: pgmap v48: 65 pgs: 65 active+clean; 1.7 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 651 KiB/s rd, 718 KiB/s wr, 105 op/s 2026-03-10T06:23:39.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:38 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:39.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:38 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:39.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:38 vm04.local ceph-mon[51058]: pgmap v48: 65 pgs: 65 active+clean; 1.7 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 651 KiB/s rd, 718 KiB/s wr, 105 op/s 2026-03-10T06:23:39.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:38 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:39.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:38 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:40.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 -- 192.168.123.104:0/706771089 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1990071980 msgr2=0x7f1990071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:40.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 --2- 192.168.123.104:0/706771089 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1990071980 0x7f1990071d90 secure :-1 s=READY pgs=340 cs=0 l=1 rev1=1 crypto rx=0x7f19800077e0 tx=0x7f1980007af0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 -- 192.168.123.104:0/706771089 shutdown_connections 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 --2- 192.168.123.104:0/706771089 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f19900770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 --2- 192.168.123.104:0/706771089 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1990071980 0x7f1990071d90 unknown :-1 s=CLOSED pgs=340 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.543+0000 7f1997225700 1 -- 192.168.123.104:0/706771089 >> 192.168.123.104:0/706771089 conn(0x7f199006d1a0 msgr2=0x7f199006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:40.545 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 -- 192.168.123.104:0/706771089 shutdown_connections 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 -- 192.168.123.104:0/706771089 wait complete. 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 Processor -- start 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 -- start start 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f1990082ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 0x7f199012df20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f19900835e0 con 0x7f19900830e0 2026-03-10T06:23:40.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.544+0000 7f1997225700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1990083750 con 0x7f1990072360 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f1994fc1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f1990082ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:40.548 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f1994fc1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f1990082ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:59448/0 (socket says 192.168.123.104:59448) 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f1994fc1700 1 -- 192.168.123.104:0/2666418806 learned_addr learned my addr 192.168.123.104:0/2666418806 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f198ffff700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 0x7f199012df20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f198ffff700 1 -- 192.168.123.104:0/2666418806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 msgr2=0x7f1990082ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f198ffff700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f1990082ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f198ffff700 1 -- 192.168.123.104:0/2666418806 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1980007430 con 0x7f19900830e0 2026-03-10T06:23:40.548 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.545+0000 7f198ffff700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 0x7f199012df20 secure :-1 s=READY pgs=341 cs=0 l=1 rev1=1 crypto rx=0x7f1988009fd0 tx=0x7f198800d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.546+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f198800de40 con 0x7f19900830e0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.546+0000 7f1997225700 1 -- 192.168.123.104:0/2666418806 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f199012e4c0 con 0x7f19900830e0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.547+0000 7f1997225700 1 -- 192.168.123.104:0/2666418806 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f199012e9e0 con 0x7f19900830e0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.547+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f198800f040 con 0x7f19900830e0 2026-03-10T06:23:40.548 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.547+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1988015610 con 0x7f19900830e0 2026-03-10T06:23:40.549 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.548+0000 7f1997225700 1 -- 192.168.123.104:0/2666418806 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f197c005320 con 0x7f19900830e0 2026-03-10T06:23:40.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.549+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 32) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f1988004ad0 con 0x7f19900830e0 2026-03-10T06:23:40.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.549+0000 7f198dffb700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 0x7f1978079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:40.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.549+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f1988099510 con 0x7f19900830e0 2026-03-10T06:23:40.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.549+0000 7f1994fc1700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 0x7f1978079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:40.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.550+0000 7f1994fc1700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 0x7f1978079b70 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f1980007400 tx=0x7f1980011730 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:40.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.554+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+191525 (secure 0 0 0) 0x7f1988062090 con 0x7f19900830e0 2026-03-10T06:23:40.692 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.692+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f19880979d0 con 0x7f19900830e0 2026-03-10T06:23:40.716 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.714+0000 7f1997225700 1 -- 192.168.123.104:0/2666418806 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f197c000bf0 con 0x7f19780776c0 2026-03-10T06:23:40.716 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.715+0000 7f198dffb700 1 -- 192.168.123.104:0/2666418806 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f197c000bf0 con 0x7f19780776c0 2026-03-10T06:23:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.719+0000 7f19777fe700 1 -- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 msgr2=0x7f1978079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.719+0000 7f19777fe700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 0x7f1978079b70 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f1980007400 tx=0x7f1980011730 comp rx=0 tx=0).stop 2026-03-10T06:23:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.719+0000 7f19777fe700 1 -- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 msgr2=0x7f199012df20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:40.721 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.719+0000 7f19777fe700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 0x7f199012df20 secure :-1 s=READY pgs=341 cs=0 l=1 rev1=1 crypto rx=0x7f1988009fd0 tx=0x7f198800d3b0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 -- 192.168.123.104:0/2666418806 shutdown_connections 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1990072360 0x7f1990082ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f19780776c0 0x7f1978079b70 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 --2- 192.168.123.104:0/2666418806 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19900830e0 0x7f199012df20 unknown :-1 s=CLOSED pgs=341 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 -- 192.168.123.104:0/2666418806 >> 192.168.123.104:0/2666418806 conn(0x7f199006d1a0 msgr2=0x7f199006e570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 -- 192.168.123.104:0/2666418806 shutdown_connections 2026-03-10T06:23:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:40.721+0000 7f19777fe700 1 -- 
192.168.123.104:0/2666418806 wait complete. 2026-03-10T06:23:40.836 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | length == 1'"'"'' 2026-03-10T06:23:41.097 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 -- 192.168.123.104:0/1236336937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b810c8f0 msgr2=0x7f27b810ccc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 --2- 192.168.123.104:0/1236336937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b810c8f0 0x7f27b810ccc0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f27b000d3f0 tx=0x7f27b000d700 comp rx=0 tx=0).stop 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 -- 192.168.123.104:0/1236336937 shutdown_connections 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 --2- 192.168.123.104:0/1236336937 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b8071e40 0x7f27b80722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 --2- 192.168.123.104:0/1236336937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b810c8f0 0x7f27b810ccc0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 -- 192.168.123.104:0/1236336937 >> 192.168.123.104:0/1236336937 conn(0x7f27b806c6c0 msgr2=0x7f27b806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 -- 192.168.123.104:0/1236336937 shutdown_connections 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.530+0000 7f27bd563700 1 -- 192.168.123.104:0/1236336937 wait complete. 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 Processor -- start 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 -- start start 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b8071e40 0x7f27b8137760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 0x7f27b8132b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27b81330c0 con 0x7f27b8071e40 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.531+0000 7f27bd563700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27b8133230 con 0x7f27b8132710 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 0x7f27b8132b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 0x7f27b8132b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:59454/0 (socket says 192.168.123.104:59454) 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 -- 192.168.123.104:0/2817169892 learned_addr learned my addr 192.168.123.104:0/2817169892 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 -- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b8071e40 msgr2=0x7f27b8137760 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b8071e40 0x7f27b8137760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 -- 192.168.123.104:0/2817169892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27b0007ed0 con 0x7f27b8132710 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.532+0000 7f27b77fe700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f27b8132710 0x7f27b8132b80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f27a800b770 tx=0x7f27a800bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:41.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.533+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f27a800f820 con 0x7f27b8132710 2026-03-10T06:23:41.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.533+0000 7f27bd563700 1 -- 192.168.123.104:0/2817169892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27b81334c0 con 0x7f27b8132710 2026-03-10T06:23:41.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.533+0000 7f27bd563700 1 -- 192.168.123.104:0/2817169892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27b807f030 con 0x7f27b8132710 2026-03-10T06:23:41.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.534+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f27a800fe60 con 0x7f27b8132710 2026-03-10T06:23:41.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.534+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f27a800d610 con 0x7f27b8132710 2026-03-10T06:23:41.535 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.534+0000 7f27bd563700 1 -- 192.168.123.104:0/2817169892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f27a4005320 con 0x7f27b8132710 2026-03-10T06:23:41.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.537+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f27a800f980 con 0x7f27b8132710 2026-03-10T06:23:41.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.537+0000 7f27b57fa700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 0x7f27a0079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:41.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.537+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f27a80992c0 con 0x7f27b8132710 2026-03-10T06:23:41.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.538+0000 7f27b7fff700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 0x7f27a0079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:41.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.539+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f27a8062070 con 0x7f27b8132710 2026-03-10T06:23:41.552 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.551+0000 7f27b7fff700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 0x7f27a0079b10 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f27b000db80 tx=0x7f27b0006040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:41.766 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.765+0000 7f27bd563700 1 -- 
192.168.123.104:0/2817169892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f27a4005cc0 con 0x7f27b8132710 2026-03-10T06:23:41.767 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.766+0000 7f27b57fa700 1 -- 192.168.123.104:0/2817169892 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f27a80617c0 con 0x7f27b8132710 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.772+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 msgr2=0x7f27a0079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.772+0000 7f279effd700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 0x7f27a0079b10 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f27b000db80 tx=0x7f27b0006040 comp rx=0 tx=0).stop 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.772+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 msgr2=0x7f27b8132b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.772+0000 7f279effd700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 0x7f27b8132b80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f27a800b770 tx=0x7f27a800bb30 comp rx=0 tx=0).stop 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 shutdown_connections 2026-03-10T06:23:41.773 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f27a0077660 0x7f27a0079b10 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b8071e40 0x7f27b8137760 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:41.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 --2- 192.168.123.104:0/2817169892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f27b8132710 0x7f27b8132b80 secure :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f27a800b770 tx=0x7f27a800bb30 comp rx=0 tx=0).stop 2026-03-10T06:23:41.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 >> 192.168.123.104:0/2817169892 conn(0x7f27b806c6c0 msgr2=0x7f27b8070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:41.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 shutdown_connections 2026-03-10T06:23:41.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:41.773+0000 7f279effd700 1 -- 192.168.123.104:0/2817169892 wait complete. 
2026-03-10T06:23:41.784 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:23:41.835 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | keys'"'"' | grep $sha1' 2026-03-10T06:23:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:41 vm06.local ceph-mon[58974]: pgmap v49: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 170 op/s 2026-03-10T06:23:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:41 vm06.local ceph-mon[58974]: mgrmap e33: vm04.exdvdb(active, since 92s), standbys: vm06.wwotdr 2026-03-10T06:23:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:41 vm06.local ceph-mon[58974]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:42.003 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:41 vm04.local ceph-mon[51058]: pgmap v49: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 170 op/s 2026-03-10T06:23:42.004 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:41 vm04.local ceph-mon[51058]: mgrmap e33: vm04.exdvdb(active, since 92s), standbys: vm06.wwotdr 2026-03-10T06:23:42.004 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:41 vm04.local ceph-mon[51058]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:42.057 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:42.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.420+0000 7f73e5ee2700 1 -- 
192.168.123.104:0/1619118731 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e010c8b0 msgr2=0x7f73e010cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:42.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.420+0000 7f73e5ee2700 1 --2- 192.168.123.104:0/1619118731 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e010c8b0 0x7f73e010cc80 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f73d0007780 tx=0x7f73d0007a90 comp rx=0 tx=0).stop 2026-03-10T06:23:42.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1619118731 shutdown_connections 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 --2- 192.168.123.104:0/1619118731 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e0071e40 0x7f73e00722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 --2- 192.168.123.104:0/1619118731 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e010c8b0 0x7f73e010cc80 unknown :-1 s=CLOSED pgs=342 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1619118731 >> 192.168.123.104:0/1619118731 conn(0x7f73e006c6c0 msgr2=0x7f73e006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1619118731 shutdown_connections 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1619118731 wait complete. 
2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 Processor -- start 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- start start 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e0071e40 0x7f73e007cd50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73e0083dc0 con 0x7f73e0071e40 2026-03-10T06:23:42.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73e5ee2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73e00818d0 con 0x7f73e007d290 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73df7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e0071e40 0x7f73e007cd50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73deffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73deffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:59480/0 (socket says 192.168.123.104:59480) 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.421+0000 7f73deffd700 1 -- 192.168.123.104:0/1880414861 learned_addr learned my addr 192.168.123.104:0/1880414861 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.422+0000 7f73deffd700 1 -- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e0071e40 msgr2=0x7f73e007cd50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.422+0000 7f73deffd700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e0071e40 0x7f73e007cd50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.422+0000 7f73deffd700 1 -- 192.168.123.104:0/1880414861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73d0007430 con 0x7f73e007d290 2026-03-10T06:23:42.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.422+0000 7f73deffd700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f73d800c390 tx=0x7f73d800c6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:42.424 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.422+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73d800e030 con 0x7f73e007d290 2026-03-10T06:23:42.424 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.423+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1880414861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73e0081b50 con 0x7f73e007d290 2026-03-10T06:23:42.424 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.423+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1880414861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73e00820a0 con 0x7f73e007d290 2026-03-10T06:23:42.424 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.424+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f73d800f040 con 0x7f73e007d290 2026-03-10T06:23:42.424 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.424+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73d8014650 con 0x7f73e007d290 2026-03-10T06:23:42.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.425+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1880414861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73cc005320 con 0x7f73e007d290 2026-03-10T06:23:42.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.425+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f73d8009110 con 0x7f73e007d290 2026-03-10T06:23:42.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.426+0000 7f73dcff9700 1 --2- 
192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 0x7f73c8079a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:42.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.427+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f73d8099390 con 0x7f73e007d290 2026-03-10T06:23:42.428 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.427+0000 7f73df7fe700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 0x7f73c8079a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:42.428 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.428+0000 7f73df7fe700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 0x7f73c8079a40 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f73d0007e60 tx=0x7f73d00058e0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:42.431 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.430+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f73d80620c0 con 0x7f73e007d290 2026-03-10T06:23:42.625 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.624+0000 7f73e5ee2700 1 -- 192.168.123.104:0/1880414861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f73cc006200 con 0x7f73e007d290 2026-03-10T06:23:42.626 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.625+0000 7f73dcff9700 1 -- 192.168.123.104:0/1880414861 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f73d8061810 con 0x7f73e007d290 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.629+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 msgr2=0x7f73c8079a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.629+0000 7f73c67fc700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 0x7f73c8079a40 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f73d0007e60 tx=0x7f73d00058e0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.629+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 msgr2=0x7f73e007d700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.629+0000 7f73c67fc700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f73d800c390 tx=0x7f73d800c6a0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 shutdown_connections 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f73c8077590 
0x7f73c8079a40 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73e0071e40 0x7f73e007cd50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 --2- 192.168.123.104:0/1880414861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73e007d290 0x7f73e007d700 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:42.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 >> 192.168.123.104:0/1880414861 conn(0x7f73e006c6c0 msgr2=0x7f73e00708e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:42.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 shutdown_connections 2026-03-10T06:23:42.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:42.630+0000 7f73c67fc700 1 -- 192.168.123.104:0/1880414861 wait complete. 
2026-03-10T06:23:42.640 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T06:23:42.693 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 2'"'"'' 2026-03-10T06:23:42.894 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:42.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:42 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/2817169892' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:42.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:42 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/1880414861' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:43.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:42 vm06.local ceph-mon[58974]: from='client.? 192.168.123.104:0/2817169892' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:43.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:42 vm06.local ceph-mon[58974]: from='client.? 
192.168.123.104:0/1880414861' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:43.316 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.315+0000 7fe4336ba700 1 -- 192.168.123.104:0/1675425949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c071e40 msgr2=0x7fe42c0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:43.316 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.315+0000 7fe4336ba700 1 --2- 192.168.123.104:0/1675425949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c071e40 0x7fe42c0722b0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fe424009230 tx=0x7fe424009260 comp rx=0 tx=0).stop 2026-03-10T06:23:43.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 -- 192.168.123.104:0/1675425949 shutdown_connections 2026-03-10T06:23:43.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 --2- 192.168.123.104:0/1675425949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c071e40 0x7fe42c0722b0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 --2- 192.168.123.104:0/1675425949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c10c8b0 0x7fe42c10cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 -- 192.168.123.104:0/1675425949 >> 192.168.123.104:0/1675425949 conn(0x7fe42c06c6c0 msgr2=0x7fe42c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:43.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 -- 192.168.123.104:0/1675425949 shutdown_connections 2026-03-10T06:23:43.318 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.316+0000 7fe4336ba700 1 -- 192.168.123.104:0/1675425949 wait complete. 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 Processor -- start 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 -- start start 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c10c8b0 0x7fe42c07ce30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe42c0819b0 con 0x7fe42c07d370 2026-03-10T06:23:43.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe4336ba700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe42c081b20 con 0x7fe42c10c8b0 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:38440/0 (socket says 192.168.123.104:38440) 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 -- 192.168.123.104:0/445852843 learned_addr learned my addr 192.168.123.104:0/445852843 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe431456700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c10c8b0 0x7fe42c07ce30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 -- 192.168.123.104:0/445852843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c10c8b0 msgr2=0x7fe42c07ce30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c10c8b0 0x7fe42c07ce30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.317+0000 7fe430c55700 1 -- 192.168.123.104:0/445852843 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe424008ee0 con 0x7fe42c07d370 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.318+0000 7fe430c55700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 secure :-1 s=READY pgs=343 cs=0 l=1 rev1=1 crypto 
rx=0x7fe42400c9a0 tx=0x7fe424003fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:43.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.318+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe42401d070 con 0x7fe42c07d370 2026-03-10T06:23:43.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.318+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe42c081da0 con 0x7fe42c07d370 2026-03-10T06:23:43.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.318+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe42c082290 con 0x7fe42c07d370 2026-03-10T06:23:43.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.320+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe424007cb0 con 0x7fe42c07d370 2026-03-10T06:23:43.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.320+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe42400e9c0 con 0x7fe42c07d370 2026-03-10T06:23:43.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.320+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fe42400ebe0 con 0x7fe42c07d370 2026-03-10T06:23:43.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.322+0000 7fe4227fc700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 0x7fe418079be0 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:43.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.323+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fe424012070 con 0x7fe42c07d370 2026-03-10T06:23:43.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.323+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe410005320 con 0x7fe42c07d370 2026-03-10T06:23:43.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.328+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fe424064af0 con 0x7fe42c07d370 2026-03-10T06:23:43.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.343+0000 7fe431456700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 0x7fe418079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:43.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.361+0000 7fe431456700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 0x7fe418079be0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fe42800b3c0 tx=0x7fe42800d040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:43.580 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.577+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": 
"versions"} v 0) v1 -- 0x7fe410006200 con 0x7fe42c07d370 2026-03-10T06:23:43.581 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.580+0000 7fe4227fc700 1 -- 192.168.123.104:0/445852843 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fe424026070 con 0x7fe42c07d370 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 msgr2=0x7fe418079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 0x7fe418079be0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fe42800b3c0 tx=0x7fe42800d040 comp rx=0 tx=0).stop 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 msgr2=0x7fe42c07d7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 secure :-1 s=READY pgs=343 cs=0 l=1 rev1=1 crypto rx=0x7fe42400c9a0 tx=0x7fe424003fa0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 shutdown_connections 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 --2- 192.168.123.104:0/445852843 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe42c10c8b0 0x7fe42c07ce30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe418077730 0x7fe418079be0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.585+0000 7fe4336ba700 1 --2- 192.168.123.104:0/445852843 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe42c07d370 0x7fe42c07d7e0 unknown :-1 s=CLOSED pgs=343 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.586+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 >> 192.168.123.104:0/445852843 conn(0x7fe42c06c6c0 msgr2=0x7fe42c070840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.586+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 shutdown_connections 2026-03-10T06:23:43.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:43.586+0000 7fe4336ba700 1 -- 192.168.123.104:0/445852843 wait complete. 
2026-03-10T06:23:43.601 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:23:43.731 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '"'"'.up_to_date | length == 2'"'"'' 2026-03-10T06:23:43.917 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:43.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:43 vm04.local ceph-mon[51058]: pgmap v50: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 974 KiB/s rd, 1.1 MiB/s wr, 114 op/s 2026-03-10T06:23:43.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:43 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/445852843' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:43 vm06.local ceph-mon[58974]: pgmap v50: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 974 KiB/s rd, 1.1 MiB/s wr, 114 op/s 2026-03-10T06:23:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:43 vm06.local ceph-mon[58974]: from='client.? 
192.168.123.104:0/445852843' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 -- 192.168.123.104:0/865847026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac390 msgr2=0x7fdc980a4ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 --2- 192.168.123.104:0/865847026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac390 0x7fdc980a4ca0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fdc9c009b00 tx=0x7fdc9c009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 -- 192.168.123.104:0/865847026 shutdown_connections 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 --2- 192.168.123.104:0/865847026 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac760 0x7fdc980a51e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 --2- 192.168.123.104:0/865847026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac390 0x7fdc980a4ca0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 -- 192.168.123.104:0/865847026 >> 192.168.123.104:0/865847026 conn(0x7fdc9801a260 msgr2=0x7fdc9801a660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:44.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 -- 192.168.123.104:0/865847026 shutdown_connections 2026-03-10T06:23:44.326 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.325+0000 7fdca74a9700 1 -- 192.168.123.104:0/865847026 wait complete. 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 Processor -- start 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 -- start start 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac760 0x7fdc980b3120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc9814f830 con 0x7fdc980ac390 2026-03-10T06:23:44.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.326+0000 7fdca74a9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc9814f9a0 con 0x7fdc980ac760 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58454/0 (socket says 192.168.123.104:58454) 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 -- 192.168.123.104:0/2736569988 learned_addr learned my addr 192.168.123.104:0/2736569988 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca5ca6700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac760 0x7fdc980b3120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 -- 192.168.123.104:0/2736569988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac760 msgr2=0x7fdc980b3120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac760 0x7fdc980b3120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 -- 192.168.123.104:0/2736569988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc9c0097e0 con 0x7fdc980ac390 2026-03-10T06:23:44.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.327+0000 7fdca64a7700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto 
rx=0x7fdc9c00ba00 tx=0x7fdc9c00ba30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:44.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.328+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc9c01d070 con 0x7fdc980ac390 2026-03-10T06:23:44.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.328+0000 7fdca74a9700 1 -- 192.168.123.104:0/2736569988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc9814fc80 con 0x7fdc980ac390 2026-03-10T06:23:44.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.328+0000 7fdca74a9700 1 -- 192.168.123.104:0/2736569988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc981501d0 con 0x7fdc980ac390 2026-03-10T06:23:44.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.329+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdc9c00f460 con 0x7fdc980ac390 2026-03-10T06:23:44.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.329+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc9c021620 con 0x7fdc980ac390 2026-03-10T06:23:44.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.331+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fdc9c003a40 con 0x7fdc980ac390 2026-03-10T06:23:44.331 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.331+0000 7fdc977fe700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 0x7fdc8c079ac0 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:44.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.331+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fdc9c09a5b0 con 0x7fdc980ac390 2026-03-10T06:23:44.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.331+0000 7fdca74a9700 1 -- 192.168.123.104:0/2736569988 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdc84005320 con 0x7fdc980ac390 2026-03-10T06:23:44.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.334+0000 7fdca5ca6700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 0x7fdc8c079ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:44.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.336+0000 7fdca5ca6700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 0x7fdc8c079ac0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fdc980b40b0 tx=0x7fdc90009450 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:44.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.337+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fdc9c063360 con 0x7fdc980ac390 2026-03-10T06:23:44.538 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:44.536+0000 7fdca74a9700 1 -- 192.168.123.104:0/2736569988 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] 
-- mgr_command(tid 0: {"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7fdc84000c90 con 0x7fdc8c077610 2026-03-10T06:23:46.145 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:45 vm04.local ceph-mon[51058]: pgmap v51: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 974 KiB/s rd, 1.1 MiB/s wr, 114 op/s 2026-03-10T06:23:46.145 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:45 vm04.local ceph-mon[51058]: from='client.14720 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.362+0000 7fdc977fe700 1 -- 192.168.123.104:0/2736569988 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+5308 (secure 0 0 0) 0x7fdc84000c90 con 0x7fdc8c077610 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.366+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 msgr2=0x7fdc8c079ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.366+0000 7fdc957fa700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 0x7fdc8c079ac0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fdc980b40b0 tx=0x7fdc90009450 comp rx=0 tx=0).stop 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.366+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 msgr2=0x7fdc980b2be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.366+0000 7fdc957fa700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7fdc9c00ba00 tx=0x7fdc9c00ba30 comp rx=0 tx=0).stop 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.366+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 shutdown_connections 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.367+0000 7fdc957fa700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fdc8c077610 0x7fdc8c079ac0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.367+0000 7fdc957fa700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdc980ac390 0x7fdc980b2be0 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.367+0000 7fdc957fa700 1 --2- 192.168.123.104:0/2736569988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc980ac760 0x7fdc980b3120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:46.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.367+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 >> 192.168.123.104:0/2736569988 conn(0x7fdc9801a260 msgr2=0x7fdc980a3af0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:46.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:45 vm06.local ceph-mon[58974]: pgmap v51: 65 pgs: 65 active+clean; 1.3 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 974 KiB/s rd, 1.1 MiB/s wr, 114 op/s 
2026-03-10T06:23:46.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:45 vm06.local ceph-mon[58974]: from='client.14720 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:46.370 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.369+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 shutdown_connections 2026-03-10T06:23:46.370 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:46.369+0000 7fdc957fa700 1 -- 192.168.123.104:0/2736569988 wait complete. 2026-03-10T06:23:46.385 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:23:46.466 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T06:23:46.688 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:47.017 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:46 vm04.local ceph-mon[51058]: pgmap v52: 65 pgs: 65 active+clean; 767 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.5 MiB/s wr, 160 op/s 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 -- 192.168.123.104:0/1691860986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 msgr2=0x7f935010ee80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 --2- 192.168.123.104:0/1691860986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f935010ee80 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto 
rx=0x7f934c009b00 tx=0x7f934c009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 -- 192.168.123.104:0/1691860986 shutdown_connections 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 --2- 192.168.123.104:0/1691860986 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9350071b60 0x7f9350071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 --2- 192.168.123.104:0/1691860986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f935010ee80 unknown :-1 s=CLOSED pgs=345 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.157+0000 7f9357322700 1 -- 192.168.123.104:0/1691860986 >> 192.168.123.104:0/1691860986 conn(0x7f935006c6c0 msgr2=0x7f935006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 -- 192.168.123.104:0/1691860986 shutdown_connections 2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 -- 192.168.123.104:0/1691860986 wait complete. 
2026-03-10T06:23:47.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 Processor -- start 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 -- start start 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9350071b60 0x7f9350119560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9350114aa0 con 0x7f935010eab0 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.158+0000 7f9357322700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9350114c10 con 0x7f9350071b60 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.159+0000 7f93548bd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.159+0000 7f93548bd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:58482/0 (socket says 192.168.123.104:58482) 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.159+0000 7f93548bd700 1 -- 192.168.123.104:0/1364272098 learned_addr learned my addr 192.168.123.104:0/1364272098 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93548bd700 1 -- 192.168.123.104:0/1364272098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9350071b60 msgr2=0x7f9350119560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93548bd700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9350071b60 0x7f9350119560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93548bd700 1 -- 192.168.123.104:0/1364272098 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f934c0097e0 con 0x7f935010eab0 2026-03-10T06:23:47.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93548bd700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 secure :-1 s=READY pgs=346 cs=0 l=1 rev1=1 crypto rx=0x7f934000b700 tx=0x7f934000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:47.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93467fc700 1 -- 192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9340010840 con 0x7f935010eab0 2026-03-10T06:23:47.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93467fc700 1 -- 
192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9340010e80 con 0x7f935010eab0 2026-03-10T06:23:47.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f93467fc700 1 -- 192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f934000d590 con 0x7f935010eab0 2026-03-10T06:23:47.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f9357322700 1 -- 192.168.123.104:0/1364272098 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9350114e90 con 0x7f935010eab0 2026-03-10T06:23:47.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.160+0000 7f9357322700 1 -- 192.168.123.104:0/1364272098 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93501153e0 con 0x7f935010eab0 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.161+0000 7f9357322700 1 -- 192.168.123.104:0/1364272098 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f935004f2a0 con 0x7f935010eab0 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.164+0000 7f93467fc700 1 -- 192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f934000f3e0 con 0x7f935010eab0 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.164+0000 7f93467fc700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 0x7f933c079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.164+0000 7f93467fc700 1 -- 
192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f93400991c0 con 0x7f935010eab0 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.165+0000 7f93550be700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 0x7f933c079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:47.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.166+0000 7f93550be700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 0x7f933c079b10 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f934c00b5c0 tx=0x7f934c009f90 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:47.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.166+0000 7f93467fc700 1 -- 192.168.123.104:0/1364272098 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f9340061f70 con 0x7f935010eab0 2026-03-10T06:23:47.341 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.340+0000 7f9357322700 1 -- 192.168.123.104:0/1364272098 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f93500709c0 con 0x7f933c077660 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.350+0000 7f93467fc700 1 -- 192.168.123.104:0/1364272098 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f93500709c0 con 0x7f933c077660 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:NAME 
HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (35s) 14s ago 6m 15.5M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (6m) 14s ago 6m 8426k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (5m) 62s ago 5m 8648k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (6m) 14s ago 6m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (5m) 62s ago 5m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (17s) 14s ago 5m 40.4M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (3m) 14s ago 3m 267M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (3m) 14s ago 3m 16.6M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (3m) 62s ago 3m 16.5M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (3m) 62s ago 3m 263M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (106s) 14s ago 7m 612M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (83s) 62s ago 5m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 
0f98de364d6a 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (7m) 14s ago 7m 57.4M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (5m) 62s ago 5m 42.8M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (69s) 14s ago 6m 9198k - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (64s) 62s ago 5m 5368k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (5m) 14s ago 5m 297M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (5m) 14s ago 5m 296M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (4m) 14s ago 4m 236M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (4m) 62s ago 4m 320M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (4m) 62s ago 4m 263M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (4m) 62s ago 4m 313M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:23:47.351 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (45s) 14s ago 5m 46.9M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 msgr2=0x7f933c079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 0x7f933c079b10 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f934c00b5c0 tx=0x7f934c009f90 comp rx=0 tx=0).stop 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 msgr2=0x7f9350114560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 secure :-1 s=READY pgs=346 cs=0 l=1 rev1=1 crypto rx=0x7f934000b700 tx=0x7f934000bac0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 shutdown_connections 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9350071b60 0x7f9350119560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 --2- 192.168.123.104:0/1364272098 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f933c077660 0x7f933c079b10 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 --2- 192.168.123.104:0/1364272098 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f935010eab0 0x7f9350114560 unknown :-1 s=CLOSED pgs=346 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.353+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 >> 192.168.123.104:0/1364272098 conn(0x7f935006c6c0 msgr2=0x7f93500702b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.354+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 shutdown_connections 2026-03-10T06:23:47.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:47.354+0000 7f933bfff700 1 -- 192.168.123.104:0/1364272098 wait complete. 2026-03-10T06:23:47.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:46 vm06.local ceph-mon[58974]: pgmap v52: 65 pgs: 65 active+clean; 767 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.5 MiB/s wr, 160 op/s 2026-03-10T06:23:47.429 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T06:23:47.430 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:23:47.430 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs true' 2026-03-10T06:23:47.682 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.031+0000 7f38b691d700 1 -- 192.168.123.104:0/4234206631 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80a4f10 msgr2=0x7f38a80a52e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.031+0000 7f38b691d700 1 --2- 192.168.123.104:0/4234206631 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80a4f10 0x7f38a80a52e0 secure :-1 s=READY pgs=347 cs=0 l=1 rev1=1 crypto rx=0x7f38ac007780 tx=0x7f38ac00c050 comp rx=0 tx=0).stop 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- 192.168.123.104:0/4234206631 shutdown_connections 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 --2- 192.168.123.104:0/4234206631 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80a8e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 --2- 192.168.123.104:0/4234206631 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80a4f10 0x7f38a80a52e0 unknown :-1 s=CLOSED pgs=347 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.033 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- 192.168.123.104:0/4234206631 >> 192.168.123.104:0/4234206631 conn(0x7f38a801a270 msgr2=0x7f38a801a670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- 192.168.123.104:0/4234206631 shutdown_connections 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- 192.168.123.104:0/4234206631 wait complete. 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 Processor -- start 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- start start 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80d0170 0x7f38a80d05e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38a80d0b20 con 0x7f38a80d0170 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.032+0000 7f38b691d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38a80d0c90 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:58572/0 (socket says 192.168.123.104:58572) 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 -- 192.168.123.104:0/3414149613 learned_addr learned my addr 192.168.123.104:0/3414149613 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b511a700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80d0170 0x7f38a80d05e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 -- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80d0170 msgr2=0x7f38a80d05e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80d0170 0x7f38a80d05e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.035 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 -- 192.168.123.104:0/3414149613 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38ac007430 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b591b700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f38ac00afd0 tx=0x7f38ac00c900 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38ac00f050 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b691d700 1 -- 192.168.123.104:0/3414149613 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38a80d0ec0 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.033+0000 7f38b691d700 1 -- 192.168.123.104:0/3414149613 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38a8011930 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.034+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f38ac003710 con 0x7f38a80a58b0 2026-03-10T06:23:48.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.034+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38ac008250 con 
0x7f38a80a58b0 2026-03-10T06:23:48.036 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.035+0000 7f38b691d700 1 -- 192.168.123.104:0/3414149613 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3894005320 con 0x7f38a80a58b0 2026-03-10T06:23:48.037 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.036+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f38ac01a040 con 0x7f38a80a58b0 2026-03-10T06:23:48.037 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.036+0000 7f38a6ffd700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 0x7f389c079a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.037 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.036+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f38ac099d30 con 0x7f38a80a58b0 2026-03-10T06:23:48.037 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.037+0000 7f38b511a700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 0x7f389c079a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.037+0000 7f38b511a700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 0x7f389c079a40 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f38b00666b0 tx=0x7f38b0067150 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T06:23:48.040 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.040+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f38ac062ae0 con 0x7f38a80a58b0 2026-03-10T06:23:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.163+0000 7f38b691d700 1 -- 192.168.123.104:0/3414149613 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7f3894005cc0 con 0x7f38a80a58b0 2026-03-10T06:23:48.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.172+0000 7f38a6ffd700 1 -- 192.168.123.104:0/3414149613 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v37) v1 ==== 125+0+0 (secure 0 0 0) 0x7f38ac062230 con 0x7f38a80a58b0 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 -- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 msgr2=0x7f389c079a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 0x7f389c079a40 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f38b00666b0 tx=0x7f38b0067150 comp rx=0 tx=0).stop 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 -- 192.168.123.104:0/3414149613 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 msgr2=0x7f38a80d51c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:48.176
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f38ac00afd0 tx=0x7f38ac00c900 comp rx=0 tx=0).stop 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 -- 192.168.123.104:0/3414149613 shutdown_connections 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38a80a58b0 0x7f38a80d51c0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f389c077590 0x7f389c079a40 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 --2- 192.168.123.104:0/3414149613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38a80d0170 0x7f38a80d05e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.175+0000 7f38a4ff9700 1 -- 192.168.123.104:0/3414149613 >> 192.168.123.104:0/3414149613 conn(0x7f38a801a270 msgr2=0x7f38a80a2c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.176+0000 7f38a4ff9700 1 -- 192.168.123.104:0/3414149613 shutdown_connections 2026-03-10T06:23:48.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.176+0000 7f38a4ff9700 1 -- 
192.168.123.104:0/3414149613 wait complete. 2026-03-10T06:23:48.251 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T06:23:48.251 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:23:48.251 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T06:23:48.495 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:48 vm04.local ceph-mon[51058]: from='client.14724 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:48.495 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:48 vm04.local ceph-mon[51058]: from='client.? ' entity='client.admin' 2026-03-10T06:23:48.495 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:48 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:48.518 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:48.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:48 vm06.local ceph-mon[58974]: from='client.14724 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:48.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:48 vm06.local ceph-mon[58974]: from='client.? 
' entity='client.admin' 2026-03-10T06:23:48.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:48 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:48.763 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T06:23:48.763 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 7ff3c8d67700 1 -- 192.168.123.104:0/587470331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c406d7a0 msgr2=0x7ff3c406dc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 7ff3c8d67700 1 --2- 192.168.123.104:0/587470331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c406d7a0 0x7ff3c406dc10 secure :-1 s=READY pgs=348 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc00b3a0 tx=0x7ff3bc00b6b0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 7ff3c8d67700 1 -- 192.168.123.104:0/587470331 shutdown_connections 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 7ff3c8d67700 1 --2- 192.168.123.104:0/587470331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c406d7a0 0x7ff3c406dc10 unknown :-1 s=CLOSED pgs=348 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 7ff3c8d67700 1 --2- 192.168.123.104:0/587470331 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c410ed80 0x7ff3c406d260 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.963+0000 
7ff3c8d67700 1 -- 192.168.123.104:0/587470331 >> 192.168.123.104:0/587470331 conn(0x7ff3c406c830 msgr2=0x7ff3c4071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.964+0000 7ff3c8d67700 1 -- 192.168.123.104:0/587470331 shutdown_connections 2026-03-10T06:23:48.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.964+0000 7ff3c8d67700 1 -- 192.168.123.104:0/587470331 wait complete. 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.964+0000 7ff3c8d67700 1 Processor -- start 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.964+0000 7ff3c8d67700 1 -- start start 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.964+0000 7ff3c8d67700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c8d67700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c410ed80 0x7ff3c41a5450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c8d67700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3c41a5b30 con 0x7ff3c410ed80 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c8d67700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3c41a98c0 con 0x7ff3c406d7a0 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:58600/0 (socket says 192.168.123.104:58600) 2026-03-10T06:23:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 -- 192.168.123.104:0/2532365536 learned_addr learned my addr 192.168.123.104:0/2532365536 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 -- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c410ed80 msgr2=0x7ff3c41a5450 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c1d9b700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c410ed80 0x7ff3c41a5450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c410ed80 0x7ff3c41a5450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.965+0000 7ff3c259c700 1 -- 192.168.123.104:0/2532365536 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3bc00b050 con 0x7ff3c406d7a0 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3c259c700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff3b800eb10 tx=0x7ff3b800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3b800cca0 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3c8d67700 1 -- 192.168.123.104:0/2532365536 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3c41a9ac0 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3c8d67700 1 -- 192.168.123.104:0/2532365536 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3c41aa010 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff3b800ce00 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.966+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3b80189c0 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.967+0000 7ff3c8d67700 1 -- 192.168.123.104:0/2532365536 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3a4005320 con 0x7ff3c406d7a0 2026-03-10T06:23:48.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.968+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7ff3b8018b20 con 0x7ff3c406d7a0 2026-03-10T06:23:48.969 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.968+0000 7ff3b37fe700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 0x7ff3ac0799f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:48.969 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.968+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7ff3b8014070 con 0x7ff3c406d7a0 2026-03-10T06:23:48.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.969+0000 7ff3c1d9b700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 0x7ff3ac0799f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:48.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.969+0000 7ff3c1d9b700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 0x7ff3ac0799f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc009250 tx=0x7ff3bc006000 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:48.972 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:48.971+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 
<== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7ff3b8062700 con 0x7ff3c406d7a0 2026-03-10T06:23:49.104 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.103+0000 7ff3c8d67700 1 -- 192.168.123.104:0/2532365536 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7ff3a40059f0 con 0x7ff3c406d7a0 2026-03-10T06:23:49.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.107+0000 7ff3b37fe700 1 -- 192.168.123.104:0/2532365536 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v37) v1 ==== 155+0+0 (secure 0 0 0) 0x7ff3b8061e50 con 0x7ff3c406d7a0 2026-03-10T06:23:49.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.110+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 msgr2=0x7ff3ac0799f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:49.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.110+0000 7ff3b17fa700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 0x7ff3ac0799f0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff3bc009250 tx=0x7ff3bc006000 comp rx=0 tx=0).stop 2026-03-10T06:23:49.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.110+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 msgr2=0x7ff3c41a4f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:49.111 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.110+0000 7ff3b17fa700 1 --2- 192.168.123.104:0/2532365536 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff3b800eb10 tx=0x7ff3b800eed0 comp rx=0 tx=0).stop 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 shutdown_connections 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff3c406d7a0 0x7ff3c41a4f10 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7ff3ac077540 0x7ff3ac0799f0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 --2- 192.168.123.104:0/2532365536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff3c410ed80 0x7ff3c41a5450 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 >> 192.168.123.104:0/2532365536 conn(0x7ff3c406c830 msgr2=0x7ff3c4118960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 shutdown_connections 2026-03-10T06:23:49.112 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:49.111+0000 7ff3b17fa700 1 -- 192.168.123.104:0/2532365536 wait complete. 
2026-03-10T06:23:49.430 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T06:23:49.661 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:49.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:49 vm04.local ceph-mon[51058]: pgmap v53: 65 pgs: 65 active+clean; 767 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 110 op/s 2026-03-10T06:23:49.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:49 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:49.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:49 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:49.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:49 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:49 vm06.local ceph-mon[58974]: pgmap v53: 65 pgs: 65 active+clean; 767 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 110 op/s 2026-03-10T06:23:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:49 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:49 vm06.local ceph-mon[58974]: 
from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:49 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.009+0000 7f2eabfff700 1 -- 192.168.123.104:0/1764236850 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 msgr2=0x7f2ea40a4db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.009+0000 7f2eabfff700 1 --2- 192.168.123.104:0/1764236850 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40a4db0 secure :-1 s=READY pgs=349 cs=0 l=1 rev1=1 crypto rx=0x7f2ea0009b00 tx=0x7f2ea0009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.010+0000 7f2eabfff700 1 -- 192.168.123.104:0/1764236850 shutdown_connections 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.010+0000 7f2eabfff700 1 --2- 192.168.123.104:0/1764236850 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40a4db0 unknown :-1 s=CLOSED pgs=349 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.010+0000 7f2eabfff700 1 --2- 192.168.123.104:0/1764236850 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 0x7f2ea40a4400 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.010+0000 7f2eabfff700 1 -- 192.168.123.104:0/1764236850 >> 192.168.123.104:0/1764236850 conn(0x7f2ea401a220 msgr2=0x7f2ea401a620 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:23:50.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 -- 192.168.123.104:0/1764236850 shutdown_connections 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 -- 192.168.123.104:0/1764236850 wait complete. 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 Processor -- start 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 -- start start 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 0x7f2ea40af8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ea40afe00 con 0x7f2ea40a4940 2026-03-10T06:23:50.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.011+0000 7f2eabfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ea40aff70 con 0x7f2ea40ac3c0 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.013+0000 7f2eaaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.014 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.013+0000 7f2eaaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58504/0 (socket says 192.168.123.104:58504) 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.013+0000 7f2eaaffd700 1 -- 192.168.123.104:0/2444954540 learned_addr learned my addr 192.168.123.104:0/2444954540 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.013+0000 7f2eaa7fc700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 0x7f2ea40af8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eaaffd700 1 -- 192.168.123.104:0/2444954540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 msgr2=0x7f2ea40af8c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eaaffd700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 0x7f2ea40af8c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eaaffd700 1 -- 192.168.123.104:0/2444954540 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2ea00097e0 con 0x7f2ea40a4940 2026-03-10T06:23:50.014 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eaaffd700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 secure :-1 s=READY pgs=350 cs=0 l=1 rev1=1 crypto rx=0x7f2e9c00b700 tx=0x7f2e9c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:50.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e9c010840 con 0x7f2ea40a4940 2026-03-10T06:23:50.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eabfff700 1 -- 192.168.123.104:0/2444954540 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ea40b0250 con 0x7f2ea40a4940 2026-03-10T06:23:50.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.014+0000 7f2eabfff700 1 -- 192.168.123.104:0/2444954540 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ea4155460 con 0x7f2ea40a4940 2026-03-10T06:23:50.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.015+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2e9c010e80 con 0x7f2ea40a4940 2026-03-10T06:23:50.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.015+0000 7f2eabfff700 1 -- 192.168.123.104:0/2444954540 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2ea4004600 con 0x7f2ea40a4940 2026-03-10T06:23:50.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.015+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f2e9c00fea0 con 0x7f2ea40a4940 2026-03-10T06:23:50.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.017+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f2e9c00f870 con 0x7f2ea40a4940 2026-03-10T06:23:50.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.018+0000 7f2e93fff700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 0x7f2e94079a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.018+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f2e9c0992d0 con 0x7f2ea40a4940 2026-03-10T06:23:50.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.018+0000 7f2eaa7fc700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 0x7f2e94079a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.022+0000 7f2eaa7fc700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 0x7f2e94079a40 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2ea40b10b0 tx=0x7f2ea0005fb0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:50.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.023+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+191525 (secure 0 0 0) 0x7f2e9c061930 con 0x7f2ea40a4940 2026-03-10T06:23:50.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.163+0000 7f2eabfff700 1 -- 192.168.123.104:0/2444954540 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f2ea4005080 con 0x7f2ea40a4940 2026-03-10T06:23:50.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.164+0000 7f2e93fff700 1 -- 192.168.123.104:0/2444954540 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v37) v1 ==== 163+0+0 (secure 0 0 0) 0x7f2e9c061750 con 0x7f2ea40a4940 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.167+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 msgr2=0x7f2e94079a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.167+0000 7f2e91ffb700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 0x7f2e94079a40 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2ea40b10b0 tx=0x7f2ea0005fb0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.167+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 msgr2=0x7f2ea40b48c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.167+0000 7f2e91ffb700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 secure :-1 s=READY pgs=350 cs=0 l=1 
rev1=1 crypto rx=0x7f2e9c00b700 tx=0x7f2e9c00bac0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 shutdown_connections 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f2e94077590 0x7f2e94079a40 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ea40a4940 0x7f2ea40b48c0 unknown :-1 s=CLOSED pgs=350 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 --2- 192.168.123.104:0/2444954540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ea40ac3c0 0x7f2ea40af8c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 >> 192.168.123.104:0/2444954540 conn(0x7f2ea401a220 msgr2=0x7f2ea40b53a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:50.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.168+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 shutdown_connections 2026-03-10T06:23:50.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.169+0000 7f2e91ffb700 1 -- 192.168.123.104:0/2444954540 wait complete. 
2026-03-10T06:23:50.234 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T06:23:50.499 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:50.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.818+0000 7f6271257700 1 -- 192.168.123.104:0/950136528 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 msgr2=0x7f626c0ff620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.818+0000 7f6271257700 1 --2- 192.168.123.104:0/950136528 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c0ff620 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f625c009a60 tx=0x7f625c009d70 comp rx=0 tx=0).stop 2026-03-10T06:23:50.821 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.821+0000 7f6271257700 1 -- 192.168.123.104:0/950136528 shutdown_connections 2026-03-10T06:23:50.821 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.821+0000 7f6271257700 1 --2- 192.168.123.104:0/950136528 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c104040 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.821 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.821+0000 7f6271257700 1 --2- 192.168.123.104:0/950136528 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c0ff620 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.821 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.821+0000 7f6271257700 1 -- 192.168.123.104:0/950136528 >> 192.168.123.104:0/950136528 conn(0x7f626c0faca0 msgr2=0x7f626c0fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:50.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.823+0000 7f6271257700 1 -- 192.168.123.104:0/950136528 shutdown_connections 2026-03-10T06:23:50.823 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.823+0000 7f6271257700 1 -- 192.168.123.104:0/950136528 wait complete. 2026-03-10T06:23:50.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.823+0000 7f6271257700 1 Processor -- start 2026-03-10T06:23:50.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.824+0000 7f6271257700 1 -- start start 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f6271257700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c111780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f6271257700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c10c780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f6271257700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f626c10cd50 con 0x7f626c0ffb60 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f6271257700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f626c10cec0 con 0x7f626c0ff250 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626b7fe700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c10c780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c10c780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58526/0 (socket says 192.168.123.104:58526) 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626b7fe700 1 -- 192.168.123.104:0/1745167597 learned_addr learned my addr 192.168.123.104:0/1745167597 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626bfff700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c111780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626bfff700 1 -- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 msgr2=0x7f626c10c780 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626bfff700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c10c780 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626bfff700 1 -- 
192.168.123.104:0/1745167597 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f625c009710 con 0x7f626c0ff250 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.826+0000 7f626bfff700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c111780 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f625c000c00 tx=0x7f625c00f690 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:50.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.827+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625c01d070 con 0x7f626c0ff250 2026-03-10T06:23:50.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.827+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f625c00bb40 con 0x7f626c0ff250 2026-03-10T06:23:50.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.827+0000 7f6271257700 1 -- 192.168.123.104:0/1745167597 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f626c10d140 con 0x7f626c0ff250 2026-03-10T06:23:50.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.827+0000 7f6271257700 1 -- 192.168.123.104:0/1745167597 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f626c1af210 con 0x7f626c0ff250 2026-03-10T06:23:50.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.828+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625c017600 con 0x7f626c0ff250 2026-03-10T06:23:50.830 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.829+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f625c00fac0 con 0x7f626c0ff250 2026-03-10T06:23:50.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.830+0000 7f6271257700 1 -- 192.168.123.104:0/1745167597 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6258005320 con 0x7f626c0ff250 2026-03-10T06:23:50.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.831+0000 7f62697fa700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 0x7f6254079830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:50.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.831+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f625c09aa90 con 0x7f626c0ff250 2026-03-10T06:23:50.831 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.831+0000 7f626b7fe700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 0x7f6254079830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:50.832 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.831+0000 7f626b7fe700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 0x7f6254079830 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f626c10df70 tx=0x7f626000b4e0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:23:50.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.834+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f625c063840 con 0x7f626c0ff250 2026-03-10T06:23:50.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.963+0000 7f6271257700 1 -- 192.168.123.104:0/1745167597 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f6258005f70 con 0x7f626c0ff250 2026-03-10T06:23:50.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.964+0000 7f62697fa700 1 -- 192.168.123.104:0/1745167597 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v37) v1 ==== 135+0+0 (secure 0 0 0) 0x7f625c062f90 con 0x7f626c0ff250 2026-03-10T06:23:50.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.967+0000 7f6252ffd700 1 -- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 msgr2=0x7f6254079830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.967+0000 7f6252ffd700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 0x7f6254079830 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f626c10df70 tx=0x7f626000b4e0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.967+0000 7f6252ffd700 1 -- 192.168.123.104:0/1745167597 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 msgr2=0x7f626c111780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:50.968 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.967+0000 7f6252ffd700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c111780 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f625c000c00 tx=0x7f625c00f690 comp rx=0 tx=0).stop 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 -- 192.168.123.104:0/1745167597 shutdown_connections 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0ff250 0x7f626c111780 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6254077380 0x7f6254079830 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 --2- 192.168.123.104:0/1745167597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f626c0ffb60 0x7f626c10c780 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:50.968 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 -- 192.168.123.104:0/1745167597 >> 192.168.123.104:0/1745167597 conn(0x7f626c0faca0 msgr2=0x7f626c0fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:50.969 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 -- 192.168.123.104:0/1745167597 shutdown_connections 2026-03-10T06:23:50.969 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:50.968+0000 7f6252ffd700 1 -- 
192.168.123.104:0/1745167597 wait complete. 2026-03-10T06:23:51.058 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-10T06:23:51.278 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:51.602 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:51 vm04.local ceph-mon[51058]: pgmap v54: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 148 op/s 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.694+0000 7f2858039700 1 -- 192.168.123.104:0/3378454053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285010c8b0 msgr2=0x7f285010cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.694+0000 7f2858039700 1 --2- 192.168.123.104:0/3378454053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285010c8b0 0x7f285010cc80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f284c009a60 tx=0x7f284c009d70 comp rx=0 tx=0).stop 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 -- 192.168.123.104:0/3378454053 shutdown_connections 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 --2- 192.168.123.104:0/3378454053 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2850071e40 0x7f28500722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:51.696 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 --2- 192.168.123.104:0/3378454053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285010c8b0 0x7f285010cc80 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 -- 192.168.123.104:0/3378454053 >> 192.168.123.104:0/3378454053 conn(0x7f285006c6c0 msgr2=0x7f285006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 -- 192.168.123.104:0/3378454053 shutdown_connections 2026-03-10T06:23:51.696 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.695+0000 7f2858039700 1 -- 192.168.123.104:0/3378454053 wait complete. 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 Processor -- start 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 -- start start 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2850071e40 0x7f285007cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2850081a30 con 0x7f2850071e40 2026-03-10T06:23:51.697 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f2858039700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2850081ba0 con 0x7f285007d490 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:58634/0 (socket says 192.168.123.104:58634) 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 -- 192.168.123.104:0/4171389552 learned_addr learned my addr 192.168.123.104:0/4171389552 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 -- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2850071e40 msgr2=0x7f285007cf50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2850071e40 0x7f285007cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.696+0000 7f28555d4700 1 -- 192.168.123.104:0/4171389552 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f284c009710 con 0x7f285007d490 2026-03-10T06:23:51.697 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.697+0000 7f28555d4700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f284800deb0 tx=0x7f284800df90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:51.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.697+0000 7f2846ffd700 1 -- 192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f284800cdf0 con 0x7f285007d490 2026-03-10T06:23:51.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.697+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2850081e80 con 0x7f285007d490 2026-03-10T06:23:51.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.697+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f28500823d0 con 0x7f285007d490 2026-03-10T06:23:51.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.700+0000 7f2846ffd700 1 -- 192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2848012650 con 0x7f285007d490 2026-03-10T06:23:51.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.700+0000 7f2846ffd700 1 -- 192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f284800f7f0 con 0x7f285007d490 2026-03-10T06:23:51.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.700+0000 7f2846ffd700 1 -- 
192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f284800f9d0 con 0x7f285007d490 2026-03-10T06:23:51.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.701+0000 7f2846ffd700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 0x7f283c079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:51.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.702+0000 7f2846ffd700 1 -- 192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f28480a31e0 con 0x7f285007d490 2026-03-10T06:23:51.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.703+0000 7f2855dd5700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 0x7f283c079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:51.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.703+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2834005320 con 0x7f285007d490 2026-03-10T06:23:51.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.704+0000 7f2855dd5700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 0x7f283c079be0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f284c00d7d0 tx=0x7f284c0058e0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:51.714 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.708+0000 7f2846ffd700 1 
-- 192.168.123.104:0/4171389552 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f284806bf90 con 0x7f285007d490 2026-03-10T06:23:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:51 vm06.local ceph-mon[58974]: pgmap v54: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 148 op/s 2026-03-10T06:23:51.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:51.895+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f2834000c90 con 0x7f283c077730 2026-03-10T06:23:52.180 INFO:teuthology.orchestra.run.vm04.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.177+0000 7f2846ffd700 1 -- 192.168.123.104:0/4171389552 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f2834000c90 con 0x7f283c077730 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 msgr2=0x7f283c079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 0x7f283c079be0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f284c00d7d0 tx=0x7f284c0058e0 comp rx=0 tx=0).stop 
2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 msgr2=0x7f285007d900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f284800deb0 tx=0x7f284800df90 comp rx=0 tx=0).stop 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 shutdown_connections 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f283c077730 0x7f283c079be0 secure :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f284c00d7d0 tx=0x7f284c0058e0 comp rx=0 tx=0).stop 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2850071e40 0x7f285007cf50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 --2- 192.168.123.104:0/4171389552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f285007d490 0x7f285007d900 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.180+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 >> 192.168.123.104:0/4171389552 conn(0x7f285006c6c0 
msgr2=0x7f285006fff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.181+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 shutdown_connections 2026-03-10T06:23:52.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:52.181+0000 7f2858039700 1 -- 192.168.123.104:0/4171389552 wait complete. 2026-03-10T06:23:52.757 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T06:23:52.757 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:23:52.757 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-10T06:23:53.154 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:23:53.479 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='client.24507 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: pgmap v55: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB 
avail; 936 KiB/s rd, 949 KiB/s wr, 84 op/s 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:53.480 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:53 vm04.local ceph-mon[51058]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.604+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/405966084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b84071e40 msgr2=0x7f8b840722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.604+0000 7f8b8c7b2700 1 --2- 192.168.123.104:0/405966084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b84071e40 0x7f8b840722b0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f8b7c00cd40 tx=0x7f8b7c00a320 comp rx=0 tx=0).stop 2026-03-10T06:23:53.606 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/405966084 shutdown_connections 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 --2- 192.168.123.104:0/405966084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b84071e40 0x7f8b840722b0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 --2- 192.168.123.104:0/405966084 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/405966084 >> 192.168.123.104:0/405966084 conn(0x7f8b8406c6c0 msgr2=0x7f8b8406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/405966084 shutdown_connections 2026-03-10T06:23:53.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/405966084 wait complete. 
2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.605+0000 7f8b8c7b2700 1 Processor -- start 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8c7b2700 1 -- start start 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8c7b2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8c7b2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b8407d340 0x7f8b8407d7b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8c7b2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b84081980 con 0x7f8b8410c8b0 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8c7b2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b84081af0 con 0x7f8b8407d340 2026-03-10T06:23:53.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b89d4d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b8407d340 0x7f8b8407d7b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58560/0 (socket says 192.168.123.104:58560) 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 -- 192.168.123.104:0/2555216240 learned_addr learned my addr 192.168.123.104:0/2555216240 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 -- 192.168.123.104:0/2555216240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b8407d340 msgr2=0x7f8b8407d7b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b8407d340 0x7f8b8407d7b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.606+0000 7f8b8a54e700 1 -- 192.168.123.104:0/2555216240 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b7c00c9f0 con 0x7f8b8410c8b0 2026-03-10T06:23:53.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.607+0000 7f8b8a54e700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 secure :-1 s=READY pgs=351 cs=0 l=1 rev1=1 crypto rx=0x7f8b8000d8d0 tx=0x7f8b8000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:53.608 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.608+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b80009880 con 0x7f8b8410c8b0 2026-03-10T06:23:53.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.608+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/2555216240 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b84081dd0 con 0x7f8b8410c8b0 2026-03-10T06:23:53.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.608+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/2555216240 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b84082320 con 0x7f8b8410c8b0 2026-03-10T06:23:53.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.609+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8b80010460 con 0x7f8b8410c8b0 2026-03-10T06:23:53.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.609+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b8000f5d0 con 0x7f8b8410c8b0 2026-03-10T06:23:53.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.611+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f8b8000f730 con 0x7f8b8410c8b0 2026-03-10T06:23:53.611 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.611+0000 7f8b7b7fe700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 0x7f8b70079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.612 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.611+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f8b8009a290 con 0x7f8b8410c8b0 2026-03-10T06:23:53.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.612+0000 7f8b89d4d700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 0x7f8b70079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:53.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.612+0000 7f8b89d4d700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 0x7f8b70079b10 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8b7c00cd10 tx=0x7f8b7c00a720 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:53.612 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.612+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/2555216240 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b68005320 con 0x7f8b8410c8b0 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='client.24507 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local 
ceph-mon[58974]: pgmap v55: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 936 KiB/s rd, 949 KiB/s wr, 84 op/s 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:53 vm06.local ceph-mon[58974]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T06:23:53.619 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.616+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f8b80063040 con 0x7f8b8410c8b0 2026-03-10T06:23:53.791 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.788+0000 7f8b8c7b2700 1 -- 192.168.123.104:0/2555216240 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f8b68000bf0 con 0x7f8b70077660 2026-03-10T06:23:53.792 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.791+0000 7f8b7b7fe700 1 -- 192.168.123.104:0/2555216240 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f8b68000bf0 con 0x7f8b70077660 2026-03-10T06:23:53.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 msgr2=0x7f8b70079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 0x7f8b70079b10 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8b7c00cd10 tx=0x7f8b7c00a720 comp rx=0 tx=0).stop 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 msgr2=0x7f8b8407ce00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 secure :-1 s=READY pgs=351 cs=0 l=1 rev1=1 crypto rx=0x7f8b8000d8d0 tx=0x7f8b8000dbe0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 shutdown_connections 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 --2- 192.168.123.104:0/2555216240 >> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f8b70077660 0x7f8b70079b10 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b8410c8b0 0x7f8b8407ce00 unknown :-1 s=CLOSED pgs=351 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 --2- 192.168.123.104:0/2555216240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b8407d340 0x7f8b8407d7b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.795+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 >> 192.168.123.104:0/2555216240 conn(0x7f8b8406c6c0 msgr2=0x7f8b8406ff50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:53.796 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.796+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 shutdown_connections 2026-03-10T06:23:53.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.796+0000 7f8b797fa700 1 -- 192.168.123.104:0/2555216240 wait complete. 
2026-03-10T06:23:53.808 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.884+0000 7f38ef93d700 1 -- 192.168.123.104:0/3597482020 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e810c8b0 msgr2=0x7f38e810cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.884+0000 7f38ef93d700 1 --2- 192.168.123.104:0/3597482020 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e810c8b0 0x7f38e810cc80 secure :-1 s=READY pgs=352 cs=0 l=1 rev1=1 crypto rx=0x7f38e4009b00 tx=0x7f38e4009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 -- 192.168.123.104:0/3597482020 shutdown_connections 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 --2- 192.168.123.104:0/3597482020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8071e40 0x7f38e80722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 --2- 192.168.123.104:0/3597482020 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e810c8b0 0x7f38e810cc80 unknown :-1 s=CLOSED pgs=352 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 -- 192.168.123.104:0/3597482020 >> 192.168.123.104:0/3597482020 conn(0x7f38e806c6c0 msgr2=0x7f38e806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:53.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 -- 192.168.123.104:0/3597482020 shutdown_connections 2026-03-10T06:23:53.886 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 -- 192.168.123.104:0/3597482020 wait complete. 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 Processor -- start 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.885+0000 7f38ef93d700 1 -- start start 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38ef93d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e8071e40 0x7f38e8079a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38ef93d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38ef93d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e807e5f0 con 0x7f38e8071e40 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38ef93d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e807e760 con 0x7f38e8079fb0 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:53.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:42424/0 (socket says 192.168.123.104:42424) 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 -- 192.168.123.104:0/2237985593 learned_addr learned my addr 192.168.123.104:0/2237985593 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38ed6d9700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e8071e40 0x7f38e8079a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 -- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e8071e40 msgr2=0x7f38e8079a70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e8071e40 0x7f38e8079a70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.886+0000 7f38eced8700 1 -- 192.168.123.104:0/2237985593 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38e40097e0 con 0x7f38e8079fb0 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.887+0000 7f38eced8700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto 
rx=0x7f38e000b560 tx=0x7f38e000b590 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:53.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.888+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38e0010430 con 0x7f38e8079fb0 2026-03-10T06:23:53.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.888+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38e807ea40 con 0x7f38e8079fb0 2026-03-10T06:23:53.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.888+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38e807ef90 con 0x7f38e8079fb0 2026-03-10T06:23:53.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.889+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f38e0010a70 con 0x7f38e8079fb0 2026-03-10T06:23:53.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.889+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38e000fc90 con 0x7f38e8079fb0 2026-03-10T06:23:53.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.890+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f38e001c400 con 0x7f38e8079fb0 2026-03-10T06:23:53.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.891+0000 7f38de7fc700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 0x7f38d4079b10 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:53.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.891+0000 7f38ed6d9700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 0x7f38d4079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:53.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.891+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f38e009a760 con 0x7f38e8079fb0 2026-03-10T06:23:53.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.892+0000 7f38ed6d9700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 0x7f38d4079b10 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f38e4009ad0 tx=0x7f38e4017040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:53.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.892+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38cc005320 con 0x7f38e8079fb0 2026-03-10T06:23:53.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:53.896+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f38e0063510 con 0x7f38e8079fb0 2026-03-10T06:23:54.061 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.057+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f38cc000bf0 con 0x7f38d4077660 2026-03-10T06:23:54.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.062+0000 7f38de7fc700 1 -- 192.168.123.104:0/2237985593 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f38cc000bf0 con 0x7f38d4077660 2026-03-10T06:23:54.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.065+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 msgr2=0x7f38d4079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 0x7f38d4079b10 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f38e4009ad0 tx=0x7f38e4017040 comp rx=0 tx=0).stop 2026-03-10T06:23:54.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 msgr2=0x7f38e807a420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f38e000b560 tx=0x7f38e000b590 comp rx=0 tx=0).stop 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 shutdown_connections 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 
7f38ef93d700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f38d4077660 0x7f38d4079b10 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e8071e40 0x7f38e8079a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 --2- 192.168.123.104:0/2237985593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f38e8079fb0 0x7f38e807a420 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.066+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 >> 192.168.123.104:0/2237985593 conn(0x7f38e806c6c0 msgr2=0x7f38e806fdd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.067+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 shutdown_connections 2026-03-10T06:23:54.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.067+0000 7f38ef93d700 1 -- 192.168.123.104:0/2237985593 wait complete. 
2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.162+0000 7f6a204a0700 1 -- 192.168.123.104:0/613017410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 msgr2=0x7f6a1810ee80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.162+0000 7f6a204a0700 1 --2- 192.168.123.104:0/613017410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a1810ee80 secure :-1 s=READY pgs=353 cs=0 l=1 rev1=1 crypto rx=0x7f6a14009b00 tx=0x7f6a14009e10 comp rx=0 tx=0).stop 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 -- 192.168.123.104:0/613017410 shutdown_connections 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 --2- 192.168.123.104:0/613017410 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a18071b60 0x7f6a18071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 --2- 192.168.123.104:0/613017410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a1810ee80 unknown :-1 s=CLOSED pgs=353 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 -- 192.168.123.104:0/613017410 >> 192.168.123.104:0/613017410 conn(0x7f6a1806c6c0 msgr2=0x7f6a1806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 -- 192.168.123.104:0/613017410 shutdown_connections 2026-03-10T06:23:54.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 -- 192.168.123.104:0/613017410 wait 
complete. 2026-03-10T06:23:54.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.163+0000 7f6a204a0700 1 Processor -- start 2026-03-10T06:23:54.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a204a0700 1 -- start start 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a204a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a18071b60 0x7f6a18117750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a204a0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a204a0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a18112e30 con 0x7f6a1810eab0 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a204a0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a18112fa0 con 0x7f6a18071b60 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a1da3b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a1da3b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I 
am v2:192.168.123.104:47936/0 (socket says 192.168.123.104:47936) 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.164+0000 7f6a1da3b700 1 -- 192.168.123.104:0/1800900119 learned_addr learned my addr 192.168.123.104:0/1800900119 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.165+0000 7f6a1da3b700 1 -- 192.168.123.104:0/1800900119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a18071b60 msgr2=0x7f6a18117750 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.165+0000 7f6a1da3b700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a18071b60 0x7f6a18117750 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.165+0000 7f6a1da3b700 1 -- 192.168.123.104:0/1800900119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a140097e0 con 0x7f6a1810eab0 2026-03-10T06:23:54.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.165+0000 7f6a1da3b700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 secure :-1 s=READY pgs=354 cs=0 l=1 rev1=1 crypto rx=0x7f6a1000c390 tx=0x7f6a1000c750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:54.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.166+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a1000e030 con 0x7f6a1810eab0 2026-03-10T06:23:54.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.166+0000 7f6a204a0700 1 -- 
192.168.123.104:0/1800900119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a18113280 con 0x7f6a1810eab0 2026-03-10T06:23:54.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.166+0000 7f6a204a0700 1 -- 192.168.123.104:0/1800900119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a181b7cb0 con 0x7f6a1810eab0 2026-03-10T06:23:54.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.167+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6a1000f040 con 0x7f6a1810eab0 2026-03-10T06:23:54.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.168+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a100146c0 con 0x7f6a1810eab0 2026-03-10T06:23:54.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.168+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f6a10014900 con 0x7f6a1810eab0 2026-03-10T06:23:54.170 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.169+0000 7f6a0f7fe700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6a04077660 0x7f6a04079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.170 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.170+0000 7f6a1e23c700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6a04077660 0x7f6a04079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.171 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.170+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f6a1009a5f0 con 0x7f6a1810eab0 2026-03-10T06:23:54.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.171+0000 7f6a204a0700 1 -- 192.168.123.104:0/1800900119 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f69fc005320 con 0x7f6a1810eab0 2026-03-10T06:23:54.175 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.174+0000 7f6a1e23c700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6a04077660 0x7f6a04079b10 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6a14009ad0 tx=0x7f6a14000bc0 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:54.175 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.175+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f6a100633a0 con 0x7f6a1810eab0 2026-03-10T06:23:54.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.331+0000 7f6a204a0700 1 -- 192.168.123.104:0/1800900119 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f69fc000bf0 con 0x7f6a04077660 2026-03-10T06:23:54.332 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:54 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM 
VERSION IMAGE ID CONTAINER ID 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (42s) 21s ago 6m 15.5M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (6m) 21s ago 6m 8426k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (5m) 69s ago 5m 8648k - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (6m) 21s ago 6m 7402k - 18.2.0 dc2bc1663786 35fbdbd85c40 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (5m) 69s ago 5m 7411k - 18.2.0 dc2bc1663786 a60199b09d41 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (24s) 21s ago 6m 40.4M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (4m) 21s ago 4m 267M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (4m) 21s ago 4m 16.6M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (4m) 69s ago 4m 16.5M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (4m) 69s ago 4m 263M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (113s) 21s ago 7m 612M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (90s) 69s ago 5m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:23:54.345 
INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (7m) 21s ago 7m 57.4M 2048M 18.2.0 dc2bc1663786 089bb557f95b 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (5m) 69s ago 5m 42.8M 2048M 18.2.0 dc2bc1663786 826078cd5cc7 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (76s) 21s ago 6m 9198k - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (71s) 69s ago 5m 5368k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (5m) 21s ago 5m 297M 4096M 18.2.0 dc2bc1663786 23249edb3d75 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (5m) 21s ago 5m 296M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (4m) 21s ago 4m 236M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (4m) 69s ago 4m 320M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (4m) 69s ago 4m 263M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (4m) 69s ago 4m 313M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (52s) 21s ago 5m 46.9M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:23:54.345 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.342+0000 7f6a0f7fe700 1 -- 192.168.123.104:0/1800900119 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f69fc000bf0 con 0x7f6a04077660 2026-03-10T06:23:54.345 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.345+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6a04077660 msgr2=0x7f6a04079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.345+0000 7f6a0d7fa700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f6a04077660 0x7f6a04079b10 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6a14009ad0 tx=0x7f6a14000bc0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.345+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 msgr2=0x7f6a18112750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.345+0000 7f6a0d7fa700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 secure :-1 s=READY pgs=354 cs=0 l=1 rev1=1 crypto rx=0x7f6a1000c390 tx=0x7f6a1000c750 comp rx=0 tx=0).stop 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 shutdown_connections 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a18071b60 0x7f6a18117750 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] 
conn(0x7f6a04077660 0x7f6a04079b10 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 --2- 192.168.123.104:0/1800900119 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a1810eab0 0x7f6a18112750 unknown :-1 s=CLOSED pgs=354 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.346 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 >> 192.168.123.104:0/1800900119 conn(0x7f6a1806c6c0 msgr2=0x7f6a18118230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.347 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 shutdown_connections 2026-03-10T06:23:54.347 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.346+0000 7f6a0d7fa700 1 -- 192.168.123.104:0/1800900119 wait complete. 
2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 -- 192.168.123.104:0/1664522650 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc10c8b0 msgr2=0x7fe8dc10cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 --2- 192.168.123.104:0/1664522650 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc10c8b0 0x7fe8dc10cc80 secure :-1 s=READY pgs=355 cs=0 l=1 rev1=1 crypto rx=0x7fe8cc007780 tx=0x7fe8cc00c050 comp rx=0 tx=0).stop 2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 -- 192.168.123.104:0/1664522650 shutdown_connections 2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 --2- 192.168.123.104:0/1664522650 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe8dc071e40 0x7fe8dc0722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 --2- 192.168.123.104:0/1664522650 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc10c8b0 0x7fe8dc10cc80 unknown :-1 s=CLOSED pgs=355 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.439 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.438+0000 7fe8e14f5700 1 -- 192.168.123.104:0/1664522650 >> 192.168.123.104:0/1664522650 conn(0x7fe8dc06c6c0 msgr2=0x7fe8dc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 -- 192.168.123.104:0/1664522650 shutdown_connections 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 -- 192.168.123.104:0/1664522650 
wait complete. 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 Processor -- start 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 -- start start 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe8dc071e40 0x7fe8dc132890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8dc07eef0 con 0x7fe8dc132dd0 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8e14f5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8dc07f060 con 0x7fe8dc071e40 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.439+0000 7fe8da7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:47964/0 (socket says 192.168.123.104:47964) 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 -- 192.168.123.104:0/4219683549 learned_addr learned my addr 192.168.123.104:0/4219683549 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 -- 192.168.123.104:0/4219683549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe8dc071e40 msgr2=0x7fe8dc132890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe8dc071e40 0x7fe8dc132890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 -- 192.168.123.104:0/4219683549 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8cc007430 con 0x7fe8dc132dd0 2026-03-10T06:23:54.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8da7fc700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 secure :-1 s=READY pgs=356 cs=0 l=1 rev1=1 crypto rx=0x7fe8d400bf40 tx=0x7fe8d400bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:54.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8d400cbc0 con 0x7fe8dc132dd0 2026-03-10T06:23:54.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 
7fe8e14f5700 1 -- 192.168.123.104:0/4219683549 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe8dc07f2e0 con 0x7fe8dc132dd0 2026-03-10T06:23:54.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.440+0000 7fe8e14f5700 1 -- 192.168.123.104:0/4219683549 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe8dc07f830 con 0x7fe8dc132dd0 2026-03-10T06:23:54.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.441+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe8d400cd20 con 0x7fe8dc132dd0 2026-03-10T06:23:54.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.441+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8d40078c0 con 0x7fe8dc132dd0 2026-03-10T06:23:54.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.442+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fe8d4007a20 con 0x7fe8dc132dd0 2026-03-10T06:23:54.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.443+0000 7fe8c3fff700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe8c4077660 0x7fe8c4079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.444 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.443+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fe8d4099e70 con 0x7fe8dc132dd0 2026-03-10T06:23:54.444 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.443+0000 7fe8daffd700 1 --2- 
192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe8c4077660 0x7fe8c4079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.444 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.444+0000 7fe8e14f5700 1 -- 192.168.123.104:0/4219683549 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe8c8005320 con 0x7fe8dc132dd0 2026-03-10T06:23:54.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.446+0000 7fe8daffd700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe8c4077660 0x7fe8c4079b10 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe8cc00c4d0 tx=0x7fe8cc00af60 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:54.447 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.447+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fe8d4062c20 con 0x7fe8dc132dd0 2026-03-10T06:23:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:54 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.805+0000 7fe8e14f5700 1 -- 192.168.123.104:0/4219683549 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe8c8006200 con 0x7fe8dc132dd0 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.805+0000 7fe8c3fff700 1 -- 192.168.123.104:0/4219683549 <== 
mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fe8d4019070 con 0x7fe8dc132dd0 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:23:54.806 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:23:54.807 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:23:54.810 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe8c4077660 msgr2=0x7fe8c4079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe8c4077660 0x7fe8c4079b10 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe8cc00c4d0 tx=0x7fe8cc00af60 comp rx=0 tx=0).stop 2026-03-10T06:23:54.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 msgr2=0x7fe8dc133240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 secure :-1 s=READY pgs=356 cs=0 l=1 rev1=1 crypto rx=0x7fe8d400bf40 tx=0x7fe8d400bf70 comp rx=0 tx=0).stop 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 shutdown_connections 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe8dc071e40 0x7fe8dc132890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] 
conn(0x7fe8c4077660 0x7fe8c4079b10 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 --2- 192.168.123.104:0/4219683549 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8dc132dd0 0x7fe8dc133240 unknown :-1 s=CLOSED pgs=356 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 >> 192.168.123.104:0/4219683549 conn(0x7fe8dc06c6c0 msgr2=0x7fe8dc06ff80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 shutdown_connections 2026-03-10T06:23:54.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.809+0000 7fe8c1ffb700 1 -- 192.168.123.104:0/4219683549 wait complete. 
2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/1098958880 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e8071e40 msgr2=0x7fe9e80722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/1098958880 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e8071e40 0x7fe9e80722b0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0009230 tx=0x7fe9e0009260 comp rx=0 tx=0).stop 2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/1098958880 shutdown_connections 2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/1098958880 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e8071e40 0x7fe9e80722b0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/1098958880 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 0x7fe9e810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/1098958880 >> 192.168.123.104:0/1098958880 conn(0x7fe9e806c6c0 msgr2=0x7fe9e806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/1098958880 shutdown_connections 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.915+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/1098958880 
wait complete. 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 Processor -- start 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 -- start start 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 0x7fe9e807cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 0x7fe9e807d900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9e8081b60 con 0x7fe9e810c8b0 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9ee8ac700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9e8081cd0 con 0x7fe9e807d490 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 0x7fe9e807d900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 0x7fe9e807d900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.104:42488/0 (socket says 192.168.123.104:42488) 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e77fe700 1 -- 192.168.123.104:0/2818047456 learned_addr learned my addr 192.168.123.104:0/2818047456 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:54.917 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e7fff700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 0x7fe9e807cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e77fe700 1 -- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 msgr2=0x7fe9e807cf50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:54.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.916+0000 7fe9e77fe700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 0x7fe9e807cf50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:54.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.917+0000 7fe9e77fe700 1 -- 192.168.123.104:0/2818047456 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9e0008ee0 con 0x7fe9e807d490 2026-03-10T06:23:54.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.917+0000 7fe9e77fe700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 0x7fe9e807d900 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0004740 tx=0x7fe9e0004770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T06:23:54.919 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.917+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e001d070 con 0x7fe9e807d490 2026-03-10T06:23:54.919 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.917+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe9e000ead0 con 0x7fe9e807d490 2026-03-10T06:23:54.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.918+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e0016a70 con 0x7fe9e807d490 2026-03-10T06:23:54.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.918+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9e8081ef0 con 0x7fe9e807d490 2026-03-10T06:23:54.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.918+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9e8082360 con 0x7fe9e807d490 2026-03-10T06:23:54.920 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.919+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fe9e000ec40 con 0x7fe9e807d490 2026-03-10T06:23:54.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.920+0000 7fe9e57fa700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 0x7fe9d0079830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:54.928 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.920+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fe9e0012070 con 0x7fe9e807d490 2026-03-10T06:23:54.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.919+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe9e8075730 con 0x7fe9e807d490 2026-03-10T06:23:54.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.927+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fe9e0063900 con 0x7fe9e807d490 2026-03-10T06:23:54.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.930+0000 7fe9e7fff700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 0x7fe9d0079830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:54.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:54.930+0000 7fe9e7fff700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 0x7fe9d0079830 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fe9d800b3c0 tx=0x7fe9d800d040 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:55.206 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.205+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe9e804f2a0 con 0x7fe9e807d490 2026-03-10T06:23:55.206 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.206+0000 7fe9e57fa700 1 -- 192.168.123.104:0/2818047456 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1853 (secure 0 0 0) 0x7fe9e0026750 con 0x7fe9e807d490 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:e11 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:23:55.210 
INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508} 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat 
{c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:23:55.210 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.209+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 msgr2=0x7fe9d0079830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.209+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 0x7fe9d0079830 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fe9d800b3c0 tx=0x7fe9d800d040 comp rx=0 tx=0).stop 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.209+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 msgr2=0x7fe9e807d900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.211 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.209+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 
0x7fe9e807d900 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0004740 tx=0x7fe9e0004770 comp rx=0 tx=0).stop 2026-03-10T06:23:55.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.211+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 shutdown_connections 2026-03-10T06:23:55.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.211+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7fe9d0077380 0x7fe9d0079830 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.211+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9e810c8b0 0x7fe9e807cf50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.211+0000 7fe9ee8ac700 1 --2- 192.168.123.104:0/2818047456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d490 0x7fe9e807d900 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.212 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.211+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 >> 192.168.123.104:0/2818047456 conn(0x7fe9e806c6c0 msgr2=0x7fe9e80708c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:55.215 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.214+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 shutdown_connections 2026-03-10T06:23:55.215 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.214+0000 7fe9ee8ac700 1 -- 192.168.123.104:0/2818047456 wait complete. 
2026-03-10T06:23:55.219 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 -- 192.168.123.104:0/3853982529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0438071e40 msgr2=0x7f04380722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 --2- 192.168.123.104:0/3853982529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0438071e40 0x7f04380722b0 secure :-1 s=READY pgs=357 cs=0 l=1 rev1=1 crypto rx=0x7f043000b600 tx=0x7f043000b910 comp rx=0 tx=0).stop 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 -- 192.168.123.104:0/3853982529 shutdown_connections 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 --2- 192.168.123.104:0/3853982529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0438071e40 0x7f04380722b0 unknown :-1 s=CLOSED pgs=357 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 --2- 192.168.123.104:0/3853982529 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f043810c8b0 0x7f043810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 -- 192.168.123.104:0/3853982529 >> 192.168.123.104:0/3853982529 conn(0x7f043806c6c0 msgr2=0x7f043806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:55.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 -- 192.168.123.104:0/3853982529 shutdown_connections 2026-03-10T06:23:55.358 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.354+0000 7f043f4c2700 1 -- 192.168.123.104:0/3853982529 wait complete. 2026-03-10T06:23:55.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 Processor -- start 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 -- start start 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f043810c8b0 0x7f043807d1c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f043807dc00 con 0x7f043807d700 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043f4c2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f043807dd70 con 0x7f043810c8b0 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:47986/0 (socket says 192.168.123.104:47986) 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 -- 192.168.123.104:0/627619336 learned_addr learned my addr 192.168.123.104:0/627619336 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 -- 192.168.123.104:0/627619336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f043810c8b0 msgr2=0x7f043807d1c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f043810c8b0 0x7f043807d1c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.355+0000 7f043ca5d700 1 -- 192.168.123.104:0/627619336 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f043000b050 con 0x7f043807d700 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.356+0000 7f043ca5d700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 secure :-1 s=READY pgs=358 cs=0 l=1 rev1=1 crypto rx=0x7f0430003c40 tx=0x7f0430003c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:55.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.356+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f043000e030 con 0x7f043807d700 
2026-03-10T06:23:55.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.356+0000 7f043f4c2700 1 -- 192.168.123.104:0/627619336 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f04380820b0 con 0x7f043807d700 2026-03-10T06:23:55.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.356+0000 7f043f4c2700 1 -- 192.168.123.104:0/627619336 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0438082600 con 0x7f043807d700 2026-03-10T06:23:55.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.361+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0430003ec0 con 0x7f043807d700 2026-03-10T06:23:55.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.361+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f043001cd80 con 0x7f043807d700 2026-03-10T06:23:55.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.362+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f0430012430 con 0x7f043807d700 2026-03-10T06:23:55.363 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.363+0000 7f042e7fc700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 0x7f042407be00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.364 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.363+0000 7f043d25e700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 0x7f042407be00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:55.364 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.363+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f043009bdd0 con 0x7f043807d700 2026-03-10T06:23:55.370 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.364+0000 7f043d25e700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 0x7f042407be00 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f0434009710 tx=0x7f0434006c60 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:55.370 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.364+0000 7f043f4c2700 1 -- 192.168.123.104:0/627619336 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f041c005320 con 0x7f043807d700 2026-03-10T06:23:55.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.374+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f0430064b80 con 0x7f043807d700 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='client.14744 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: pgmap v56: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 936 KiB/s rd, 949 KiB/s wr, 84 op/s 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='client.24515 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='client.14752 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='client.? 192.168.123.104:0/4219683549' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T06:23:55.473 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:55 vm04.local ceph-mon[51058]: from='client.? 
192.168.123.104:0/2818047456' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:23:55.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.590+0000 7f043f4c2700 1 -- 192.168.123.104:0/627619336 --> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f041c000bf0 con 0x7f0424079950 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "mgr" 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "2/23 daemons upgraded", 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading mon daemons", 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:23:55.597 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.593+0000 7f042e7fc700 1 -- 192.168.123.104:0/627619336 <== mgr.14632 v2:192.168.123.104:6800/1695210057 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f041c000bf0 con 0x7f0424079950 2026-03-10T06:23:55.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.597+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 >> 
[v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 msgr2=0x7f042407be00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.597+0000 7f0423fff700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 0x7f042407be00 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f0434009710 tx=0x7f0434006c60 comp rx=0 tx=0).stop 2026-03-10T06:23:55.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.597+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 msgr2=0x7f0438081b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.597+0000 7f0423fff700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 secure :-1 s=READY pgs=358 cs=0 l=1 rev1=1 crypto rx=0x7f0430003c40 tx=0x7f0430003c70 comp rx=0 tx=0).stop 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.598+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 shutdown_connections 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.598+0000 7f0423fff700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f043810c8b0 0x7f043807d1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.598+0000 7f0423fff700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7f0424079950 0x7f042407be00 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.598+0000 7f0423fff700 1 --2- 192.168.123.104:0/627619336 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f043807d700 0x7f0438081b70 unknown :-1 s=CLOSED pgs=358 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.598+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 >> 192.168.123.104:0/627619336 conn(0x7f043806c6c0 msgr2=0x7f0438070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.599+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 shutdown_connections 2026-03-10T06:23:55.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.599+0000 7f0423fff700 1 -- 192.168.123.104:0/627619336 wait complete. 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='client.14744 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: pgmap v56: 65 pgs: 65 active+clean; 405 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 936 KiB/s rd, 949 KiB/s wr, 84 op/s 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='client.24515 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='client.14752 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='client.? 
192.168.123.104:0/4219683549' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='mgr.vm04.exdvdb' 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='mgr.14632 192.168.123.104:0/2203698609' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T06:23:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:55 vm06.local ceph-mon[58974]: from='client.? 
192.168.123.104:0/2818047456' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- 192.168.123.104:0/1937010750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf4071e40 msgr2=0x7feaf40722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 --2- 192.168.123.104:0/1937010750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf4071e40 0x7feaf40722b0 secure :-1 s=READY pgs=359 cs=0 l=1 rev1=1 crypto rx=0x7feaec009230 tx=0x7feaec009260 comp rx=0 tx=0).stop 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- 192.168.123.104:0/1937010750 shutdown_connections 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 --2- 192.168.123.104:0/1937010750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf4071e40 0x7feaf40722b0 unknown :-1 s=CLOSED pgs=359 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 --2- 192.168.123.104:0/1937010750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- 192.168.123.104:0/1937010750 >> 192.168.123.104:0/1937010750 conn(0x7feaf406c6c0 msgr2=0x7feaf406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- 192.168.123.104:0/1937010750 shutdown_connections 2026-03-10T06:23:55.751 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- 192.168.123.104:0/1937010750 wait complete. 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 Processor -- start 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- start start 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf407ced0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 0x7feaf407d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaf4081a50 con 0x7feaf407d410 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.750+0000 7feaf9579700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaf4081bc0 con 0x7feaf410c8b0 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf27fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 0x7feaf407d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf27fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 0x7feaf407d880 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:48008/0 (socket says 192.168.123.104:48008) 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf27fc700 1 -- 192.168.123.104:0/629083297 learned_addr learned my addr 192.168.123.104:0/629083297 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:23:55.751 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf2ffd700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf407ced0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:55.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf2ffd700 1 -- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 msgr2=0x7feaf407d880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:55.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf2ffd700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 0x7feaf407d880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:55.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf2ffd700 1 -- 192.168.123.104:0/629083297 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feaec008ee0 con 0x7feaf410c8b0 2026-03-10T06:23:55.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.751+0000 7feaf2ffd700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf407ced0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto 
rx=0x7feae400eab0 tx=0x7feae400ee70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:55.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.752+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feae400cbe0 con 0x7feaf410c8b0 2026-03-10T06:23:55.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.752+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feaf4081ea0 con 0x7feaf410c8b0 2026-03-10T06:23:55.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.752+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feaf40823f0 con 0x7feaf410c8b0 2026-03-10T06:23:55.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.752+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feae400cd40 con 0x7feaf410c8b0 2026-03-10T06:23:55.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.752+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feae4018860 con 0x7feaf410c8b0 2026-03-10T06:23:55.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.753+0000 7fead9ffb700 1 -- 192.168.123.104:0/629083297 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feae0005320 con 0x7feaf410c8b0 2026-03-10T06:23:55.754 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.754+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 33) v1 ==== 99964+0+0 (secure 0 0 0) 0x7feae4018af0 
con 0x7feaf410c8b0 2026-03-10T06:23:55.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.754+0000 7feadbfff700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 0x7feadc079a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:23:55.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.754+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7feae4014070 con 0x7feaf410c8b0 2026-03-10T06:23:55.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.755+0000 7feaf27fc700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 0x7feadc079a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:23:55.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.756+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7feae4062e00 con 0x7feaf410c8b0 2026-03-10T06:23:55.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:55.756+0000 7feaf27fc700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 0x7feadc079a40 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7feaec00efd0 tx=0x7feaec00eb70 comp rx=0 tx=0).ready entity=mgr.14632 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:23:56.032 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.031+0000 7fead9ffb700 1 -- 192.168.123.104:0/629083297 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": 
"health", "detail": "detail"} v 0) v1 -- 0x7feae0005190 con 0x7feaf410c8b0 2026-03-10T06:23:56.032 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.032+0000 7feadbfff700 1 -- 192.168.123.104:0/629083297 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7feae4062550 con 0x7feaf410c8b0 2026-03-10T06:23:56.032 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:56.032 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:56.032 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 msgr2=0x7feadc079a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 0x7feadc079a40 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7feaec00efd0 tx=0x7feaec00eb70 comp rx=0 tx=0).stop 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 msgr2=0x7feaf407ced0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf407ced0 secure 
:-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7feae400eab0 tx=0x7feae400ee70 comp rx=0 tx=0).stop 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 shutdown_connections 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.034+0000 7feaf9579700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaf410c8b0 0x7feaf407ced0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.035+0000 7feaf9579700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057] conn(0x7feadc077590 0x7feadc079a40 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.035+0000 7feaf9579700 1 --2- 192.168.123.104:0/629083297 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaf407d410 0x7feaf407d880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.035+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 >> 192.168.123.104:0/629083297 conn(0x7feaf406c6c0 msgr2=0x7feaf4070900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.035+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 shutdown_connections 2026-03-10T06:23:56.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:23:56.035+0000 7feaf9579700 1 -- 192.168.123.104:0/629083297 wait complete. 2026-03-10T06:23:56.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local systemd[1]: Stopping Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[51054]: 2026-03-10T06:23:56.443+0000 7f1a54327700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm04 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[51054]: 2026-03-10T06:23:56.443+0000 7f1a54327700 -1 mon.vm04@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local podman[115619]: 2026-03-10 06:23:56.50859058 +0000 UTC m=+0.168379756 container died 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, RELEASE=HEAD) 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local podman[115619]: 2026-03-10 06:23:56.551628679 +0000 UTC m=+0.211417855 container remove 089bb557f95b9394c87ff557894f535f578da6e4e05adc52d8e40c294b0d47b2 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_CLEAN=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, RELEASE=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2) 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local bash[115619]: ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service: Deactivated successfully. 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local systemd[1]: Stopped Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:23:56.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service: Consumed 7.701s CPU time. 2026-03-10T06:23:57.033 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local systemd[1]: Starting Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:23:57.033 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:56 vm04.local podman[115728]: 2026-03-10 06:23:56.995758612 +0000 UTC m=+0.027310959 container create cf1d928233780ae27b9373eabc9040ea31f4c252053091416da36ef4ec1f5362 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:23:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local podman[115728]: 2026-03-10 06:23:57.044318679 +0000 UTC m=+0.075871016 container init cf1d928233780ae27b9373eabc9040ea31f4c252053091416da36ef4ec1f5362 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default) 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local podman[115728]: 2026-03-10 06:23:57.048002605 +0000 UTC m=+0.079554952 container start cf1d928233780ae27b9373eabc9040ea31f4c252053091416da36ef4ec1f5362 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local bash[115728]: cf1d928233780ae27b9373eabc9040ea31f4c252053091416da36ef4ec1f5362 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local podman[115728]: 2026-03-10 06:23:56.981165092 +0000 UTC m=+0.012717439 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local systemd[1]: Started Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d. 
2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: pidfile_write: ignore empty --pid-file 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: load: jerasure load: lrc 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: RocksDB version: 7.9.2 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Git sha 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: DB SUMMARY 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: DB Session ID: F6FY7WIB0AAV117Y3WN7 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: CURRENT file: CURRENT 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: MANIFEST file: MANIFEST-000015 size: 1028 Bytes 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm04/store.db 
dir, Total Num: 1, files: 000026.sst 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm04/store.db: 000024.log size: 632816 ; 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.error_if_exists: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.create_if_missing: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.paranoid_checks: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.env: 0x562ea369edc0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.info_log: 0x562ea56f3900 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.statistics: (nil) 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.use_fsync: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_log_file_size: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_fallocate: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.use_direct_reads: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.db_log_dir: 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local 
ceph-mon[115743]: rocksdb: Options.wal_dir: 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.write_buffer_manager: 0x562ea56f7900 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T06:23:57.429 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T06:23:57.429 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.unordered_write: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.row_cache: None 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.wal_filter: None 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.two_write_queues: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.wal_compression: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.atomic_flush: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.log_readahead_size: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_background_jobs: 2 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_background_compactions: -1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_subcompactions: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local 
ceph-mon[115743]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_open_files: -1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_background_flushes: -1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Compression algorithms supported: 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kZSTD supported: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kXpressCompression supported: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kBZip2Compression supported: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local 
ceph-mon[115743]: rocksdb: kLZ4Compression supported: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kZlibCompression supported: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: kSnappyCompression supported: 1 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000015 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.merge_operator: 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_filter: None 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T06:23:57.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562ea56f3580) 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_top_level_index_and_filter: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_type: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_index_type: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_shortening: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: checksum: 4 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: no_block_cache: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache: 0x562ea57169b0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_name: BinnedLRUCache 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_options: 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: capacity : 536870912 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: num_shard_bits : 4 
2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: strict_capacity_limit : 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: high_pri_pool_ratio: 0.000 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_compressed: (nil) 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: persistent_cache: (nil) 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size: 4096 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size_deviation: 10 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_restart_interval: 16 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_block_restart_interval: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: metadata_block_size: 4096 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: partition_filters: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: use_delta_encoding: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: filter_policy: bloomfilter 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: whole_key_filtering: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: verify_compression: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: read_amp_bytes_per_bit: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: format_version: 5 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_index_compression: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_align: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_auto_readahead_size: 262144 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: prepopulate_block_cache: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout: initial_auto_readahead_size: 8192 2026-03-10T06:23:57.431 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression: NoCompression 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.num_levels: 7 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T06:23:57.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local 
ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.inplace_update_support: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.bloom_locality: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.max_successive_merges: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 
vm04.local ceph-mon[115743]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.ttl: 2592000 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enable_blob_files: false 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.min_blob_size: 0 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-10T06:23:57.432 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 28, last_sequence is 9846, log_number is 24,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 24 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 24 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 408df411-08e0-4cc2-8e50-beb38e7e2a77 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123837096132, "job": 1, "event": "recovery_started", "wal_files": [24]} 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #24 mode 2 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:23:57 vm04.local ceph-mon[115743]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123837103847, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 29, "file_size": 606515, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9847, "largest_seqno": 10383, "table_properties": {"data_size": 603485, "index_size": 1221, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 7672, "raw_average_key_size": 26, "raw_value_size": 597561, "raw_average_value_size": 2089, "num_data_blocks": 53, "num_entries": 286, "num_filter_entries": 286, "num_deletions": 6, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773123837, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "408df411-08e0-4cc2-8e50-beb38e7e2a77", "db_session_id": "F6FY7WIB0AAV117Y3WN7", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123837103985, "job": 1, "event": "recovery_finished"} 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:5047] Creating manifest 31 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm04/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562ea5718e00 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: DB pointer 0x562ea5728000 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** DB Stats ** 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: L0 1/0 592.30 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 88.6 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: L6 1/0 7.92 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Sum 2/0 8.50 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 88.6 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 88.6 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 88.6 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Flush(GB): cumulative 0.001, interval 0.001 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative compaction: 0.00 GB write, 44.52 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval compaction: 0.00 GB write, 44.52 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T06:23:57.433 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache BinnedLRUCache@0x562ea57169b0#2 capacity: 512.00 MB usage: 2.23 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6e-06 secs_since: 0 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.91 KB,0.000172853%) IndexBlock(1,1.33 KB,0.00025332%) Misc(1,0.00 KB,0%) 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: starting mon.vm04 rank 0 at public addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] at bind addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon_data /var/lib/ceph/mon/ceph-vm04 fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???) 
e2 preinit fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).mds e11 new map 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).mds e11 print_map 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: e11 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:23:57.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: legacy client fscid: 1 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Filesystem 'cephfs' (1) 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: fs_name cephfs 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: epoch 9 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: tableserver 0 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: root 0 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: session_timeout 60 2026-03-10T06:23:57.434 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: session_autoclose 300 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_file_size 1099511627776 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_xattr_size 65536 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: required_client_features {} 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: last_failure 0 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: last_failure_osd_epoch 0 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_mds 1 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: in 0 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: up {0=14508} 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: failed 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: damaged 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: stopped 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_pools [3] 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: metadata_pool 2 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: inline_data enabled 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: balancer 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: bal_rank_mask -1 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: standby_count_wanted 1 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: qdb_cluster leader: 0 members: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Standby daemons: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).osd e42 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T06:23:57.434 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T06:23:57.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).osd e42 crush map has features 288514051259236352, adjusting 
msgr requires 2026-03-10T06:23:57.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-10T06:23:57.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).mgr e0 loading version 33 2026-03-10T06:23:57.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).mgr e33 active server: [v2:192.168.123.104:6800/1695210057,v1:192.168.123.104:6801/1695210057](14632) 2026-03-10T06:23:57.435 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:57 vm04.local ceph-mon[115743]: mon.vm04@-1(???).mgr e33 mkfs or daemon transitioned to available, loading commands 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: pgmap v57: 65 pgs: 65 active+clean; 293 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 123 op/s 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: mon.vm04 calling monitor election 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: monmap epoch 2 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: last_changed 2026-03-10T06:18:16.127480+0000 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: created 2026-03-10T06:16:42.736031+0000 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 
vm04.local ceph-mon[115743]: min_mon_release 18 (reef) 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: election_strategy: 1 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: 0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T06:23:58.790 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: mgrmap e33: vm04.exdvdb(active, since 110s), standbys: vm06.wwotdr 2026-03-10T06:23:58.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:58.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:58.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:23:58.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: from='mgr.14632 ' entity='' 2026-03-10T06:23:58.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:23:58 vm04.local ceph-mon[115743]: mgrmap e34: vm04.exdvdb(active, since 110s), standbys: vm06.wwotdr 2026-03-10T06:23:59.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: pgmap v57: 65 pgs: 65 active+clean; 293 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 123 op/s 2026-03-10T06:23:59.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: mon.vm04 calling monitor election 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: monmap epoch 2 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: last_changed 2026-03-10T06:18:16.127480+0000 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: created 2026-03-10T06:16:42.736031+0000 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: min_mon_release 18 (reef) 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: election_strategy: 1 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: 0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local 
ceph-mon[58974]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: mgrmap e33: vm04.exdvdb(active, since 110s), standbys: vm06.wwotdr 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: from='mgr.14632 ' entity='' 2026-03-10T06:23:59.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:23:58 vm06.local ceph-mon[58974]: mgrmap e34: vm04.exdvdb(active, since 110s), standbys: vm06.wwotdr 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: Standby manager daemon vm06.wwotdr restarted 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: from='mgr.? 
192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:24:03.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:03 vm04.local ceph-mon[115743]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:24:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: Standby manager daemon vm06.wwotdr restarted 2026-03-10T06:24:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: Standby manager daemon vm06.wwotdr started 2026-03-10T06:24:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/crt"}]: dispatch 2026-03-10T06:24:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T06:24:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: from='mgr.? 
192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.wwotdr/key"}]: dispatch 2026-03-10T06:24:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:03 vm06.local ceph-mon[58974]: from='mgr.? 192.168.123.106:0/3108226597' entity='mgr.vm06.wwotdr' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T06:24:04.744 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: mgrmap e35: vm04.exdvdb(active, since 115s), standbys: vm06.wwotdr 2026-03-10T06:24:04.744 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:24:04.744 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: Activating manager daemon vm04.exdvdb 2026-03-10T06:24:04.744 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: mgrmap e36: vm04.exdvdb(active, starting, since 0.019164s), standbys: vm06.wwotdr 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:24:04.745 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:24:04.745 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": 
"json"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:24:04.745 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:04 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:24:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: mgrmap e35: vm04.exdvdb(active, since 115s), standbys: vm06.wwotdr 2026-03-10T06:24:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: Active manager daemon vm04.exdvdb restarted 2026-03-10T06:24:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: Activating manager daemon vm04.exdvdb 2026-03-10T06:24:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T06:24:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: mgrmap e36: vm04.exdvdb(active, starting, since 0.019164s), standbys: vm06.wwotdr 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local 
ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm04.exdvdb", "id": "vm04.exdvdb"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr metadata", "who": "vm06.wwotdr", "id": "vm06.wwotdr"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 
vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: Manager daemon vm04.exdvdb is now available 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:04.928 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T06:24:04.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.exdvdb/trash_purge_schedule"}]: dispatch 2026-03-10T06:24:06.263 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:06 vm04.local ceph-mon[115743]: mgrmap e37: vm04.exdvdb(active, since 1.3204s), standbys: vm06.wwotdr 2026-03-10T06:24:06.263 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:06 vm04.local ceph-mon[115743]: pgmap v3: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T06:24:06.263 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:06 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:06.265 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:06 vm06.local ceph-mon[58974]: mgrmap e37: vm04.exdvdb(active, since 1.3204s), standbys: vm06.wwotdr 2026-03-10T06:24:06.265 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:06 vm06.local ceph-mon[58974]: pgmap v3: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T06:24:06.265 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:06 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:24:07 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: mgrmap e38: vm04.exdvdb(active, since 2s), standbys: vm06.wwotdr 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: [10/Mar/2026:06:24:06] ENGINE Bus STARTING 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: [10/Mar/2026:06:24:06] ENGINE Serving on http://192.168.123.104:8765 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: [10/Mar/2026:06:24:06] ENGINE Serving on https://192.168.123.104:7150 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: [10/Mar/2026:06:24:06] ENGINE Bus STARTED 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: [10/Mar/2026:06:24:06] ENGINE Client ('192.168.123.104', 59892) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:07 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.681 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: mgrmap e38: vm04.exdvdb(active, since 2s), standbys: vm06.wwotdr 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: [10/Mar/2026:06:24:06] ENGINE Bus STARTING 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: [10/Mar/2026:06:24:06] ENGINE Serving on http://192.168.123.104:8765 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: [10/Mar/2026:06:24:06] ENGINE Serving on https://192.168.123.104:7150 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: [10/Mar/2026:06:24:06] ENGINE Bus STARTED 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: [10/Mar/2026:06:24:06] ENGINE Client ('192.168.123.104', 59892) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T06:24:07.681 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.682 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.682 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:07.682 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:07 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: pgmap v5: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: Detected new or changed devices on vm06 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:08 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:24:09.179 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: pgmap v5: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T06:24:09.181 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: Detected new or changed devices on vm06 2026-03-10T06:24:09.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:09.181 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: mgrmap e39: vm04.exdvdb(active, since 4s), standbys: vm06.wwotdr 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Detected new or changed devices on vm04 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:24:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:10.117 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:24:10.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:09 vm06.local ceph-mon[58974]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring 2026-03-10T06:24:10.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: mgrmap e39: vm04.exdvdb(active, since 4s), standbys: vm06.wwotdr 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Detected new or changed devices on vm04 2026-03-10T06:24:10.178 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm06:/etc/ceph/ceph.conf 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.conf 2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-10T06:24:10.178 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:24:10.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:09 vm04.local ceph-mon[115743]: Updating vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/config/ceph.client.admin.keyring
2026-03-10T06:24:11.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: pgmap v6: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: Upgrade: Updating mon.vm06
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:24:11.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:10 vm04.local ceph-mon[115743]: Deploying daemon mon.vm06 on vm06
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: pgmap v6: 65 pgs: 65 active+clean; 294 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: Upgrade: Updating mon.vm06
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:10 vm06.local ceph-mon[58974]: Deploying daemon mon.vm06 on vm06
2026-03-10T06:24:11.233 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local systemd[1]: Stopping Ceph mon.vm06 for 9c59102a-1c48-11f1-b618-035af535377d...
2026-03-10T06:24:11.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06[58970]: 2026-03-10T06:24:11.349+0000 7f9a76311700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T06:24:11.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06[58970]: 2026-03-10T06:24:11.349+0000 7f9a76311700 -1 mon.vm06@1(peon) e2 *** Got Signal Terminated ***
2026-03-10T06:24:11.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local podman[98835]: 2026-03-10 06:24:11.417857651 +0000 UTC m=+0.089562601 container died 826078cd5cc7665d6070c2762ef357451a4069f87312449f1708939eeee65290 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS)
2026-03-10T06:24:11.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local podman[98835]: 2026-03-10 06:24:11.448932566 +0000 UTC m=+0.120637516 container remove 826078cd5cc7665d6070c2762ef357451a4069f87312449f1708939eeee65290 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, RELEASE=HEAD, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308)
2026-03-10T06:24:11.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local bash[98835]: ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06
2026-03-10T06:24:11.830 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm06.service: Deactivated successfully.
2026-03-10T06:24:11.830 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local systemd[1]: Stopped Ceph mon.vm06 for 9c59102a-1c48-11f1-b618-035af535377d.
2026-03-10T06:24:11.830 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm06.service: Consumed 4.400s CPU time.
2026-03-10T06:24:11.830 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local systemd[1]: Starting Ceph mon.vm06 for 9c59102a-1c48-11f1-b618-035af535377d...
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:11 vm06.local podman[98946]: 2026-03-10 06:24:11.978353352 +0000 UTC m=+0.027785378 container create 0f90bc9a714a355895f22ff59cdb0a528532adf2774a9e870257c966027c95f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223)
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local podman[98946]: 2026-03-10 06:24:12.020537435 +0000 UTC m=+0.069969461 container init 0f90bc9a714a355895f22ff59cdb0a528532adf2774a9e870257c966027c95f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local podman[98946]: 2026-03-10 06:24:12.025308137 +0000 UTC m=+0.074740152 container start 0f90bc9a714a355895f22ff59cdb0a528532adf2774a9e870257c966027c95f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm06, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS)
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local bash[98946]: 0f90bc9a714a355895f22ff59cdb0a528532adf2774a9e870257c966027c95f2
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local podman[98946]: 2026-03-10 06:24:11.968516337 +0000 UTC m=+0.017948372 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local systemd[1]: Started Ceph mon.vm06 for 9c59102a-1c48-11f1-b618-035af535377d.
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: set uid:gid to 167:167 (ceph:ceph)
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2
2026-03-10T06:24:12.083 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: pidfile_write: ignore empty --pid-file
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: load: jerasure load: lrc
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: RocksDB version: 7.9.2
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Git sha 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Compile date 2026-02-25 18:11:04
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: DB SUMMARY
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: DB Session ID: GTEWCLJHXL9A23ORN53W
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: CURRENT file: CURRENT
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: IDENTITY file: IDENTITY
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: MANIFEST file: MANIFEST-000010 size: 921 Bytes
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm06/store.db dir, Total Num: 1, files: 000021.sst
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm06/store.db: 000019.log size: 4201473 ;
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.error_if_exists: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.create_if_missing: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.paranoid_checks: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.env: 0x55cc0a196dc0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.fs: PosixFileSystem
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.info_log: 0x55cc0c5ff900
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_file_opening_threads: 16
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.statistics: (nil)
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.use_fsync: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_log_file_size: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.keep_log_file_num: 1000
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.recycle_log_file_num: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_fallocate: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_mmap_reads: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_mmap_writes: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.use_direct_reads: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.create_missing_column_families: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.db_log_dir:
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.wal_dir:
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.advise_random_on_open: 1
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.db_write_buffer_size: 0
2026-03-10T06:24:12.084 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.write_buffer_manager: 0x55cc0c603900
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.rate_limiter: (nil)
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.wal_recovery_mode: 2
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enable_thread_tracking: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enable_pipelined_write: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.unordered_write: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.row_cache: None
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.wal_filter: None
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_ingest_behind: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.two_write_queues: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.manual_wal_flush: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.wal_compression: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.atomic_flush: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.log_readahead_size: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.best_efforts_recovery: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.allow_data_in_errors: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.db_host_id: __hostname__
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_background_jobs: 2
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_background_compactions: -1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_subcompactions: 1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_total_wal_size: 0
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_open_files: -1
2026-03-10T06:24:12.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bytes_per_sync: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_readahead_size: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_background_flushes: -1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Compression algorithms supported:
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kZSTD supported: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kXpressCompression supported: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kBZip2Compression supported: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kZSTDNotFinalCompression supported: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kLZ4Compression supported: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kZlibCompression supported: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kLZ4HCCompression supported: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 	kSnappyCompression supported: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.merge_operator:
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_filter: None
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_filter_factory: None
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.sst_partitioner_factory: None
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cc0c5ff580)
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_top_level_index_and_filter: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_type: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_index_type: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_shortening: 1
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: checksum: 4
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: no_block_cache: 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache: 0x55cc0c6229b0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_name: BinnedLRUCache
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_options:
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: capacity : 536870912
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_shard_bits : 4
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: strict_capacity_limit : 0
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: high_pri_pool_ratio: 0.000
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_compressed: (nil)
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: persistent_cache: (nil)
2026-03-10T06:24:12.086 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size: 4096
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size_deviation: 10
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_restart_interval: 16
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_block_restart_interval: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_block_size: 4096
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: partition_filters: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: use_delta_encoding: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: filter_policy: bloomfilter
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: whole_key_filtering: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: verify_compression: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: read_amp_bytes_per_bit: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: format_version: 5
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_index_compression: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_align: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_auto_readahead_size: 262144
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: prepopulate_block_cache: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: initial_auto_readahead_size: 8192
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_file_reads_for_auto_readahead: 2
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.write_buffer_size: 33554432
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_write_buffer_number: 2
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression: NoCompression
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression: Disabled
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.prefix_extractor: nullptr
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.num_levels: 7
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.level: 32767
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.strategy: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-10T06:24:12.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.enabled: false
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.target_file_size_base: 67108864
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-10T06:24:12.088
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 
2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T06:24:12.088 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.inplace_update_support: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.bloom_locality: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.max_successive_merges: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T06:24:12.088 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.ttl: 2592000 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T06:24:12.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enable_blob_files: false 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.min_blob_size: 0 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: 
Options.blob_compaction_readahead_size: 0 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 23, last_sequence is 9889, log_number is 19,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 19 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 19 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d7f668b6-f670-46ee-a723-f9534d755530 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123852066736, "job": 1, "event": "recovery_started", "wal_files": [19]} 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #19 mode 2 2026-03-10T06:24:12.089 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123852081909, "cf_name": "default", "job": 1, 
"event": "table_file_creation", "file_number": 24, "file_size": 2641535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9894, "largest_seqno": 10887, "table_properties": {"data_size": 2636496, "index_size": 2778, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 12435, "raw_average_key_size": 25, "raw_value_size": 2626073, "raw_average_value_size": 5437, "num_data_blocks": 126, "num_entries": 483, "num_filter_entries": 483, "num_deletions": 6, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773123852, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d7f668b6-f670-46ee-a723-f9534d755530", "db_session_id": "GTEWCLJHXL9A23ORN53W", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} 2026-03-10T06:24:12.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773123852083878, "job": 1, "event": "recovery_finished"} 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:5047] Creating manifest 26 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. 
max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm06/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cc0c624e00 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: DB pointer 0x55cc0c634000 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** DB Stats ** 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T06:24:12.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L0 1/0 2.52 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 262.4 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L6 1/0 7.92 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Sum 2/0 10.44 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 262.4 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 262.4 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 262.4 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative compaction: 0.00 GB write, 90.62 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval compaction: 0.00 GB write, 90.62 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T06:24:12.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache BinnedLRUCache@0x55cc0c6229b0#2 capacity: 512.00 MB usage: 4.11 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.7e-05 secs_since: 0 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,1.28 KB,0.000244379%) IndexBlock(1,2.83 KB,0.000539422%) Misc(1,0.00 KB,0%) 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: starting mon.vm06 rank 1 at public addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] at bind addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon_data /var/lib/ceph/mon/ceph-vm06 fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???) 
e2 preinit fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).mds e11 new map 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).mds e11 print_map 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: e11 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:12.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout: legacy client fscid: 1 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Filesystem 'cephfs' (1) 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: fs_name cephfs 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: epoch 9 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: tableserver 0 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: root 0 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_timeout 60 2026-03-10T06:24:12.369 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_autoclose 300 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_file_size 1099511627776 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_xattr_size 65536 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: required_client_features {} 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure 0 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure_osd_epoch 0 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_mds 1 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: in 0 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: up {0=14508} 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: failed 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: damaged 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: stopped 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_pools [3] 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_pool 2 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: inline_data enabled 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: balancer 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: bal_rank_mask -1 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: standby_count_wanted 1 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: qdb_cluster leader: 0 members: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 
[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Standby daemons: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-10T06:24:12.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:12 vm06.local ceph-mon[98962]: mon.vm06@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: pgmap v7: 65 pgs: 65 active+clean; 298 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.0 MiB/s wr, 60 op/s 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: mon.vm04 calling monitor election 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: mon.vm06 calling monitor election 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: monmap epoch 3 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: last_changed 2026-03-10T06:24:12.213096+0000 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: created 2026-03-10T06:16:42.736031+0000 2026-03-10T06:24:13.428 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: min_mon_release 19 (squid) 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: election_strategy: 1 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: 0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: mgrmap e39: vm04.exdvdb(active, since 8s), standbys: vm06.wwotdr 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:13.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:13.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: pgmap v7: 65 pgs: 65 active+clean; 298 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.0 MiB/s wr, 60 op/s 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: mon.vm04 calling monitor election 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: mon.vm06 calling monitor election 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: mon.vm04 is new leader, mons vm04,vm06 in quorum (ranks 0,1) 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: monmap epoch 3 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local 
ceph-mon[115743]: fsid 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: last_changed 2026-03-10T06:24:12.213096+0000 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: created 2026-03-10T06:16:42.736031+0000 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: min_mon_release 19 (squid) 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: election_strategy: 1 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: 0: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: mgrmap e39: vm04.exdvdb(active, since 8s), standbys: vm06.wwotdr 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: fs 
cephfs has deprecated feature inline_data enabled. 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:13.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:14.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:14.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:24:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: pgmap v8: 65 pgs: 65 active+clean; 298 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 870 KiB/s rd, 814 KiB/s wr, 46 op/s 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.761 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: pgmap v8: 65 pgs: 65 active+clean; 298 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 870 KiB/s rd, 814 KiB/s wr, 46 op/s 2026-03-10T06:24:15.761 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.761 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:15.761 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:15.761 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:15.761 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: Reconfiguring mon.vm04 (monmap changed)... 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: Reconfiguring daemon mon.vm04 on vm04 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:16.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:24:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:24:16.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: Reconfiguring mon.vm04 (monmap changed)... 
2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: Reconfiguring daemon mon.vm04 on vm04 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.exdvdb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:24:17.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:24:17.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: pgmap v9: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 87 op/s 2026-03-10T06:24:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: Reconfiguring mgr.vm04.exdvdb (monmap changed)... 2026-03-10T06:24:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: Reconfiguring daemon mgr.vm04.exdvdb on vm04 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", 
"allow r", "osd", "allow r"]}]': finished 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:17.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: pgmap v9: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 87 op/s 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: 
Reconfiguring mgr.vm04.exdvdb (monmap changed)... 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: Reconfiguring daemon mgr.vm04.exdvdb on vm04 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T06:24:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local 
ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:18.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring ceph-exporter.vm04 (monmap changed)... 2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Unable to update caps for client.ceph-exporter.vm04 2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring crash.vm04 (monmap changed)... 
2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring daemon crash.vm04 on vm04 2026-03-10T06:24:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring osd.0 (monmap changed)... 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.0 on vm04 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.118 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:24:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring ceph-exporter.vm04 (monmap changed)... 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Unable to update caps for client.ceph-exporter.vm04 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring crash.vm04 (monmap changed)... 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring daemon crash.vm04 on vm04 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring osd.0 (monmap changed)... 
2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.0 on vm04 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:24:19.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: pgmap v10: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T06:24:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: Reconfiguring osd.1 (monmap changed)... 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.1 on vm04 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: Reconfiguring osd.2 (monmap changed)... 
2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.2 on vm04 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", 
"osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: 
dispatch 2026-03-10T06:24:19.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: pgmap v10: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: Reconfiguring osd.1 (monmap changed)... 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.1 on vm04 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.2 on vm04 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:19.965 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: dispatch 2026-03-10T06:24:19.965 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring mds.cephfs.vm04.hdxbzv (monmap changed)... 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring daemon mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring mds.cephfs.vm04.hsrsig (monmap changed)... 
2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring ceph-exporter.vm06 (monmap changed)... 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Unable to update caps for client.ceph-exporter.vm06 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring daemon ceph-exporter.vm06 on vm06 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: pgmap v11: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring crash.vm06 (monmap changed)... 
2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring daemon crash.vm06 on vm06 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:20.969 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:20.970 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring mgr.vm06.wwotdr (monmap changed)... 
2026-03-10T06:24:20.970 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:24:20.970 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:24:20.970 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:20.970 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:20 vm06.local ceph-mon[98962]: Reconfiguring daemon mgr.vm06.wwotdr on vm06 2026-03-10T06:24:21.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring mds.cephfs.vm04.hdxbzv (monmap changed)... 2026-03-10T06:24:21.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring daemon mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:24:21.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring mds.cephfs.vm04.hsrsig (monmap changed)... 2026-03-10T06:24:21.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring ceph-exporter.vm06 (monmap changed)... 
2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Unable to update caps for client.ceph-exporter.vm06 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring daemon ceph-exporter.vm06 on vm06 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: pgmap v11: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring crash.vm06 (monmap changed)... 
2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring daemon crash.vm06 on vm06 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring mgr.vm06.wwotdr (monmap changed)... 
2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.wwotdr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:21.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:20 vm04.local ceph-mon[115743]: Reconfiguring daemon mgr.vm06.wwotdr on vm06 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: Reconfiguring mon.vm06 (monmap changed)... 
2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: Reconfiguring daemon mon.vm06 on vm06 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: Reconfiguring osd.3 (monmap changed)... 
2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.3 on vm06 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:24:22.167 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: Reconfiguring mon.vm06 (monmap changed)... 
2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T06:24:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: Reconfiguring daemon mon.vm06 on vm06 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: Reconfiguring osd.3 (monmap changed)... 
2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.3 on vm06 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:24:22.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: Reconfiguring osd.4 (monmap changed)... 
2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.4 on vm06 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: pgmap v12: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 118 op/s 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: Reconfiguring osd.5 (monmap changed)... 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: Reconfiguring daemon osd.5 on vm06 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: 
Reconfiguring mds.cephfs.vm06.wzhqon (monmap changed)... 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: Reconfiguring daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: Reconfiguring osd.4 (monmap changed)... 
2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.4 on vm06 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: pgmap v12: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 118 op/s 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: Reconfiguring osd.5 (monmap changed)... 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: Reconfiguring daemon osd.5 on vm06 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: 
Reconfiguring mds.cephfs.vm06.wzhqon (monmap changed)... 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: Reconfiguring daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:24:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: Reconfiguring mds.cephfs.vm06.afscws (monmap changed)... 
2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: Reconfiguring daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all mon 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]': finished 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:24.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: Reconfiguring mds.cephfs.vm06.afscws (monmap changed)... 
2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: Reconfiguring daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all mon 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]': finished 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:24.665 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:24.666 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:25.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:25 vm04.local ceph-mon[115743]: pgmap v13: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 84 op/s 2026-03-10T06:24:25.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:25 vm04.local ceph-mon[115743]: Upgrade: Updating crash.vm04 (1/2) 2026-03-10T06:24:25.621 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:25 vm04.local ceph-mon[115743]: Deploying daemon crash.vm04 on vm04 2026-03-10T06:24:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:25 vm06.local 
ceph-mon[98962]: pgmap v13: 65 pgs: 65 active+clean; 308 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 84 op/s 2026-03-10T06:24:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:25 vm06.local ceph-mon[98962]: Upgrade: Updating crash.vm04 (1/2) 2026-03-10T06:24:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:25 vm06.local ceph-mon[98962]: Deploying daemon crash.vm04 on vm04 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.130+0000 7f83fce18700 1 -- 192.168.123.104:0/1123372197 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 msgr2=0x7f83f8110dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.130+0000 7f83fce18700 1 --2- 192.168.123.104:0/1123372197 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f8110dc0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f83ec007720 tx=0x7f83ec007a30 comp rx=0 tx=0).stop 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.131+0000 7f83fce18700 1 -- 192.168.123.104:0/1123372197 shutdown_connections 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.131+0000 7f83fce18700 1 --2- 192.168.123.104:0/1123372197 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f8110dc0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.131+0000 7f83fce18700 1 --2- 192.168.123.104:0/1123372197 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 0x7f83f81081b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.131+0000 7f83fce18700 1 -- 
192.168.123.104:0/1123372197 >> 192.168.123.104:0/1123372197 conn(0x7f83f806cb20 msgr2=0x7f83f806cf20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 -- 192.168.123.104:0/1123372197 shutdown_connections 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 -- 192.168.123.104:0/1123372197 wait complete. 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 Processor -- start 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 -- start start 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 0x7f83f81a13b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f819c3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83f819c980 con 0x7f83f8108780 2026-03-10T06:24:26.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.132+0000 7f83fce18700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83f819cac0 con 0x7f83f8107de0 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f819c3b0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f819c3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:42674/0 (socket says 192.168.123.104:42674) 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 -- 192.168.123.104:0/1223829642 learned_addr learned my addr 192.168.123.104:0/1223829642 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f659c700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 0x7f83f81a13b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 -- 192.168.123.104:0/1223829642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 msgr2=0x7f83f81a13b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 0x7f83f81a13b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 -- 192.168.123.104:0/1223829642 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f83ec007350 con 0x7f83f8108780 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83f5d9b700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f819c3b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f83ec00aa00 tx=0x7f83ec00aa30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f83ec003a40 con 0x7f83f8108780 2026-03-10T06:24:26.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f83ec003ba0 con 0x7f83f8108780 2026-03-10T06:24:26.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f83ec01c660 con 0x7f83f8108780 2026-03-10T06:24:26.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f83f819cd40 con 0x7f83f8108780 2026-03-10T06:24:26.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.133+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f83f819d1b0 con 0x7f83f8108780 2026-03-10T06:24:26.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.135+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f83ec012480 con 0x7f83f8108780 2026-03-10T06:24:26.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.135+0000 7f83e77fe700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 0x7f83e0079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.135+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f83ec09a620 con 0x7f83f8108780 2026-03-10T06:24:26.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.136+0000 7f83f659c700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 0x7f83e0079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.134+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f83f810e4c0 con 0x7f83f8108780 2026-03-10T06:24:26.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.138+0000 7f83f659c700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 0x7f83e0079da0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f83f0009510 tx=0x7f83f00093a0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.146 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.139+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== 
mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f83ec062d00 con 0x7f83f8108780 2026-03-10T06:24:26.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.292+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f83f819d8e0 con 0x7f83e00778f0 2026-03-10T06:24:26.296 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.295+0000 7f83e77fe700 1 -- 192.168.123.104:0/1223829642 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f83f819d8e0 con 0x7f83e00778f0 2026-03-10T06:24:26.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 msgr2=0x7f83e0079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.302 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 0x7f83e0079da0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f83f0009510 tx=0x7f83f00093a0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 msgr2=0x7f83f819c3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 
0x7f83f819c3b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f83ec00aa00 tx=0x7f83ec00aa30 comp rx=0 tx=0).stop 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 shutdown_connections 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f83e00778f0 0x7f83e0079da0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f83f8107de0 0x7f83f81a13b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 --2- 192.168.123.104:0/1223829642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f83f8108780 0x7f83f819c3b0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 >> 192.168.123.104:0/1223829642 conn(0x7f83f806cb20 msgr2=0x7f83f810b600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 shutdown_connections 2026-03-10T06:24:26.303 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.302+0000 7f83fce18700 1 -- 192.168.123.104:0/1223829642 wait complete. 
2026-03-10T06:24:26.314 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:24:26.388 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.387+0000 7fb7135dc700 1 -- 192.168.123.104:0/1730475456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c10c8b0 msgr2=0x7fb70c10cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.388 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.387+0000 7fb7135dc700 1 --2- 192.168.123.104:0/1730475456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c10c8b0 0x7fb70c10cc80 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fb708009a60 tx=0x7fb708009d70 comp rx=0 tx=0).stop 2026-03-10T06:24:26.388 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 -- 192.168.123.104:0/1730475456 shutdown_connections 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 --2- 192.168.123.104:0/1730475456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb70c071e40 0x7fb70c0722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 --2- 192.168.123.104:0/1730475456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c10c8b0 0x7fb70c10cc80 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 -- 192.168.123.104:0/1730475456 >> 192.168.123.104:0/1730475456 conn(0x7fb70c06c6c0 msgr2=0x7fb70c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 -- 192.168.123.104:0/1730475456 shutdown_connections 2026-03-10T06:24:26.389 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 -- 192.168.123.104:0/1730475456 wait complete. 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.388+0000 7fb7135dc700 1 Processor -- start 2026-03-10T06:24:26.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb7135dc700 1 -- start start 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb7135dc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb70c071e40 0x7fb70c07ced0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb7135dc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb7135dc700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb70c081a50 con 0x7fb70c071e40 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb7135dc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb70c081bc0 con 0x7fb70c07d410 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:56344/0 (socket says 192.168.123.104:56344) 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 -- 192.168.123.104:0/2083631231 learned_addr learned my addr 192.168.123.104:0/2083631231 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 -- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb70c071e40 msgr2=0x7fb70c07ced0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb70c071e40 0x7fb70c07ced0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.389+0000 7fb710b77700 1 -- 192.168.123.104:0/2083631231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb708009710 con 0x7fb70c07d410 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.390+0000 7fb710b77700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb70400ea00 tx=0x7fb70400edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.390+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb70400d4e0 con 
0x7fb70c07d410 2026-03-10T06:24:26.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.390+0000 7fb7135dc700 1 -- 192.168.123.104:0/2083631231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb70c081ea0 con 0x7fb70c07d410 2026-03-10T06:24:26.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.390+0000 7fb7135dc700 1 -- 192.168.123.104:0/2083631231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb70c0823f0 con 0x7fb70c07d410 2026-03-10T06:24:26.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.391+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb704018470 con 0x7fb70c07d410 2026-03-10T06:24:26.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.391+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb70400f660 con 0x7fb70c07d410 2026-03-10T06:24:26.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.392+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb704015070 con 0x7fb70c07d410 2026-03-10T06:24:26.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.393+0000 7fb7027fc700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 0x7fb6f807c100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.393+0000 7fb711378700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 0x7fb6f807c100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.393+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fb70409a510 con 0x7fb70c07d410 2026-03-10T06:24:26.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.393+0000 7fb711378700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 0x7fb6f807c100 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb708009a60 tx=0x7fb70800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.393+0000 7fb7135dc700 1 -- 192.168.123.104:0/2083631231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb6f0005320 con 0x7fb70c07d410 2026-03-10T06:24:26.402 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.399+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb704062b70 con 0x7fb70c07d410 2026-03-10T06:24:26.545 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.544+0000 7fb7135dc700 1 -- 192.168.123.104:0/2083631231 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb6f0000bf0 con 0x7fb6f8079c50 2026-03-10T06:24:26.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.546+0000 7fb7027fc700 1 -- 192.168.123.104:0/2083631231 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 
(secure 0 0 0) 0x7fb6f0000bf0 con 0x7fb6f8079c50 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 msgr2=0x7fb6f807c100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 0x7fb6f807c100 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb708009a60 tx=0x7fb70800b540 comp rx=0 tx=0).stop 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 msgr2=0x7fb70c07d880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb70400ea00 tx=0x7fb70400edc0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 shutdown_connections 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fb6f8079c50 0x7fb6f807c100 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 --2- 
192.168.123.104:0/2083631231 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb70c071e40 0x7fb70c07ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 --2- 192.168.123.104:0/2083631231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb70c07d410 0x7fb70c07d880 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 >> 192.168.123.104:0/2083631231 conn(0x7fb70c06c6c0 msgr2=0x7fb70c070050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 shutdown_connections 2026-03-10T06:24:26.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.549+0000 7fb6f7fff700 1 -- 192.168.123.104:0/2083631231 wait complete. 
2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- 192.168.123.104:0/3041215763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c071e40 msgr2=0x7f6a3c0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 --2- 192.168.123.104:0/3041215763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c071e40 0x7f6a3c0722b0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f6a34008280 tx=0x7f6a34008590 comp rx=0 tx=0).stop 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- 192.168.123.104:0/3041215763 shutdown_connections 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 --2- 192.168.123.104:0/3041215763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c071e40 0x7f6a3c0722b0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 --2- 192.168.123.104:0/3041215763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c10c8b0 0x7f6a3c10cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- 192.168.123.104:0/3041215763 >> 192.168.123.104:0/3041215763 conn(0x7f6a3c06c6c0 msgr2=0x7f6a3c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- 192.168.123.104:0/3041215763 shutdown_connections 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- 192.168.123.104:0/3041215763 
wait complete. 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 Processor -- start 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- start start 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c10c8b0 0x7f6a3c07ced0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a3c081a50 con 0x7f6a3c10c8b0 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a416ec700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a3c081bc0 con 0x7f6a3c07d410 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a3affd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c10c8b0 0x7f6a3c07ced0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.627+0000 7f6a3a7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:56360/0 (socket says 192.168.123.104:56360) 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 -- 192.168.123.104:0/380086481 learned_addr learned my addr 192.168.123.104:0/380086481 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 -- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c10c8b0 msgr2=0x7f6a3c07ced0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c10c8b0 0x7f6a3c07ced0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 -- 192.168.123.104:0/380086481 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a34007ed0 con 0x7f6a3c07d410 2026-03-10T06:24:26.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a3a7fc700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f6a34008250 tx=0x7f6a34010f70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.628 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a34007620 con 0x7f6a3c07d410 2026-03-10T06:24:26.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a416ec700 1 -- 192.168.123.104:0/380086481 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a3c081e40 con 0x7f6a3c07d410 2026-03-10T06:24:26.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.628+0000 7f6a416ec700 1 -- 192.168.123.104:0/380086481 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a3c082390 con 0x7f6a3c07d410 2026-03-10T06:24:26.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.629+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6a3400a2e0 con 0x7f6a3c07d410 2026-03-10T06:24:26.629 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.629+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a34004490 con 0x7f6a3c07d410 2026-03-10T06:24:26.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.630+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f6a34016020 con 0x7f6a3c07d410 2026-03-10T06:24:26.630 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.630+0000 7f6a416ec700 1 -- 192.168.123.104:0/380086481 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a3c075730 con 0x7f6a3c07d410 2026-03-10T06:24:26.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.630+0000 7f6a23fff700 1 --2- 
192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 0x7f6a24079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.630+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f6a3409b1c0 con 0x7f6a3c07d410 2026-03-10T06:24:26.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.631+0000 7f6a3affd700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 0x7f6a24079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.631 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.631+0000 7f6a3affd700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 0x7f6a24079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6a2c005950 tx=0x7f6a2c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.634 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.633+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6a340638d0 con 0x7f6a3c07d410 2026-03-10T06:24:26.772 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.772+0000 7f6a416ec700 1 -- 192.168.123.104:0/380086481 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6a3c07e400 con 0x7f6a24077a40 2026-03-10T06:24:26.779 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.778+0000 7f6a23fff700 1 -- 192.168.123.104:0/380086481 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6a3c07e400 con 0x7f6a24077a40
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (74s) 19s ago 7m 15.6M - 0.25.0 c8568f914cd2 85edc8fe2fc1
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (7m) 19s ago 7m 8472k - 18.2.0 dc2bc1663786 019b79596e39
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (6m) 13s ago 6m 11.3M - 18.2.0 dc2bc1663786 02ba67f7b99e
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 starting - - - -
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (6m) 13s ago 6m 7411k - 18.2.0 dc2bc1663786 a60199b09d41
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (57s) 19s ago 6m 81.6M - 10.4.0 c8b91775d855 28b34ae2f2b0
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (4m) 19s ago 4m 262M - 18.2.0 dc2bc1663786 342935a5b39a
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (4m) 19s ago 4m 17.4M - 18.2.0 dc2bc1663786 9bbaa4df4333
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (4m) 13s ago 4m 17.6M - 18.2.0 dc2bc1663786 dc29bd0a94dd
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (4m) 13s ago 4m 289M - 18.2.0 dc2bc1663786 5f7b9f10b346
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (2m) 19s ago 7m 582M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (2m) 13s ago 6m 494M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (29s) 19s ago 7m 43.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (14s) 13s ago 6m 38.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (108s) 19s ago 7m 9.92M - 1.7.0 72c9c2088986 f88b18573eef
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (104s) 13s ago 6m 9504k - 1.7.0 72c9c2088986 32cea90d1988
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (5m) 19s ago 5m 292M 4096M 18.2.0 dc2bc1663786 23249edb3d75
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (5m) 19s ago 5m 283M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (5m) 19s ago 5m 249M 4096M 18.2.0 dc2bc1663786 e5a533082c80
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (5m) 13s ago 5m 367M 4096M 18.2.0 dc2bc1663786 62400287eca0
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (5m) 13s ago 5m 311M 4096M 18.2.0 dc2bc1663786 dcd395dfe220
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (5m) 13s ago 5m 281M 4096M 18.2.0 dc2bc1663786 862da087fc06
2026-03-10T06:24:26.779 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (84s) 19s ago 6m 47.0M - 2.51.0
1d3b7f56885b 9e491f823407 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 msgr2=0x7f6a24079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 0x7f6a24079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6a2c005950 tx=0x7f6a2c0058e0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 msgr2=0x7f6a3c07d880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f6a34008250 tx=0x7f6a34010f70 comp rx=0 tx=0).stop 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 shutdown_connections 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f6a24077a40 0x7f6a24079ef0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 --2- 192.168.123.104:0/380086481 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6a3c10c8b0 0x7f6a3c07ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 --2- 192.168.123.104:0/380086481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a3c07d410 0x7f6a3c07d880 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 >> 192.168.123.104:0/380086481 conn(0x7f6a3c06c6c0 msgr2=0x7f6a3c070900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 shutdown_connections 2026-03-10T06:24:26.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.781+0000 7f6a21ffb700 1 -- 192.168.123.104:0/380086481 wait complete. 
2026-03-10T06:24:26.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 -- 192.168.123.104:0/991774565 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548071e40 msgr2=0x7f55480722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 --2- 192.168.123.104:0/991774565 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548071e40 0x7f55480722b0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f554000d3f0 tx=0x7f554000d700 comp rx=0 tx=0).stop 2026-03-10T06:24:26.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 -- 192.168.123.104:0/991774565 shutdown_connections 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 --2- 192.168.123.104:0/991774565 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548071e40 0x7f55480722b0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 --2- 192.168.123.104:0/991774565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 0x7f554810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 -- 192.168.123.104:0/991774565 >> 192.168.123.104:0/991774565 conn(0x7f554806c6c0 msgr2=0x7f554806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 -- 192.168.123.104:0/991774565 shutdown_connections 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.860+0000 7f554d0b6700 1 -- 192.168.123.104:0/991774565 wait 
complete. 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 Processor -- start 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 -- start start 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 0x7f5548137930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55481333a0 con 0x7f5548132930 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554d0b6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5548133510 con 0x7f554810c8b0 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I 
am v2:192.168.123.104:42730/0 (socket says 192.168.123.104:42730) 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 -- 192.168.123.104:0/2572754113 learned_addr learned my addr 192.168.123.104:0/2572754113 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f5546d9d700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 0x7f5548137930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 -- 192.168.123.104:0/2572754113 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 msgr2=0x7f5548137930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 0x7f5548137930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.861+0000 7f554659c700 1 -- 192.168.123.104:0/2572754113 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5540007ed0 con 0x7f5548132930 2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.862+0000 7f554659c700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f5540003c60 tx=0x7f5540003d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:24:26.862 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.862+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f554001c070 con 0x7f5548132930 2026-03-10T06:24:26.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.862+0000 7f554d0b6700 1 -- 192.168.123.104:0/2572754113 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5548133790 con 0x7f5548132930 2026-03-10T06:24:26.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.862+0000 7f554d0b6700 1 -- 192.168.123.104:0/2572754113 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f554807f140 con 0x7f5548132930 2026-03-10T06:24:26.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.863+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f554000fcf0 con 0x7f5548132930 2026-03-10T06:24:26.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.863+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5540017d40 con 0x7f5548132930 2026-03-10T06:24:26.863 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.863+0000 7f554d0b6700 1 -- 192.168.123.104:0/2572754113 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5534005320 con 0x7f5548132930 2026-03-10T06:24:26.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.866+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f554000f810 con 0x7f5548132930 2026-03-10T06:24:26.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.866+0000 
7f552ffff700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00 0x7f5530079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:26.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.867+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f5540013070 con 0x7f5548132930 2026-03-10T06:24:26.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.867+0000 7f5546d9d700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00 0x7f5530079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:26.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.869+0000 7f5546d9d700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00 0x7f5530079fb0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f5538009990 tx=0x7f5538008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:26.871 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:26.870+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f55400643e0 con 0x7f5548132930 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: pgmap v14: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.9 MiB/s rd, 2.0 MiB/s wr, 141 op/s 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: Upgrade: Updating crash.vm06 (2/2) 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: Deploying daemon crash.vm06 on vm06 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='client.34128 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:27.054 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:26 vm06.local ceph-mon[98962]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:27.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.061+0000 7f554d0b6700 1 -- 192.168.123.104:0/2572754113 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 
-- 0x7f5534005cc0 con 0x7f5548132930 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: pgmap v14: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.9 MiB/s rd, 2.0 MiB/s wr, 141 op/s 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: Upgrade: Updating crash.vm06 (2/2) 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: Deploying daemon crash.vm06 on vm06 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local ceph-mon[115743]: from='client.34128 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:27.062 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:26 vm04.local 
ceph-mon[115743]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.065+0000 7f552ffff700 1 -- 192.168.123.104:0/2572754113 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7f554000fe60 con 0x7f5548132930
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout:{
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "mon": {
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": {
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "osd": {
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "mds": {
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "overall": {
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 10,
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout: }
2026-03-10T06:24:27.066 INFO:teuthology.orchestra.run.vm04.stdout:}
2026-03-10T06:24:27.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00 msgr2=0x7f5530079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:24:27.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00 0x7f5530079fb0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f5538009990 tx=0x7f5538008040 comp rx=0 tx=0).stop
2026-03-10T06:24:27.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 msgr2=0x7f5548132da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:24:27.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f5540003c60 tx=0x7f5540003d40 comp rx=0 tx=0).stop
2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 shutdown_connections
2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5530077b00
0x7f5530079fb0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f554810c8b0 0x7f5548137930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 --2- 192.168.123.104:0/2572754113 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5548132930 0x7f5548132da0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 >> 192.168.123.104:0/2572754113 conn(0x7f554806c6c0 msgr2=0x7f5548070010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.068+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 shutdown_connections 2026-03-10T06:24:27.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.069+0000 7f552dffb700 1 -- 192.168.123.104:0/2572754113 wait complete. 
2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- 192.168.123.104:0/2505275395 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c071e40 msgr2=0x7f5c2c0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 --2- 192.168.123.104:0/2505275395 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c071e40 0x7f5c2c0722b0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f5c2400d3f0 tx=0x7f5c2400d700 comp rx=0 tx=0).stop 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- 192.168.123.104:0/2505275395 shutdown_connections 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 --2- 192.168.123.104:0/2505275395 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c071e40 0x7f5c2c0722b0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 --2- 192.168.123.104:0/2505275395 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c10c8b0 0x7f5c2c10cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- 192.168.123.104:0/2505275395 >> 192.168.123.104:0/2505275395 conn(0x7f5c2c06c6c0 msgr2=0x7f5c2c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- 192.168.123.104:0/2505275395 shutdown_connections 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- 192.168.123.104:0/2505275395 
wait complete. 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 Processor -- start 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 -- start start 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.154+0000 7f5c33921700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c10c8b0 0x7f5c2c132790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c33921700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c33921700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c2c07eef0 con 0x7f5c2c10c8b0 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c33921700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c2c07f060 con 0x7f5c2c132cd0 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c30ebc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c30ebc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.104:56400/0 (socket says 192.168.123.104:56400) 2026-03-10T06:24:27.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.155+0000 7f5c30ebc700 1 -- 192.168.123.104:0/4196671823 learned_addr learned my addr 192.168.123.104:0/4196671823 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:27.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.156+0000 7f5c30ebc700 1 -- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c10c8b0 msgr2=0x7f5c2c132790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.156+0000 7f5c30ebc700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c10c8b0 0x7f5c2c132790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.156+0000 7f5c30ebc700 1 -- 192.168.123.104:0/4196671823 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5c24007ed0 con 0x7f5c2c132cd0 2026-03-10T06:24:27.158 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.157+0000 7f5c30ebc700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f5c24003c60 tx=0x7f5c24004b40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:27.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.158+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c2401c070 con 0x7f5c2c132cd0 2026-03-10T06:24:27.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.159+0000 
7f5c33921700 1 -- 192.168.123.104:0/4196671823 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5c2c07f280 con 0x7f5c2c132cd0 2026-03-10T06:24:27.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.159+0000 7f5c33921700 1 -- 192.168.123.104:0/4196671823 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5c2c07f770 con 0x7f5c2c132cd0 2026-03-10T06:24:27.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.159+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5c2400deb0 con 0x7f5c2c132cd0 2026-03-10T06:24:27.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.159+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c24017c90 con 0x7f5c2c132cd0 2026-03-10T06:24:27.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.160+0000 7f5c33921700 1 -- 192.168.123.104:0/4196671823 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5c10005320 con 0x7f5c2c132cd0 2026-03-10T06:24:27.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.160+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f5c24017420 con 0x7f5c2c132cd0 2026-03-10T06:24:27.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.161+0000 7f5c227fc700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 0x7f5c18079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.161+0000 7f5c227fc700 1 -- 
192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f5c24013070 con 0x7f5c2c132cd0 2026-03-10T06:24:27.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.161+0000 7f5c316bd700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 0x7f5c18079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.162+0000 7f5c316bd700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 0x7f5c18079ef0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f5c280098a0 tx=0x7f5c28006d90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:27.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.166+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5c24063ec0 con 0x7f5c2c132cd0 2026-03-10T06:24:27.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.318+0000 7f5c33921700 1 -- 192.168.123.104:0/4196671823 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f5c10005cc0 con 0x7f5c2c132cd0 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.319+0000 7f5c227fc700 1 -- 192.168.123.104:0/4196671823 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1944 (secure 0 0 0) 0x7f5c240257b0 con 0x7f5c2c132cd0 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:e11 
2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:24:27.320 
INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:27.320 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508} 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members: 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:27.321 
INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:27.321 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 msgr2=0x7f5c18079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 0x7f5c18079ef0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f5c280098a0 tx=0x7f5c28006d90 comp rx=0 tx=0).stop 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 msgr2=0x7f5c2c133140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 
crypto rx=0x7f5c24003c60 tx=0x7f5c24004b40 comp rx=0 tx=0).stop 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 shutdown_connections 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077a40 0x7f5c18079ef0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c10c8b0 0x7f5c2c132790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 --2- 192.168.123.104:0/4196671823 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c132cd0 0x7f5c2c133140 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 >> 192.168.123.104:0/4196671823 conn(0x7f5c2c06c6c0 msgr2=0x7f5c2c06ff90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 shutdown_connections 2026-03-10T06:24:27.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.323+0000 7f5c17fff700 1 -- 192.168.123.104:0/4196671823 wait complete. 
2026-03-10T06:24:27.325 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 -- 192.168.123.104:0/764326775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f896010eab0 msgr2=0x7f896010ee80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 --2- 192.168.123.104:0/764326775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f896010eab0 0x7f896010ee80 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f8950009a60 tx=0x7f8950009d70 comp rx=0 tx=0).stop 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 -- 192.168.123.104:0/764326775 shutdown_connections 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 --2- 192.168.123.104:0/764326775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8960071b60 0x7f8960071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 --2- 192.168.123.104:0/764326775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f896010eab0 0x7f896010ee80 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 -- 192.168.123.104:0/764326775 >> 192.168.123.104:0/764326775 conn(0x7f896006c6c0 msgr2=0x7f896006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 -- 192.168.123.104:0/764326775 shutdown_connections 2026-03-10T06:24:27.400 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.399+0000 7f8966a01700 1 -- 192.168.123.104:0/764326775 wait complete. 2026-03-10T06:24:27.400 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 Processor -- start 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 -- start start 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 0x7f89601a4fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f896010eab0 0x7f89601a54e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89601a5bc0 con 0x7f896010eab0 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f8966a01700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89601a9950 con 0x7f8960071b60 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.400+0000 7f895ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 0x7f89601a4fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 0x7f89601a4fa0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:56432/0 (socket says 192.168.123.104:56432) 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895ffff700 1 -- 192.168.123.104:0/3440263456 learned_addr learned my addr 192.168.123.104:0/3440263456 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895f7fe700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f896010eab0 0x7f89601a54e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895f7fe700 1 -- 192.168.123.104:0/3440263456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 msgr2=0x7f89601a4fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895f7fe700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 0x7f89601a4fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895f7fe700 1 -- 192.168.123.104:0/3440263456 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8950009710 con 0x7f896010eab0 2026-03-10T06:24:27.401 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.401+0000 7f895f7fe700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f896010eab0 0x7f89601a54e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto 
rx=0x7f895800c3a0 tx=0x7f895800c6b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.715+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f895800e030 con 0x7f896010eab0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.715+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89601a9b50 con 0x7f896010eab0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.715+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89601aa0a0 con 0x7f896010eab0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.716+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f896004f2a0 con 0x7f896010eab0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.716+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f895800f040 con 0x7f896010eab0 2026-03-10T06:24:27.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.716+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8958014650 con 0x7f896010eab0 2026-03-10T06:24:27.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.718+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7f8958007930 con 0x7f896010eab0 2026-03-10T06:24:27.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.719+0000 7f895d7fa700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 0x7f8948079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.719+0000 7f895ffff700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 0x7f8948079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.720+0000 7f895ffff700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 0x7f8948079eb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8950009a30 tx=0x7f895000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:27.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.720+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f895801a2b0 con 0x7f896010eab0 2026-03-10T06:24:27.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.722+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f89580629b0 con 0x7f896010eab0 2026-03-10T06:24:27.889 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.888+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 --> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f89601aa440 con 0x7f8948077a00 2026-03-10T06:24:27.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.890+0000 7f895d7fa700 1 -- 192.168.123.104:0/3440263456 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f89601aa440 con 0x7f8948077a00 2026-03-10T06:24:27.890 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "mon", 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "mgr" 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "4/23 daemons upgraded", 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading crash daemons", 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:24:27.891 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 msgr2=0x7f8948079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.894 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 0x7f8948079eb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8950009a30 tx=0x7f895000b540 comp rx=0 tx=0).stop 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f896010eab0 msgr2=0x7f89601a54e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f896010eab0 0x7f89601a54e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f895800c3a0 tx=0x7f895800c6b0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 shutdown_connections 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8948077a00 0x7f8948079eb0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8960071b60 0x7f89601a4fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 --2- 192.168.123.104:0/3440263456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f896010eab0 0x7f89601a54e0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.893+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 >> 192.168.123.104:0/3440263456 conn(0x7f896006c6c0 msgr2=0x7f8960118230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.894+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 shutdown_connections 2026-03-10T06:24:27.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.894+0000 7f8966a01700 1 -- 192.168.123.104:0/3440263456 wait complete. 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 -- 192.168.123.104:0/1329774776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc10eab0 msgr2=0x7f3dcc10ee80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 --2- 192.168.123.104:0/1329774776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc10eab0 0x7f3dcc10ee80 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3dc400b3a0 tx=0x7f3dc400b6b0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 -- 192.168.123.104:0/1329774776 shutdown_connections 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 --2- 192.168.123.104:0/1329774776 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dcc071b60 0x7f3dcc071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 --2- 192.168.123.104:0/1329774776 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc10eab0 0x7f3dcc10ee80 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.996+0000 7f3dd4092700 1 -- 192.168.123.104:0/1329774776 >> 192.168.123.104:0/1329774776 conn(0x7f3dcc06c6c0 msgr2=0x7f3dcc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 -- 192.168.123.104:0/1329774776 shutdown_connections 2026-03-10T06:24:27.997 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 -- 192.168.123.104:0/1329774776 wait complete. 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 Processor -- start 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 -- start start 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dcc071b60 0x7f3dcc119540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dcc114f40 con 0x7f3dcc071b60 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.997+0000 7f3dd4092700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f3dcc1150b0 con 0x7f3dcc114590 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:56446/0 (socket says 192.168.123.104:56446) 2026-03-10T06:24:27.998 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 -- 192.168.123.104:0/619079879 learned_addr learned my addr 192.168.123.104:0/619079879 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:27.999 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 -- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dcc071b60 msgr2=0x7f3dcc119540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:27.999 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dcc071b60 0x7f3dcc119540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:27.999 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 -- 192.168.123.104:0/619079879 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dc400b050 con 0x7f3dcc114590 2026-03-10T06:24:27.999 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.998+0000 7f3dd162d700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3dbc00eb10 tx=0x7f3dbc00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:27.999 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.999+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dbc00cca0 con 0x7f3dcc114590 2026-03-10T06:24:28.000 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.999+0000 7f3dd4092700 1 -- 192.168.123.104:0/619079879 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dcc115390 con 0x7f3dcc114590 2026-03-10T06:24:28.000 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.999+0000 7f3dd4092700 1 -- 192.168.123.104:0/619079879 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dcc1b7cb0 con 0x7f3dcc114590 2026-03-10T06:24:28.000 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.999+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3dbc00ce00 con 0x7f3dcc114590 2026-03-10T06:24:28.000 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:27.999+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dbc018910 con 0x7f3dcc114590 2026-03-10T06:24:28.000 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.000+0000 7f3dd4092700 1 -- 192.168.123.104:0/619079879 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3db0005320 con 
0x7f3dcc114590 2026-03-10T06:24:28.002 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.001+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f3dbc018a70 con 0x7f3dcc114590 2026-03-10T06:24:28.002 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.001+0000 7f3dc2ffd700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 0x7f3db8079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:28.002 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.002+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f3dbc014070 con 0x7f3dcc114590 2026-03-10T06:24:28.002 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.002+0000 7f3dd1e2e700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 0x7f3db8079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:28.003 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.002+0000 7f3dd1e2e700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 0x7f3db8079fc0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f3dc400b3a0 tx=0x7f3dc400ba00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:28.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.003+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 
0x7f3dbc062f00 con 0x7f3dcc114590 2026-03-10T06:24:28.255 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.255+0000 7f3dd4092700 1 -- 192.168.123.104:0/619079879 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3db0005190 con 0x7f3dcc114590 2026-03-10T06:24:28.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:28 vm04.local ceph-mon[115743]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:28.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:28 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2572754113' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:28.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:28 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/4196671823' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:24:28.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.258+0000 7f3dc2ffd700 1 -- 192.168.123.104:0/619079879 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f3dbc062650 con 0x7f3dcc114590 2026-03-10T06:24:28.259 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:28.259 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:24:28.259 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 msgr2=0x7f3db8079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 0x7f3db8079fc0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f3dc400b3a0 tx=0x7f3dc400ba00 comp rx=0 tx=0).stop 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 msgr2=0x7f3dcc114a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3dbc00eb10 tx=0x7f3dbc00eed0 comp rx=0 tx=0).stop 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 shutdown_connections 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3db8077b10 0x7f3db8079fc0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 --2- 192.168.123.104:0/619079879 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dcc071b60 0x7f3dcc119540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 --2- 192.168.123.104:0/619079879 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3dcc114590 0x7f3dcc114a00 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 >> 192.168.123.104:0/619079879 conn(0x7f3dcc06c6c0 msgr2=0x7f3dcc0700b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 shutdown_connections 2026-03-10T06:24:28.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:28.264+0000 7f3dc0ff9700 1 -- 192.168.123.104:0/619079879 wait complete. 2026-03-10T06:24:28.542 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:28 vm06.local ceph-mon[98962]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:28.542 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:28 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2572754113' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:28.542 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:28 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/4196671823' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:24:29.430 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: from='client.34142 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:29.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: pgmap v15: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 97 op/s 2026-03-10T06:24:29.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:29.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:29.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:29.431 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:29 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/619079879' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:24:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: from='client.34142 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: pgmap v15: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 97 op/s 2026-03-10T06:24:29.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:29.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:29.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:29.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:29 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/619079879' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:24:30.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.649 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.649 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.649 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:30.649 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: pgmap v16: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 97 op/s 2026-03-10T06:24:31.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: pgmap v16: 65 pgs: 65 active+clean; 306 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 97 op/s 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:31.921 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: pgmap v17: 65 pgs: 65 active+clean; 300 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.8 MiB/s wr, 183 op/s 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.867 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]': finished 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T06:24:33.867 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: pgmap v17: 65 pgs: 65 active+clean; 300 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.8 MiB/s wr, 183 op/s 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local 
ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]': finished 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T06:24:33.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:34.427 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:34 vm04.local systemd[1]: Stopping Ceph osd.0 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:24:35.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all crash 2026-03-10T06:24:35.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T06:24:35.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: Upgrade: osd.0 is safe to restart 2026-03-10T06:24:35.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: Upgrade: Updating osd.0 2026-03-10T06:24:35.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: Deploying daemon osd.0 on vm04 2026-03-10T06:24:35.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all crash 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: Upgrade: osd.0 is safe to restart 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: Upgrade: Updating osd.0 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: Deploying daemon osd.0 on vm04 2026-03-10T06:24:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:35.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:24:34.769+0000 7fb565f44700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:24:35.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:24:34.769+0000 7fb565f44700 -1 osd.0 43 *** Got signal Terminated *** 2026-03-10T06:24:35.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:34 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[68076]: 2026-03-10T06:24:34.769+0000 7fb565f44700 -1 osd.0 43 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:24:36.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:35 vm04.local ceph-mon[115743]: pgmap v18: 65 pgs: 65 active+clean; 300 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.3 MiB/s wr, 143 op/s 2026-03-10T06:24:36.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:35 vm04.local ceph-mon[115743]: osd.0 marked itself 
down and dead 2026-03-10T06:24:36.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:35 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:24:36.100 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:35 vm04.local podman[124180]: 2026-03-10 06:24:35.838600329 +0000 UTC m=+1.338338315 container died 23249edb3d75967a1c62fc1816b7908ec756bfd2435c5cab2ad6212cd85bdf86 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD) 2026-03-10T06:24:36.100 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:35 vm04.local podman[124180]: 2026-03-10 06:24:35.889375101 +0000 UTC m=+1.389113087 container remove 23249edb3d75967a1c62fc1816b7908ec756bfd2435c5cab2ad6212cd85bdf86 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0, RELEASE=HEAD, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308) 2026-03-10T06:24:36.100 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:35 vm04.local bash[124180]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0 
2026-03-10T06:24:36.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:35 vm06.local ceph-mon[98962]: pgmap v18: 65 pgs: 65 active+clean; 300 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.3 MiB/s wr, 143 op/s 2026-03-10T06:24:36.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:35 vm06.local ceph-mon[98962]: osd.0 marked itself down and dead 2026-03-10T06:24:36.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:35 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.100464139 +0000 UTC m=+0.032917774 container create f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.141281131 +0000 UTC m=+0.073734777 container init f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.144045537 +0000 UTC m=+0.076499161 container start f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.144855593 +0000 UTC m=+0.077309228 container attach f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.080412663 +0000 UTC m=+0.012866298 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local conmon[124255]: conmon f853e9ed59a61db680fc : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae.scope/container/memory.events 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.311536116 +0000 UTC m=+0.243989751 container died f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T06:24:36.357 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124245]: 2026-03-10 06:24:36.342743117 +0000 UTC m=+0.275196752 container remove f853e9ed59a61db680fc8622b3d83ecfc4d0186fd41163a8de4f70b4df3643ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service: Deactivated successfully. 2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service: Unit process 124255 (conmon) remains running after unit stopped. 
2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service: Unit process 124263 (podman) remains running after unit stopped. 2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: Stopped Ceph osd.0 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service: Consumed 28.422s CPU time, 443.3M memory peak. 2026-03-10T06:24:36.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local systemd[1]: Starting Ceph osd.0 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:24:37.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:36 vm06.local ceph-mon[98962]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T06:24:37.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:36 vm06.local ceph-mon[98962]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 161 op/s 2026-03-10T06:24:37.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:36 vm04.local ceph-mon[115743]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T06:24:37.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:36 vm04.local ceph-mon[115743]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 161 op/s 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124344]: 2026-03-10 06:24:36.768131555 +0000 UTC m=+0.028579313 container create ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, 
org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3) 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124344]: 2026-03-10 06:24:36.844461007 +0000 UTC m=+0.104908765 container init ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124344]: 2026-03-10 06:24:36.849122595 +0000 UTC m=+0.109570353 container start ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124344]: 2026-03-10 06:24:36.758195143 +0000 UTC m=+0.018642901 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local podman[124344]: 2026-03-10 06:24:36.859293284 +0000 UTC m=+0.119741042 container attach ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, 
org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local bash[124344]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:37.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:36 vm04.local bash[124344]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:38.047 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:37 vm04.local ceph-mon[115743]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local bash[124344]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local bash[124344]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local bash[124344]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local bash[124344]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-823c2c97-cf31-4e97-b429-0adb94ce6442/osd-block-24cf9fc9-b995-47ea-a145-3fd48dc1ed14 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:37 vm04.local bash[124344]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-823c2c97-cf31-4e97-b429-0adb94ce6442/osd-block-24cf9fc9-b995-47ea-a145-3fd48dc1ed14 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/ln -snf /dev/ceph-823c2c97-cf31-4e97-b429-0adb94ce6442/osd-block-24cf9fc9-b995-47ea-a145-3fd48dc1ed14 /var/lib/ceph/osd/ceph-0/block 2026-03-10T06:24:38.047 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T06:24:38.117 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:37 vm06.local ceph-mon[98962]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124344]: Running command: /usr/bin/ln -snf /dev/ceph-823c2c97-cf31-4e97-b429-0adb94ce6442/osd-block-24cf9fc9-b995-47ea-a145-3fd48dc1ed14 /var/lib/ceph/osd/ceph-0/block 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124344]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124344]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124344]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate[124356]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124344]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local conmon[124356]: conmon ec5bbd3a8a30e73b0f94 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b.scope/container/memory.events 
2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124344]: 2026-03-10 06:24:38.071183932 +0000 UTC m=+1.331631690 container died ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T06:24:38.378 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124344]: 2026-03-10 06:24:38.099314373 +0000 UTC m=+1.359762121 container remove ec5bbd3a8a30e73b0f9407f9c24c7667f87d27716e02c0569615fc549b1cc18b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-activate, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124602]: 2026-03-10 06:24:38.377838994 +0000 UTC m=+0.059500338 container create df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True) 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124602]: 2026-03-10 06:24:38.429115737 +0000 UTC m=+0.110777091 container init df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124602]: 2026-03-10 06:24:38.431939083 +0000 UTC m=+0.113600427 container start df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local bash[124602]: df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local podman[124602]: 2026-03-10 06:24:38.352178537 +0000 UTC m=+0.033839881 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:24:38.682 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:38 vm04.local systemd[1]: Started Ceph osd.0 for 9c59102a-1c48-11f1-b618-035af535377d. 
2026-03-10T06:24:39.312 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:39 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:24:39.046+0000 7f66a08ab740 -1 Falling back to public interface 2026-03-10T06:24:39.566 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:39 vm04.local ceph-mon[115743]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 201 op/s 2026-03-10T06:24:39.566 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:39.566 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:39.566 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:39 vm06.local ceph-mon[98962]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 201 op/s 2026-03-10T06:24:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:41.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:41 vm06.local ceph-mon[98962]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 832 KiB/s rd, 792 KiB/s wr, 71 op/s 2026-03-10T06:24:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:41 vm04.local ceph-mon[115743]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 296 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 832 KiB/s rd, 792 KiB/s wr, 71 op/s 2026-03-10T06:24:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:43.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:42 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:43.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:42 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:43.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:42 vm04.local ceph-mon[115743]: pgmap v24: 65 pgs: 2 active+undersized, 32 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 134 op/s; 1345/8964 objects degraded (15.004%) 2026-03-10T06:24:43.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:42 
vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:43.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:42 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:43.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:42 vm06.local ceph-mon[98962]: pgmap v24: 65 pgs: 2 active+undersized, 32 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 134 op/s; 1345/8964 objects degraded (15.004%) 2026-03-10T06:24:43.677 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:24:43.265+0000 7f66a08ab740 -1 osd.0 0 read_superblock omap replica is missing. 2026-03-10T06:24:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 1345/8964 objects degraded (15.004%), 32 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:44.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:43 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T06:24:44.178 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:44 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:24:44.029+0000 7f66a08ab740 -1 osd.0 43 log_to_monitors true 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 1345/8964 objects degraded (15.004%), 32 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T06:24:44.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:24:44.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:44.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:44.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:43 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T06:24:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:44 vm06.local ceph-mon[98962]: pgmap v25: 65 pgs: 2 active+undersized, 32 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 730 KiB/s rd, 851 KiB/s wr, 61 op/s; 1345/8964 objects degraded (15.004%) 2026-03-10T06:24:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:44 vm06.local ceph-mon[98962]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T06:24:45.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:44 vm04.local ceph-mon[115743]: pgmap v25: 65 pgs: 2 active+undersized, 32 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 730 KiB/s rd, 851 KiB/s wr, 61 op/s; 1345/8964 objects degraded (15.004%) 2026-03-10T06:24:45.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:44 vm04.local ceph-mon[115743]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T06:24:46.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:45 vm06.local ceph-mon[98962]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T06:24:46.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:45 vm06.local ceph-mon[98962]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T06:24:46.367 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:45 vm06.local ceph-mon[98962]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:24:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:45 vm04.local ceph-mon[115743]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T06:24:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:45 vm04.local ceph-mon[115743]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T06:24:46.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:45 vm04.local ceph-mon[115743]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:24:46.927 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:24:46 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:24:46.507+0000 7f6697e44640 -1 osd.0 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:24:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: pgmap v27: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.4 MiB/s wr, 138 op/s; 1292/8565 objects degraded (15.085%) 2026-03-10T06:24:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' 2026-03-10T06:24:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local 
ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:24:48.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057] boot 2026-03-10T06:24:48.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T06:24:48.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T06:24:48.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:47 vm06.local ceph-mon[98962]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: pgmap v27: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.4 MiB/s wr, 138 op/s; 1292/8565 objects degraded (15.085%) 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: from='osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057]' entity='osd.0' 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: osd.0 [v2:192.168.123.104:6802/1771959057,v1:192.168.123.104:6803/1771959057] boot 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd 
metadata", "id": 0}]: dispatch 2026-03-10T06:24:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:47 vm04.local ceph-mon[115743]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T06:24:49.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:49 vm06.local ceph-mon[98962]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.0 MiB/s wr, 128 op/s; 1292/8565 objects degraded (15.085%) 2026-03-10T06:24:49.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:49 vm06.local ceph-mon[98962]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T06:24:49.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:49.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:49 vm04.local ceph-mon[115743]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.0 MiB/s wr, 128 op/s; 1292/8565 objects degraded (15.085%) 2026-03-10T06:24:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:49 vm04.local ceph-mon[115743]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T06:24:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:24:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:24:50.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:24:50 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 754/8355 objects degraded (9.025%), 25 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:50.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:50 vm06.local ceph-mon[98962]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T06:24:50.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:50 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 754/8355 objects degraded (9.025%), 25 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:50.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:50 vm04.local ceph-mon[115743]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T06:24:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:51 vm06.local ceph-mon[98962]: pgmap v32: 65 pgs: 25 active+undersized+degraded, 40 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 166 op/s; 754/8355 objects degraded (9.025%); 0 B/s, 10 keys/s, 23 objects/s recovering 2026-03-10T06:24:51.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:51 vm04.local ceph-mon[115743]: pgmap v32: 65 pgs: 25 active+undersized+degraded, 40 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 166 op/s; 754/8355 objects degraded (9.025%); 0 B/s, 10 keys/s, 23 objects/s recovering 2026-03-10T06:24:52.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:52 vm04.local ceph-mon[115743]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T06:24:52.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:52 vm06.local ceph-mon[98962]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T06:24:53.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:53 vm06.local ceph-mon[98962]: pgmap v35: 65 pgs: 7 active+recovery_wait+degraded, 2 peering, 1 active+recovering, 55 active+clean; 296 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1004 KiB/s rd, 1.3 MiB/s wr, 182 op/s; 105/7170 objects degraded (1.464%); 1.7 MiB/s, 22 keys/s, 
32 objects/s recovering 2026-03-10T06:24:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:53 vm04.local ceph-mon[115743]: pgmap v35: 65 pgs: 7 active+recovery_wait+degraded, 2 peering, 1 active+recovering, 55 active+clean; 296 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1004 KiB/s rd, 1.3 MiB/s wr, 182 op/s; 105/7170 objects degraded (1.464%); 1.7 MiB/s, 22 keys/s, 32 objects/s recovering 2026-03-10T06:24:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:55 vm06.local ceph-mon[98962]: pgmap v36: 65 pgs: 7 active+recovery_wait+degraded, 2 peering, 1 active+recovering, 55 active+clean; 296 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 814 KiB/s rd, 1.1 MiB/s wr, 148 op/s; 105/7170 objects degraded (1.464%); 1.4 MiB/s, 17 keys/s, 26 objects/s recovering 2026-03-10T06:24:55.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:55 vm04.local ceph-mon[115743]: pgmap v36: 65 pgs: 7 active+recovery_wait+degraded, 2 peering, 1 active+recovering, 55 active+clean; 296 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 814 KiB/s rd, 1.1 MiB/s wr, 148 op/s; 105/7170 objects degraded (1.464%); 1.4 MiB/s, 17 keys/s, 26 objects/s recovering 2026-03-10T06:24:56.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:56 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 80/5622 objects degraded (1.423%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:56.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:56 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 80/5622 objects degraded (1.423%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T06:24:57.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:57 vm06.local ceph-mon[98962]: pgmap v37: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 235 op/s; 80/5622 objects degraded (1.423%); 1.1 MiB/s, 13 keys/s, 25 objects/s recovering 
2026-03-10T06:24:57.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:57 vm04.local ceph-mon[115743]: pgmap v37: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 235 op/s; 80/5622 objects degraded (1.423%); 1.1 MiB/s, 13 keys/s, 25 objects/s recovering 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 -- 192.168.123.104:0/445465204 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac071e40 msgr2=0x7f74ac0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 --2- 192.168.123.104:0/445465204 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac071e40 0x7f74ac0722b0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f74a4008220 tx=0x7f74a4008530 comp rx=0 tx=0).stop 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 -- 192.168.123.104:0/445465204 shutdown_connections 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 --2- 192.168.123.104:0/445465204 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac071e40 0x7f74ac0722b0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 --2- 192.168.123.104:0/445465204 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac10ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 -- 192.168.123.104:0/445465204 >> 192.168.123.104:0/445465204 conn(0x7f74ac06c6c0 
msgr2=0x7f74ac06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 -- 192.168.123.104:0/445465204 shutdown_connections 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.383+0000 7f74ab59e700 1 -- 192.168.123.104:0/445465204 wait complete. 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 Processor -- start 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 -- start start 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac07d3e0 0x7f74ac07d850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74ac081a20 con 0x7f74ac07d3e0 2026-03-10T06:24:58.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.384+0000 7f74ab59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74ac081b90 con 0x7f74ac10c8f0 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:47580/0 (socket says 192.168.123.104:47580) 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 -- 192.168.123.104:0/418004058 learned_addr learned my addr 192.168.123.104:0/418004058 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74a9d9b700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac07d3e0 0x7f74ac07d850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 -- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac07d3e0 msgr2=0x7f74ac07d850 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac07d3e0 0x7f74ac07d850 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.385+0000 7f74aa59c700 1 -- 192.168.123.104:0/418004058 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74a4007ed0 con 0x7f74ac10c8f0 
2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f74aa59c700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f749c00ca30 tx=0x7f749c00cd40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f749c00b960 con 0x7f74ac10c8f0 2026-03-10T06:24:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f74ab59e700 1 -- 192.168.123.104:0/418004058 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74ac081e70 con 0x7f74ac10c8f0 2026-03-10T06:24:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f74ab59e700 1 -- 192.168.123.104:0/418004058 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74ac0823c0 con 0x7f74ac10c8f0 2026-03-10T06:24:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f749c015420 con 0x7f74ac10c8f0 2026-03-10T06:24:58.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.386+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f749c014570 con 0x7f74ac10c8f0 2026-03-10T06:24:58.388 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.388+0000 7f74ab59e700 1 -- 192.168.123.104:0/418004058 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f74ac04f2a0 con 0x7f74ac10c8f0 2026-03-10T06:24:58.388 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.388+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f749c015a70 con 0x7f74ac10c8f0 2026-03-10T06:24:58.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.388+0000 7f749b7fe700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 0x7f7494079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.388+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f749c0999b0 con 0x7f74ac10c8f0 2026-03-10T06:24:58.389 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.389+0000 7f74a9d9b700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 0x7f7494079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.390+0000 7f74a9d9b700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 0x7f7494079ef0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f74a40081f0 tx=0x7f74a401d040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.391+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f749c062010 con 0x7f74ac10c8f0 2026-03-10T06:24:58.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.577+0000 7f74ab59e700 1 -- 192.168.123.104:0/418004058 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f74ac0826a0 con 0x7f7494077a40 2026-03-10T06:24:58.578 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:58 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:58.583 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.581+0000 7f749b7fe700 1 -- 192.168.123.104:0/418004058 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f74ac0826a0 con 0x7f7494077a40 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 msgr2=0x7f7494079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 0x7f7494079ef0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f74a40081f0 tx=0x7f74a401d040 comp rx=0 tx=0).stop 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 192.168.123.104:0/418004058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 msgr2=0x7f74ac07cea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.584 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f749c00ca30 tx=0x7f749c00cd40 comp rx=0 tx=0).stop 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 192.168.123.104:0/418004058 shutdown_connections 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7494077a40 0x7f7494079ef0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74ac10c8f0 0x7f74ac07cea0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.584 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 --2- 192.168.123.104:0/418004058 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74ac07d3e0 0x7f74ac07d850 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 192.168.123.104:0/418004058 >> 192.168.123.104:0/418004058 conn(0x7f74ac06c6c0 msgr2=0x7f74ac0700f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:58.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 192.168.123.104:0/418004058 shutdown_connections 2026-03-10T06:24:58.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.584+0000 7f74997fa700 1 -- 
192.168.123.104:0/418004058 wait complete. 2026-03-10T06:24:58.601 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- 192.168.123.104:0/1245623361 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1714071e40 msgr2=0x7f17140722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 --2- 192.168.123.104:0/1245623361 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1714071e40 0x7f17140722b0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f170c00d3f0 tx=0x7f170c00d700 comp rx=0 tx=0).stop 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- 192.168.123.104:0/1245623361 shutdown_connections 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 --2- 192.168.123.104:0/1245623361 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1714071e40 0x7f17140722b0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 --2- 192.168.123.104:0/1245623361 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f171410c8b0 0x7f171410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.698 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- 192.168.123.104:0/1245623361 >> 192.168.123.104:0/1245623361 conn(0x7f171406c6c0 msgr2=0x7f171406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- 192.168.123.104:0/1245623361 shutdown_connections 2026-03-10T06:24:58.699 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- 192.168.123.104:0/1245623361 wait complete. 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 Processor -- start 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 -- start start 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f171410c8b0 0x7f17141328e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.698+0000 7f171c21d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f171c21d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f171407ee70 con 0x7f171410c8b0 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f171c21d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f171407efe0 con 0x7f1714132e20 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:47606/0 (socket says 192.168.123.104:47606) 2026-03-10T06:24:58.699 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 -- 192.168.123.104:0/3244535435 learned_addr learned my addr 192.168.123.104:0/3244535435 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:58.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f1719fb9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f171410c8b0 0x7f17141328e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 -- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f171410c8b0 msgr2=0x7f17141328e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f171410c8b0 0x7f17141328e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.699+0000 7f17197b8700 1 -- 192.168.123.104:0/3244535435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f170c007ed0 con 0x7f1714132e20 2026-03-10T06:24:58.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.700+0000 7f17197b8700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto 
rx=0x7f170c003c60 tx=0x7f170c003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.701 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.701+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f170c01c070 con 0x7f1714132e20 2026-03-10T06:24:58.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.701+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f170c00fcf0 con 0x7f1714132e20 2026-03-10T06:24:58.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.701+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f170c017d40 con 0x7f1714132e20 2026-03-10T06:24:58.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.701+0000 7f171c21d700 1 -- 192.168.123.104:0/3244535435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f171407f200 con 0x7f1714132e20 2026-03-10T06:24:58.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.701+0000 7f171c21d700 1 -- 192.168.123.104:0/3244535435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f171407f6f0 con 0x7f1714132e20 2026-03-10T06:24:58.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.702+0000 7f171c21d700 1 -- 192.168.123.104:0/3244535435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f16f8005320 con 0x7f1714132e20 2026-03-10T06:24:58.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.703+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7f170c017420 con 0x7f1714132e20 2026-03-10T06:24:58.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.703+0000 7f170affd700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 0x7f1700079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.703+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f170c013070 con 0x7f1714132e20 2026-03-10T06:24:58.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.704+0000 7f1719fb9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 0x7f1700079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.704+0000 7f1719fb9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 0x7f1700079fb0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f1710005fd0 tx=0x7f171000a560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.710 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.706+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f170c063c80 con 0x7f1714132e20 2026-03-10T06:24:58.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:58 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd 
ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:58.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.868+0000 7f171c21d700 1 -- 192.168.123.104:0/3244535435 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f16f8000bf0 con 0x7f1700077b00 2026-03-10T06:24:58.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.872+0000 7f170affd700 1 -- 192.168.123.104:0/3244535435 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f16f8000bf0 con 0x7f1700077b00 2026-03-10T06:24:58.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 msgr2=0x7f1700079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 0x7f1700079fb0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f1710005fd0 tx=0x7f171000a560 comp rx=0 tx=0).stop 2026-03-10T06:24:58.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 msgr2=0x7f1714133290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f170c003c60 tx=0x7f170c003d40 comp rx=0 tx=0).stop 
2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 shutdown_connections 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1700077b00 0x7f1700079fb0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f171410c8b0 0x7f17141328e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 --2- 192.168.123.104:0/3244535435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1714132e20 0x7f1714133290 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 >> 192.168.123.104:0/3244535435 conn(0x7f171406c6c0 msgr2=0x7f1714070070 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 shutdown_connections 2026-03-10T06:24:58.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.877+0000 7f1708ff9700 1 -- 192.168.123.104:0/3244535435 wait complete. 
2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.980+0000 7fa07f59e700 1 -- 192.168.123.104:0/539963440 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa080071b60 msgr2=0x7fa080071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.980+0000 7fa07f59e700 1 --2- 192.168.123.104:0/539963440 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa080071b60 0x7fa080071fd0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa07801c580 tx=0x7fa07801c890 comp rx=0 tx=0).stop 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- 192.168.123.104:0/539963440 shutdown_connections 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 --2- 192.168.123.104:0/539963440 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa080071b60 0x7fa080071fd0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 --2- 192.168.123.104:0/539963440 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08010e9e0 0x7fa08010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- 192.168.123.104:0/539963440 >> 192.168.123.104:0/539963440 conn(0x7fa08006c6c0 msgr2=0x7fa08006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- 192.168.123.104:0/539963440 shutdown_connections 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- 192.168.123.104:0/539963440 wait 
complete. 2026-03-10T06:24:58.981 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 Processor -- start 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- start start 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 0x7fa08019c5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa08019d120 con 0x7fa08010e9e0 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.981+0000 7fa07f59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa08019d290 con 0x7fa08019cb10 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 0x7fa08019c5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:47612/0 (socket says 192.168.123.104:47612) 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 -- 192.168.123.104:0/497935544 learned_addr learned my addr 192.168.123.104:0/497935544 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:58.982 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 -- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 msgr2=0x7fa08019c5d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 0x7fa08019c5d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07dd9b700 1 -- 192.168.123.104:0/497935544 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa07801c060 con 0x7fa08019cb10 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.982+0000 7fa07e59c700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 0x7fa08019c5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.983+0000 7fa07dd9b700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fa078009750 tx=0x7fa078007860 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.983+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa078003dc0 con 0x7fa08019cb10 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.983+0000 7fa07f59e700 1 -- 192.168.123.104:0/497935544 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0801a1580 con 0x7fa08019cb10 2026-03-10T06:24:58.983 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.983+0000 7fa07f59e700 1 -- 192.168.123.104:0/497935544 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0801a1ad0 con 0x7fa08019cb10 2026-03-10T06:24:58.984 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.983+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa07801f030 con 0x7fa08019cb10 2026-03-10T06:24:58.984 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.984+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa07802d9d0 con 0x7fa08019cb10 2026-03-10T06:24:58.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.987+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa078007c90 con 
0x7fa08019cb10 2026-03-10T06:24:58.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.988+0000 7fa06f7fe700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 0x7fa068079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:58.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.988+0000 7fa07e59c700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 0x7fa068079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:58.988 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.988+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fa0780ac0c0 con 0x7fa08019cb10 2026-03-10T06:24:58.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.984+0000 7fa07f59e700 1 -- 192.168.123.104:0/497935544 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa060005320 con 0x7fa08019cb10 2026-03-10T06:24:58.989 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.989+0000 7fa07e59c700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 0x7fa068079f00 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fa070006fd0 tx=0x7fa070009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:58.992 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:58.992+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7fa0780747e0 con 0x7fa08019cb10 2026-03-10T06:24:59.138 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.134+0000 7fa07f59e700 1 -- 192.168.123.104:0/497935544 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa060000bf0 con 0x7fa068077a50 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.145+0000 7fa06f7fe700 1 -- 192.168.123.104:0/497935544 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fa060000bf0 con 0x7fa068077a50 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (106s) 17s ago 7m 22.1M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (7m) 17s ago 7m 8770k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (6m) 29s ago 6m 11.3M - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:24:59.145 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (33s) 17s ago 7m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (31s) 29s ago 6m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (89s) 17s ago 7m 85.2M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (5m) 17s ago 5m 243M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:24:59.146 
INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (5m) 17s ago 5m 17.7M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (5m) 29s ago 5m 17.7M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (5m) 29s ago 5m 268M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (2m) 17s ago 8m 607M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (2m) 29s ago 6m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (62s) 17s ago 8m 58.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (47s) 29s ago 6m 50.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (2m) 17s ago 7m 9.94M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 29s ago 6m 9416k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (20s) 17s ago 6m 29.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (6m) 17s ago 6m 313M 4096M 18.2.0 dc2bc1663786 ddcaf1636c42 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (6m) 17s ago 6m 270M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 
running (5m) 29s ago 5m 385M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (5m) 29s ago 5m 322M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (5m) 29s ago 5m 294M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:24:59.146 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (117s) 17s ago 7m 52.2M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:24:59.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 msgr2=0x7fa068079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 0x7fa068079f00 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fa070006fd0 tx=0x7fa070009380 comp rx=0 tx=0).stop 2026-03-10T06:24:59.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 msgr2=0x7fa0801a0f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fa078009750 tx=0x7fa078007860 comp rx=0 tx=0).stop 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 shutdown_connections 
2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa068077a50 0x7fa068079f00 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa08010e9e0 0x7fa08019c5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 --2- 192.168.123.104:0/497935544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa08019cb10 0x7fa0801a0f80 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 >> 192.168.123.104:0/497935544 conn(0x7fa08006c6c0 msgr2=0x7fa08006cce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 shutdown_connections 2026-03-10T06:24:59.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.149+0000 7fa06d7fa700 1 -- 192.168.123.104:0/497935544 wait complete. 
2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 -- 192.168.123.104:0/2870298897 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8918071e40 msgr2=0x7f89180722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 --2- 192.168.123.104:0/2870298897 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8918071e40 0x7f89180722b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f891000d3f0 tx=0x7f891000d700 comp rx=0 tx=0).stop 2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 -- 192.168.123.104:0/2870298897 shutdown_connections 2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 --2- 192.168.123.104:0/2870298897 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8918071e40 0x7f89180722b0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 --2- 192.168.123.104:0/2870298897 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f891810c8b0 0x7f891810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.228+0000 7f891e5fe700 1 -- 192.168.123.104:0/2870298897 >> 192.168.123.104:0/2870298897 conn(0x7f891806c6c0 msgr2=0x7f891806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 -- 192.168.123.104:0/2870298897 shutdown_connections 2026-03-10T06:24:59.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 -- 192.168.123.104:0/2870298897 
wait complete. 2026-03-10T06:24:59.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 Processor -- start 2026-03-10T06:24:59.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 -- start start 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f891810c8b0 0x7f89181327a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f891807eef0 con 0x7f8918132ce0 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f891e5fe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f891807f060 con 0x7f891810c8b0 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.229+0000 7f89177fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f89177fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:55560/0 (socket says 192.168.123.104:55560) 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f89177fe700 1 -- 192.168.123.104:0/3429354271 learned_addr learned my addr 192.168.123.104:0/3429354271 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f8917fff700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f891810c8b0 0x7f89181327a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f89177fe700 1 -- 192.168.123.104:0/3429354271 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f891810c8b0 msgr2=0x7f89181327a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f89177fe700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f891810c8b0 0x7f89181327a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.230+0000 7f89177fe700 1 -- 192.168.123.104:0/3429354271 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8910007ed0 con 0x7f8918132ce0 2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.231+0000 7f89177fe700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8910003c60 tx=0x7f8910004b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:24:59.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.231+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f891001c070 con 0x7f8918132ce0 2026-03-10T06:24:59.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.231+0000 7f891e5fe700 1 -- 192.168.123.104:0/3429354271 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f891807f280 con 0x7f8918132ce0 2026-03-10T06:24:59.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.231+0000 7f891e5fe700 1 -- 192.168.123.104:0/3429354271 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f891807f770 con 0x7f8918132ce0 2026-03-10T06:24:59.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.232+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f891000deb0 con 0x7f8918132ce0 2026-03-10T06:24:59.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.232+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8910017c20 con 0x7f8918132ce0 2026-03-10T06:24:59.233 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.232+0000 7f891e5fe700 1 -- 192.168.123.104:0/3429354271 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8904005320 con 0x7f8918132ce0 2026-03-10T06:24:59.233 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.233+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8910017420 con 0x7f8918132ce0 2026-03-10T06:24:59.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.233+0000 
7f89157fa700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 0x7f8900079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.233+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f8910013070 con 0x7f8918132ce0 2026-03-10T06:24:59.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.233+0000 7f8917fff700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 0x7f8900079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.234+0000 7f8917fff700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 0x7f8900079ef0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f8908009c80 tx=0x7f8908009400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:59.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.236+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8910063e30 con 0x7f8918132ce0 2026-03-10T06:24:59.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.421+0000 7f891e5fe700 1 -- 192.168.123.104:0/3429354271 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8904005cc0 con 0x7f8918132ce0 2026-03-10T06:24:59.422 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.422+0000 7f89157fa700 1 -- 192.168.123.104:0/3429354271 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f8910063580 con 0x7f8918132ce0 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T06:24:59.423 
INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:24:59.423 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:24:59.426 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.425+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 msgr2=0x7f8900079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.425+0000 7f88feffd700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 0x7f8900079ef0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f8908009c80 tx=0x7f8908009400 comp rx=0 tx=0).stop 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.425+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 msgr2=0x7f8918133150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.425+0000 7f88feffd700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8910003c60 tx=0x7f8910004b40 comp rx=0 tx=0).stop 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 shutdown_connections 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8900077a40 
0x7f8900079ef0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f891810c8b0 0x7f89181327a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 --2- 192.168.123.104:0/3429354271 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8918132ce0 0x7f8918133150 secure :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8910003c60 tx=0x7f8910004b40 comp rx=0 tx=0).stop 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 >> 192.168.123.104:0/3429354271 conn(0x7f891806c6c0 msgr2=0x7f891806ff90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 shutdown_connections 2026-03-10T06:24:59.427 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.426+0000 7f88feffd700 1 -- 192.168.123.104:0/3429354271 wait complete. 
2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- 192.168.123.104:0/376224086 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc10c8b0 msgr2=0x7f3abc10cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 --2- 192.168.123.104:0/376224086 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc10c8b0 0x7f3abc10cc80 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f3ab800bc70 tx=0x7f3ab800bf80 comp rx=0 tx=0).stop 2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- 192.168.123.104:0/376224086 shutdown_connections 2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 --2- 192.168.123.104:0/376224086 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc071e40 0x7f3abc0722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 --2- 192.168.123.104:0/376224086 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc10c8b0 0x7f3abc10cc80 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- 192.168.123.104:0/376224086 >> 192.168.123.104:0/376224086 conn(0x7f3abc06c6c0 msgr2=0x7f3abc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- 192.168.123.104:0/376224086 shutdown_connections 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- 192.168.123.104:0/376224086 wait 
complete. 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 Processor -- start 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- start start 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.524 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc07d3c0 0x7f3abc07d830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3abc081a00 con 0x7f3abc071e40 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.523+0000 7f3ac427b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3abc081b70 con 0x7f3abc07d3c0 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I 
am v2:192.168.123.104:55574/0 (socket says 192.168.123.104:55574) 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 -- 192.168.123.104:0/2070080976 learned_addr learned my addr 192.168.123.104:0/2070080976 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac1816700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc07d3c0 0x7f3abc07d830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 -- 192.168.123.104:0/2070080976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc07d3c0 msgr2=0x7f3abc07d830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc07d3c0 0x7f3abc07d830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.524+0000 7f3ac2017700 1 -- 192.168.123.104:0/2070080976 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ab800b920 con 0x7f3abc071e40 2026-03-10T06:24:59.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.526+0000 7f3ac2017700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f3ab8004520 tx=0x7f3ab8004550 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:24:59.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.526+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ab8010070 con 0x7f3abc071e40 2026-03-10T06:24:59.528 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.526+0000 7f3ac427b700 1 -- 192.168.123.104:0/2070080976 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3abc081df0 con 0x7f3abc071e40 2026-03-10T06:24:59.528 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.526+0000 7f3ac427b700 1 -- 192.168.123.104:0/2070080976 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3abc082280 con 0x7f3abc071e40 2026-03-10T06:24:59.528 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.527+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3ab800dc00 con 0x7f3abc071e40 2026-03-10T06:24:59.528 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.527+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ab8014800 con 0x7f3abc071e40 2026-03-10T06:24:59.528 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.528+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f3ab8003b60 con 0x7f3abc071e40 2026-03-10T06:24:59.529 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.528+0000 7f3ab37fe700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 0x7f3aa8079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.529 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.528+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f3ab809aa60 con 0x7f3abc071e40 2026-03-10T06:24:59.529 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.529+0000 7f3ac427b700 1 -- 192.168.123.104:0/2070080976 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3aa0005320 con 0x7f3abc071e40 2026-03-10T06:24:59.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.530+0000 7f3ac1816700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 0x7f3aa8079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.530+0000 7f3ac1816700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 0x7f3aa8079ef0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f3ab4003eb0 tx=0x7f3ab4008be0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:59.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.533+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3ab80630c0 con 0x7f3abc071e40 2026-03-10T06:24:59.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:59 vm04.local ceph-mon[115743]: pgmap v38: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 220 op/s; 80/5622 
objects degraded (1.423%); 1.0 MiB/s, 6 keys/s, 10 objects/s recovering 2026-03-10T06:24:59.679 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:59 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:59.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:59 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T06:24:59.680 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:24:59 vm04.local ceph-mon[115743]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:24:59.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.739+0000 7f3ac427b700 1 -- 192.168.123.104:0/2070080976 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3aa0005cc0 con 0x7f3abc071e40 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.739+0000 7f3ab37fe700 1 -- 192.168.123.104:0/2070080976 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1944 (secure 0 0 0) 0x7f3ab8062810 con 0x7f3abc071e40 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:e11 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:legacy client 
fscid: 1 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:up 
{0=14508} 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members: 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:59.740 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:24:59.741 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:24:59.741 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:59.741 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 msgr2=0x7f3aa8079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 0x7f3aa8079ef0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f3ab4003eb0 tx=0x7f3ab4008be0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 msgr2=0x7f3abc07ce80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f3ab8004520 tx=0x7f3ab8004550 comp rx=0 tx=0).stop 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 shutdown_connections 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3aa8077a40 0x7f3aa8079ef0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.744 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3abc071e40 0x7f3abc07ce80 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 --2- 192.168.123.104:0/2070080976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3abc07d3c0 0x7f3abc07d830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.743+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 >> 192.168.123.104:0/2070080976 conn(0x7f3abc06c6c0 msgr2=0x7f3abc070890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.744+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 shutdown_connections 2026-03-10T06:24:59.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.744+0000 7f3ab16fa700 1 -- 192.168.123.104:0/2070080976 wait complete. 
2026-03-10T06:24:59.746 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:24:59.854 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 -- 192.168.123.104:0/428203334 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c88071e40 msgr2=0x7f7c880722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.854 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 --2- 192.168.123.104:0/428203334 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c88071e40 0x7f7c880722b0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f7c80009230 tx=0x7f7c80009260 comp rx=0 tx=0).stop 2026-03-10T06:24:59.855 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 -- 192.168.123.104:0/428203334 shutdown_connections 2026-03-10T06:24:59.855 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 --2- 192.168.123.104:0/428203334 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c88071e40 0x7f7c880722b0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.855 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 --2- 192.168.123.104:0/428203334 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8810cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.855 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 -- 192.168.123.104:0/428203334 >> 192.168.123.104:0/428203334 conn(0x7f7c8806c6c0 msgr2=0x7f7c8806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 -- 192.168.123.104:0/428203334 shutdown_connections 2026-03-10T06:24:59.856 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.854+0000 7f7c8fa08700 1 -- 192.168.123.104:0/428203334 wait complete. 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 Processor -- start 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 -- start start 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c8807d370 0x7f7c8807d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c880819b0 con 0x7f7c8810c8b0 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8fa08700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c88081b20 con 0x7f7c8807d370 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8d7a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8d7a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:55594/0 (socket says 192.168.123.104:55594) 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.855+0000 7f7c8d7a4700 1 -- 192.168.123.104:0/1231759941 learned_addr learned my addr 192.168.123.104:0/1231759941 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.856+0000 7f7c8d7a4700 1 -- 192.168.123.104:0/1231759941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c8807d370 msgr2=0x7f7c8807d7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.856+0000 7f7c8d7a4700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c8807d370 0x7f7c8807d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.856+0000 7f7c8d7a4700 1 -- 192.168.123.104:0/1231759941 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c80008ee0 con 0x7f7c8810c8b0 2026-03-10T06:24:59.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.856+0000 7f7c8d7a4700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f7c8400bfd0 tx=0x7f7c84009d70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:59.858 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.857+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c84010040 con 
0x7f7c8810c8b0 2026-03-10T06:24:59.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.858+0000 7f7c8fa08700 1 -- 192.168.123.104:0/1231759941 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c88081e00 con 0x7f7c8810c8b0 2026-03-10T06:24:59.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.858+0000 7f7c8fa08700 1 -- 192.168.123.104:0/1231759941 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c88082350 con 0x7f7c8810c8b0 2026-03-10T06:24:59.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.858+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7c8400ec20 con 0x7f7c8810c8b0 2026-03-10T06:24:59.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.858+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c84014e40 con 0x7f7c8810c8b0 2026-03-10T06:24:59.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.859+0000 7f7c8fa08700 1 -- 192.168.123.104:0/1231759941 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c6c005320 con 0x7f7c8810c8b0 2026-03-10T06:24:59.860 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.860+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f7c84014590 con 0x7f7c8810c8b0 2026-03-10T06:24:59.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.860+0000 7f7c7e7fc700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 0x7f7c74079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T06:24:59.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.861+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f7c84099dd0 con 0x7f7c8810c8b0 2026-03-10T06:24:59.861 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.861+0000 7f7c8cfa3700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 0x7f7c74079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:24:59.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.863+0000 7f7c8cfa3700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 0x7f7c74079fc0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f7c80009230 tx=0x7f7c8000c960 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:24:59.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:24:59.864+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7c840624b0 con 0x7f7c8810c8b0 2026-03-10T06:24:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:59 vm06.local ceph-mon[98962]: pgmap v38: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 295 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 220 op/s; 80/5622 objects degraded (1.423%); 1.0 MiB/s, 6 keys/s, 10 objects/s recovering 2026-03-10T06:24:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:59 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:24:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:59 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T06:24:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:24:59 vm06.local ceph-mon[98962]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:00.034 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.027+0000 7f7c8fa08700 1 -- 192.168.123.104:0/1231759941 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7c6c000bf0 con 0x7f7c74077b10 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.035+0000 7f7c7e7fc700 1 -- 192.168.123.104:0/1231759941 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f7c6c000bf0 con 0x7f7c74077b10 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "mon", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "mgr", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "crash" 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:25:00.036 
INFO:teuthology.orchestra.run.vm04.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:25:00.036 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 msgr2=0x7f7c74079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 0x7f7c74079fc0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f7c80009230 tx=0x7f7c8000c960 comp rx=0 tx=0).stop 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 msgr2=0x7f7c8807ce30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f7c8400bfd0 tx=0x7f7c84009d70 comp rx=0 tx=0).stop 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 shutdown_connections 2026-03-10T06:25:00.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 --2- 192.168.123.104:0/1231759941 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7c74077b10 0x7f7c74079fc0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c8810c8b0 0x7f7c8807ce30 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 --2- 192.168.123.104:0/1231759941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c8807d370 0x7f7c8807d7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 >> 192.168.123.104:0/1231759941 conn(0x7f7c8806c6c0 msgr2=0x7f7c88070840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:00.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 shutdown_connections 2026-03-10T06:25:00.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.041+0000 7f7c73fff700 1 -- 192.168.123.104:0/1231759941 wait complete. 
2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 -- 192.168.123.104:0/952945299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0071b60 msgr2=0x7f71c0071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 --2- 192.168.123.104:0/952945299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0071b60 0x7f71c0071fd0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f71b800b3a0 tx=0x7f71b800b6b0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 -- 192.168.123.104:0/952945299 shutdown_connections 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 --2- 192.168.123.104:0/952945299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0071b60 0x7f71c0071fd0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 --2- 192.168.123.104:0/952945299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 -- 192.168.123.104:0/952945299 >> 192.168.123.104:0/952945299 conn(0x7f71c006c6c0 msgr2=0x7f71c006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 -- 192.168.123.104:0/952945299 shutdown_connections 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.155+0000 7f71bf59e700 1 -- 192.168.123.104:0/952945299 wait 
complete. 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 Processor -- start 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 -- start start 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0114720 0x7f71c0114b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71c0115160 con 0x7f71c0114720 2026-03-10T06:25:00.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.156+0000 7f71bf59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71c01152d0 con 0x7f71c010e9e0 2026-03-10T06:25:00.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:00.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I 
am v2:192.168.123.104:47672/0 (socket says 192.168.123.104:47672) 2026-03-10T06:25:00.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 -- 192.168.123.104:0/3045821827 learned_addr learned my addr 192.168.123.104:0/3045821827 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:00.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71bdd9b700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0114720 0x7f71c0114b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:00.158 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 -- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0114720 msgr2=0x7f71c0114b90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.158 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0114720 0x7f71c0114b90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.158 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.157+0000 7f71be59c700 1 -- 192.168.123.104:0/3045821827 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f71b800b050 con 0x7f71c010e9e0 2026-03-10T06:25:00.158 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.158+0000 7f71be59c700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f71b000c8a0 tx=0x7f71b000cbb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:25:00.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.159+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71b00078c0 con 0x7f71c010e9e0 2026-03-10T06:25:00.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.159+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f71b000f450 con 0x7f71c010e9e0 2026-03-10T06:25:00.159 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.159+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71b000e5c0 con 0x7f71c010e9e0 2026-03-10T06:25:00.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.159+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f71c01b7a80 con 0x7f71c010e9e0 2026-03-10T06:25:00.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.159+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f71c01b7de0 con 0x7f71c010e9e0 2026-03-10T06:25:00.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.160+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f71c004f2a0 con 0x7f71c010e9e0 2026-03-10T06:25:00.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.160+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f71b001e020 con 0x7f71c010e9e0 2026-03-10T06:25:00.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.161+0000 
7f71af7fe700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 0x7f71a8079c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:00.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.161+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f71b0099380 con 0x7f71c010e9e0 2026-03-10T06:25:00.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.162+0000 7f71bdd9b700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 0x7f71a8079c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:00.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.163+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f71b0061a60 con 0x7f71c010e9e0 2026-03-10T06:25:00.181 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.180+0000 7f71bdd9b700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 0x7f71a8079c80 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f71b800bb30 tx=0x7f71b8014040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:00.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.358+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f71c004ea50 con 0x7f71c010e9e0 2026-03-10T06:25:00.370 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.370+0000 7f71af7fe700 1 -- 192.168.123.104:0/3045821827 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+677 (secure 0 0 0) 0x7f71b00611b0 con 0x7f71c010e9e0 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 80/5622 objects degraded (1.423%), 5 pgs degraded 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 80/5622 objects degraded (1.423%), 5 pgs degraded 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: pg 3.f is active+recovery_wait+degraded, acting [5,3,0] 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1] 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-10T06:25:00.371 INFO:teuthology.orchestra.run.vm04.stdout: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1] 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.373+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 msgr2=0x7f71a8079c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.373+0000 7f71bf59e700 1 --2- 
192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 0x7f71a8079c80 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f71b800bb30 tx=0x7f71b8014040 comp rx=0 tx=0).stop 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.373+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 msgr2=0x7f71c0119680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.373+0000 7f71bf59e700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f71b000c8a0 tx=0x7f71b000cbb0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 shutdown_connections 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f71a80777d0 0x7f71a8079c80 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f71c010e9e0 0x7f71c0119680 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 --2- 192.168.123.104:0/3045821827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f71c0114720 0x7f71c0114b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 >> 192.168.123.104:0/3045821827 conn(0x7f71c006c6c0 msgr2=0x7f71c006f680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 shutdown_connections 2026-03-10T06:25:00.374 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:00.374+0000 7f71bf59e700 1 -- 192.168.123.104:0/3045821827 wait complete. 2026-03-10T06:25:00.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:00 vm04.local ceph-mon[115743]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:00.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:00 vm04.local ceph-mon[115743]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:00.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:00 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/3429354271' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:00.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:00 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2070080976' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:25:00.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:00 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/3045821827' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:25:00.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:00 vm06.local ceph-mon[98962]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:00.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:00 vm06.local ceph-mon[98962]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:00.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:00 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3429354271' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:00.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:00 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2070080976' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:25:00.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:00 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3045821827' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:25:01.304 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
2026-03-10T06:25:01.304 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-10T06:25:01.704 DEBUG:teuthology.parallel:result is None 2026-03-10T06:25:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:01 vm06.local ceph-mon[98962]: pgmap v39: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 292 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 195 op/s; 80/5208 objects degraded (1.536%); 872 KiB/s, 5 keys/s, 8 objects/s recovering 2026-03-10T06:25:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:01 vm06.local ceph-mon[98962]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:02.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:01 vm04.local ceph-mon[115743]: pgmap v39: 65 pgs: 5 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 292 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 195 op/s; 80/5208 objects degraded (1.536%); 872 KiB/s, 5 keys/s, 8 objects/s recovering 2026-03-10T06:25:02.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:01 vm04.local ceph-mon[115743]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:03.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:02 vm06.local ceph-mon[98962]: pgmap v40: 65 pgs: 3 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.4 MiB/s wr, 198 op/s; 40/3519 objects degraded (1.137%); 0 B/s, 7 objects/s recovering 2026-03-10T06:25:03.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:02 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 80/5208 objects degraded (1.536%), 5 pgs degraded 
(PG_DEGRADED) 2026-03-10T06:25:03.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:02 vm04.local ceph-mon[115743]: pgmap v40: 65 pgs: 3 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.4 MiB/s wr, 198 op/s; 40/3519 objects degraded (1.137%); 0 B/s, 7 objects/s recovering 2026-03-10T06:25:03.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:02 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 80/5208 objects degraded (1.536%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:04.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:05 vm06.local ceph-mon[98962]: pgmap v41: 65 pgs: 3 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.2 MiB/s wr, 172 op/s; 40/3519 objects degraded (1.137%); 0 B/s, 6 objects/s recovering 2026-03-10T06:25:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:05 vm04.local ceph-mon[115743]: pgmap v41: 65 pgs: 3 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.2 MiB/s wr, 172 op/s; 40/3519 objects degraded (1.137%); 0 B/s, 6 objects/s recovering 2026-03-10T06:25:07.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:06 vm04.local ceph-mon[115743]: pgmap v42: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 
GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 252 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T06:25:07.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:06 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/3519 objects degraded (1.137%), 3 pgs degraded) 2026-03-10T06:25:07.221 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 2026-03-10T06:25:07.221 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-10T06:25:07.339 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:06 vm06.local ceph-mon[98962]: pgmap v42: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.7 MiB/s wr, 252 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T06:25:07.339 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:06 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/3519 objects degraded (1.137%), 3 pgs degraded) 2026-03-10T06:25:07.645 DEBUG:teuthology.parallel:result is None 2026-03-10T06:25:07.646 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T06:25:07.710 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T06:25:07.710 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T06:25:07.748 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T06:25:07.748 DEBUG:teuthology.parallel:result is None 2026-03-10T06:25:09.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:09 vm06.local ceph-mon[98962]: pgmap v43: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 921 KiB/s rd, 1.0 MiB/s wr, 173 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:09 
vm04.local ceph-mon[115743]: pgmap v43: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 921 KiB/s rd, 1.0 MiB/s wr, 173 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:11.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:11 vm06.local ceph-mon[98962]: pgmap v44: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1000 KiB/s rd, 1.1 MiB/s wr, 186 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:11.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:11 vm04.local ceph-mon[115743]: pgmap v44: 65 pgs: 1 active+recovering, 64 active+clean; 266 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 1000 KiB/s rd, 1.1 MiB/s wr, 186 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:13.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:13 vm04.local ceph-mon[115743]: pgmap v45: 65 pgs: 65 active+clean; 256 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 225 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:13 vm06.local ceph-mon[98962]: pgmap v45: 65 pgs: 65 active+clean; 256 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 225 op/s; 0 B/s, 6 objects/s recovering 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: Upgrade: osd.1 is safe to restart 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: Upgrade: Updating osd.1 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:14.329 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-mon[115743]: Deploying daemon osd.1 on vm04 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: Upgrade: osd.1 is safe to restart 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: Upgrade: Updating osd.1 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:14.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:14 vm06.local ceph-mon[98962]: Deploying daemon osd.1 on vm04 2026-03-10T06:25:14.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:14 vm04.local systemd[1]: Stopping Ceph osd.1 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:25:14.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 2026-03-10T06:25:14.398+0000 7f68dd835700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:25:14.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 2026-03-10T06:25:14.398+0000 7f68dd835700 -1 osd.1 51 *** Got signal Terminated *** 2026-03-10T06:25:14.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:14 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[74271]: 2026-03-10T06:25:14.398+0000 7f68dd835700 -1 osd.1 51 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:25:15.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:15 vm04.local ceph-mon[115743]: pgmap v46: 65 pgs: 65 active+clean; 256 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 666 KiB/s rd, 695 KiB/s wr, 143 op/s; 0 B/s, 3 objects/s recovering 2026-03-10T06:25:15.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:15 vm04.local ceph-mon[115743]: osd.1 marked itself down and dead 2026-03-10T06:25:15.410 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129088]: 2026-03-10 06:25:15.205868173 +0000 UTC m=+0.822266387 container died ddcaf1636c428ea805ba1188a8ae4e2cc98c5f5ca929928cad29434a285b26c2 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, ceph=True, RELEASE=HEAD, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, GIT_CLEAN=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2) 2026-03-10T06:25:15.410 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129088]: 2026-03-10 06:25:15.240438679 +0000 UTC m=+0.856836893 container remove ddcaf1636c428ea805ba1188a8ae4e2cc98c5f5ca929928cad29434a285b26c2 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.build-date=20231212, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0) 2026-03-10T06:25:15.410 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local bash[129088]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1 2026-03-10T06:25:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:15 vm06.local ceph-mon[98962]: pgmap v46: 65 pgs: 65 active+clean; 256 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 666 KiB/s rd, 695 KiB/s wr, 143 op/s; 0 B/s, 3 objects/s recovering 2026-03-10T06:25:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:15 vm06.local ceph-mon[98962]: osd.1 marked itself down and dead 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.409499395 +0000 UTC m=+0.025949785 container create e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team 
, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.450044355 +0000 UTC m=+0.066494745 container init e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.456228924 +0000 UTC m=+0.072679315 container start e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.4572947 +0000 UTC m=+0.073745090 container attach e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.397416382 +0000 UTC m=+0.013866772 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.58141746 +0000 UTC m=+0.197867850 container died e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129158]: 2026-03-10 06:25:15.616179474 +0000 UTC m=+0.232629864 container remove e1adffaf668334a0762122aa3ebb4e49735936c9d9308b7937e38fe794d07657 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, 
org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.1.service: Deactivated successfully. 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local systemd[1]: Stopped Ceph osd.1 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:25:15.678 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.1.service: Consumed 36.096s CPU time. 2026-03-10T06:25:15.984 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local systemd[1]: Starting Ceph osd.1 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:25:15.984 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129269]: 2026-03-10 06:25:15.926281646 +0000 UTC m=+0.018649187 container create a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=squid) 2026-03-10T06:25:15.984 
INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129269]: 2026-03-10 06:25:15.969902495 +0000 UTC m=+0.062270046 container init a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:25:15.984 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129269]: 2026-03-10 06:25:15.972861445 +0000 UTC m=+0.065228986 container start a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:25:15.984 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:15 vm04.local podman[129269]: 2026-03-10 06:25:15.982749378 +0000 UTC m=+0.075116909 container attach a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-10T06:25:16.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:16.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-mon[115743]: osdmap e52: 6 total, 5 up, 6 in 2026-03-10T06:25:16.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local podman[129269]: 2026-03-10 06:25:15.919006044 +0000 UTC m=+0.011373585 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:16.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T06:25:16.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:16 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:16.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:16 vm06.local ceph-mon[98962]: osdmap e52: 6 total, 5 up, 6 in 2026-03-10T06:25:16.847 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-20259af7-1534-402a-91a5-75002f95fb18/osd-block-38852b8c-2ea4-46c8-a734-cf521893e9b5 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T06:25:16.848 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-20259af7-1534-402a-91a5-75002f95fb18/osd-block-38852b8c-2ea4-46c8-a734-cf521893e9b5 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T06:25:17.160 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-mon[115743]: pgmap v48: 65 pgs: 29 peering, 3 stale+active+clean, 33 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 357 KiB/s rd, 302 KiB/s wr, 77 op/s 2026-03-10T06:25:17.160 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-mon[115743]: osdmap e53: 6 total, 5 up, 6 in 2026-03-10T06:25:17.160 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:17.160 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:17.160 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-mon[115743]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/ln -snf /dev/ceph-20259af7-1534-402a-91a5-75002f95fb18/osd-block-38852b8c-2ea4-46c8-a734-cf521893e9b5 /var/lib/ceph/osd/ceph-1/block 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/ln -snf /dev/ceph-20259af7-1534-402a-91a5-75002f95fb18/osd-block-38852b8c-2ea4-46c8-a734-cf521893e9b5 /var/lib/ceph/osd/ceph-1/block 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate[129281]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local bash[129269]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local conmon[129281]: conmon a2e7974711ead3c59af2 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621.scope/container/memory.events 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local podman[129269]: 2026-03-10 06:25:16.871278657 +0000 UTC m=+0.963646188 container died a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local podman[129269]: 2026-03-10 06:25:16.886961158 +0000 UTC m=+0.979328689 container remove a2e7974711ead3c59af20db3d461bea5056da8d34c9c67e543310e8b97f2c621 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS) 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:16 vm04.local podman[129541]: 2026-03-10 06:25:16.980081604 +0000 UTC m=+0.016379137 container create 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local podman[129541]: 2026-03-10 06:25:17.015996116 +0000 UTC m=+0.052293649 container init 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0) 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local podman[129541]: 2026-03-10 06:25:17.018820714 +0000 UTC m=+0.055118247 container start 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local bash[129541]: 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea 
2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local podman[129541]: 2026-03-10 06:25:16.973999808 +0000 UTC m=+0.010297350 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:17.163 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local systemd[1]: Started Ceph osd.1 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:25:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:17 vm06.local ceph-mon[98962]: pgmap v48: 65 pgs: 29 peering, 3 stale+active+clean, 33 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 357 KiB/s rd, 302 KiB/s wr, 77 op/s 2026-03-10T06:25:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:17 vm06.local ceph-mon[98962]: osdmap e53: 6 total, 5 up, 6 in 2026-03-10T06:25:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:17.805 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:17 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:25:17.596+0000 7f09e2059740 -1 Falling back to public interface 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: pgmap v50: 65 pgs: 29 peering, 3 stale+active+clean, 33 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 327 KiB/s rd, 256 KiB/s wr, 
77 op/s 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:19.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: pgmap v50: 65 pgs: 29 peering, 3 stale+active+clean, 33 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 327 KiB/s rd, 256 KiB/s wr, 77 op/s 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T06:25:21.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:20 vm06.local ceph-mon[98962]: pgmap v51: 65 pgs: 3 active+undersized, 29 peering, 2 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s wr, 1 op/s; 2/261 objects degraded (0.766%) 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:21.127 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:21.128 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:21.128 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T06:25:21.128 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:20 vm04.local ceph-mon[115743]: pgmap v51: 65 pgs: 3 active+undersized, 29 peering, 2 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s wr, 1 op/s; 2/261 objects degraded (0.766%) 2026-03-10T06:25:21.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:21 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:25:21.126+0000 7f09e2059740 -1 osd.1 0 read_superblock omap replica is missing. 
2026-03-10T06:25:21.428 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:21 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:25:21.338+0000 7f09e2059740 -1 osd.1 51 log_to_monitors true 2026-03-10T06:25:22.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:21 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 2/261 objects degraded (0.766%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:22.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:21 vm06.local ceph-mon[98962]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T06:25:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:21 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 2/261 objects degraded (0.766%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:22.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:21 vm04.local ceph-mon[115743]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T06:25:23.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:22 vm06.local ceph-mon[98962]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T06:25:23.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:22 vm06.local ceph-mon[98962]: osdmap e54: 6 total, 5 up, 6 in 2026-03-10T06:25:23.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:22 vm06.local ceph-mon[98962]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", 
"root=default"]}]: dispatch 2026-03-10T06:25:23.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:22 vm06.local ceph-mon[98962]: pgmap v53: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.0 KiB/s wr, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:23.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:22 vm04.local ceph-mon[115743]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T06:25:23.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:22 vm04.local ceph-mon[115743]: osdmap e54: 6 total, 5 up, 6 in 2026-03-10T06:25:23.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:22 vm04.local ceph-mon[115743]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:25:23.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:22 vm04.local ceph-mon[115743]: pgmap v53: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.0 KiB/s wr, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:23.177 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:25:22 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:25:22.812+0000 7f09d95f2640 -1 osd.1 51 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:25:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:23 vm06.local ceph-mon[98962]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' 2026-03-10T06:25:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
10 06:25:23 vm06.local ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:25:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:23 vm06.local ceph-mon[98962]: osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541] boot 2026-03-10T06:25:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:23 vm06.local ceph-mon[98962]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T06:25:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:25:24.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:23 vm04.local ceph-mon[115743]: from='osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541]' entity='osd.1' 2026-03-10T06:25:24.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:23 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:25:24.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:23 vm04.local ceph-mon[115743]: osd.1 [v2:192.168.123.104:6810/2635985541,v1:192.168.123.104:6811/2635985541] boot 2026-03-10T06:25:24.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:23 vm04.local ceph-mon[115743]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T06:25:24.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T06:25:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:24 vm04.local ceph-mon[115743]: osdmap e56: 6 total, 6 up, 6 in 2026-03-10T06:25:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:24 vm04.local ceph-mon[115743]: pgmap v56: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 
B/s rd, 341 B/s wr, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:25.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:24 vm06.local ceph-mon[98962]: osdmap e56: 6 total, 6 up, 6 in 2026-03-10T06:25:25.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:24 vm06.local ceph-mon[98962]: pgmap v56: 65 pgs: 21 active+undersized, 13 active+undersized+degraded, 31 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 341 B/s wr, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:26 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 18/261 objects degraded (6.897%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:26.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:26 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 18/261 objects degraded (6.897%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:27.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:27 vm06.local ceph-mon[98962]: pgmap v57: 65 pgs: 5 active+undersized, 7 active+undersized+degraded, 53 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.7 KiB/s rd, 0 B/s wr, 2 op/s; 18/261 objects degraded (6.897%) 2026-03-10T06:25:27.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:27 vm04.local ceph-mon[115743]: pgmap v57: 65 pgs: 5 active+undersized, 7 active+undersized+degraded, 53 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.7 KiB/s rd, 0 B/s wr, 2 op/s; 18/261 objects degraded (6.897%) 2026-03-10T06:25:28.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:28 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 18/261 objects degraded (6.897%), 7 pgs degraded) 2026-03-10T06:25:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:28 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 
18/261 objects degraded (6.897%), 7 pgs degraded) 2026-03-10T06:25:29.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:29 vm04.local ceph-mon[115743]: pgmap v58: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 828 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering 2026-03-10T06:25:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:29 vm06.local ceph-mon[98962]: pgmap v58: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 828 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering 2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.460+0000 7f04f064f700 1 -- 192.168.123.104:0/1129529259 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 msgr2=0x7f04e8106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.460+0000 7f04f064f700 1 --2- 192.168.123.104:0/1129529259 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e8106930 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f04e4009b50 tx=0x7f04e4009e60 comp rx=0 tx=0).stop 2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.461+0000 7f04f064f700 1 -- 192.168.123.104:0/1129529259 shutdown_connections 2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.461+0000 7f04f064f700 1 --2- 192.168.123.104:0/1129529259 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04e8100540 0x7f04e81009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.461+0000 7f04f064f700 1 --2- 192.168.123.104:0/1129529259 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e8106930 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:25:30.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.461+0000 7f04f064f700 1 -- 192.168.123.104:0/1129529259 >> 192.168.123.104:0/1129529259 conn(0x7f04e80fc000 msgr2=0x7f04e80fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 -- 192.168.123.104:0/1129529259 shutdown_connections 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 -- 192.168.123.104:0/1129529259 wait complete. 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 Processor -- start 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 -- start start 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04e8100540 0x7f04e8198d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e81992b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04e8193e00 con 0x7f04e8106560 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.462+0000 7f04f064f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04e8193f70 con 0x7f04e8100540 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e81992b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e81992b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:56516/0 (socket says 192.168.123.104:56516) 2026-03-10T06:25:30.463 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 -- 192.168.123.104:0/2976787888 learned_addr learned my addr 192.168.123.104:0/2976787888 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 -- 192.168.123.104:0/2976787888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04e8100540 msgr2=0x7f04e8198d50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04e8100540 0x7f04e8198d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 -- 192.168.123.104:0/2976787888 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f04e40097e0 con 0x7f04e8106560 2026-03-10T06:25:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.463+0000 7f04edbea700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f04e8106560 0x7f04e81992b0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f04d800b730 tx=0x7f04d800baf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:30.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.464+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f04d8010820 con 0x7f04e8106560 2026-03-10T06:25:30.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.464+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f04e81941d0 con 0x7f04e8106560 2026-03-10T06:25:30.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.465+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f04e8194720 con 0x7f04e8106560 2026-03-10T06:25:30.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.465+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f04d8010e60 con 0x7f04e8106560 2026-03-10T06:25:30.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.465+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f04d8017570 con 0x7f04e8106560 2026-03-10T06:25:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.466+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f04d8017790 con 0x7f04e8106560 2026-03-10T06:25:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.467+0000 7f04df7fe700 1 --2- 192.168.123.104:0/2976787888 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 0x7f04d4079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.467+0000 7f04ee3eb700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 0x7f04d4079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.467+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f04d80988a0 con 0x7f04e8106560 2026-03-10T06:25:30.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.467+0000 7f04ee3eb700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 0x7f04d4079f60 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f04e4009b20 tx=0x7f04e4005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:30.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.467+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f04cc005320 con 0x7f04e8106560 2026-03-10T06:25:30.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.471+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f04d80621e0 con 0x7f04e8106560 2026-03-10T06:25:30.602 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.601+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f04cc000bf0 con 0x7f04d4077ab0 2026-03-10T06:25:30.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.603+0000 7f04df7fe700 1 -- 192.168.123.104:0/2976787888 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f04cc000bf0 con 0x7f04d4077ab0 2026-03-10T06:25:30.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.605+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 msgr2=0x7f04d4079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.605+0000 7f04f064f700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 0x7f04d4079f60 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f04e4009b20 tx=0x7f04e4005fb0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 msgr2=0x7f04e81992b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e81992b0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f04d800b730 tx=0x7f04d800baf0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.607 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 shutdown_connections 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f04d4077ab0 0x7f04d4079f60 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04e8100540 0x7f04e8198d50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 --2- 192.168.123.104:0/2976787888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04e8106560 0x7f04e81992b0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 >> 192.168.123.104:0/2976787888 conn(0x7f04e80fc000 msgr2=0x7f04e80fd800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.606+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 shutdown_connections 2026-03-10T06:25:30.607 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.607+0000 7f04f064f700 1 -- 192.168.123.104:0/2976787888 wait complete. 
2026-03-10T06:25:30.618 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:25:30.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.680+0000 7fdfff56e700 1 -- 192.168.123.104:0/1842037208 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 msgr2=0x7fdff81056e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.680+0000 7fdfff56e700 1 --2- 192.168.123.104:0/1842037208 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff81056e0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fdfec009b50 tx=0x7fdfec009e60 comp rx=0 tx=0).stop 2026-03-10T06:25:30.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 -- 192.168.123.104:0/1842037208 shutdown_connections 2026-03-10T06:25:30.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 --2- 192.168.123.104:0/1842037208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdff8068490 0x7fdff8068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.681 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 --2- 192.168.123.104:0/1842037208 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff81056e0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 -- 192.168.123.104:0/1842037208 >> 192.168.123.104:0/1842037208 conn(0x7fdff8075240 msgr2=0x7fdff8075640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:30.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 -- 192.168.123.104:0/1842037208 shutdown_connections 2026-03-10T06:25:30.682 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.681+0000 7fdfff56e700 1 -- 192.168.123.104:0/1842037208 wait complete. 2026-03-10T06:25:30.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 Processor -- start 2026-03-10T06:25:30.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 -- start start 2026-03-10T06:25:30.682 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdff8068490 0x7fdff80ffd10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdff8100820 con 0x7fdff8105310 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdfff56e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdff8100990 con 0x7fdff8068490 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:56524/0 (socket says 192.168.123.104:56524) 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 -- 192.168.123.104:0/3199806938 learned_addr learned my addr 192.168.123.104:0/3199806938 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 -- 192.168.123.104:0/3199806938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdff8068490 msgr2=0x7fdff80ffd10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdff8068490 0x7fdff80ffd10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.682+0000 7fdffdd6b700 1 -- 192.168.123.104:0/3199806938 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdfec0097e0 con 0x7fdff8105310 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdffdd6b700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fdff400d900 tx=0x7fdff400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdff40041d0 con 
0x7fdff8105310 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdff4004d10 con 0x7fdff8105310 2026-03-10T06:25:30.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdff400b750 con 0x7fdff8105310 2026-03-10T06:25:30.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdff81a2120 con 0x7fdff8105310 2026-03-10T06:25:30.685 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.683+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdff81a2590 con 0x7fdff8105310 2026-03-10T06:25:30.685 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.685+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdff81086b0 con 0x7fdff8105310 2026-03-10T06:25:30.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.685+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fdff4004330 con 0x7fdff8105310 2026-03-10T06:25:30.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.685+0000 7fdfeb7fe700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 0x7fdfe4079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T06:25:30.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.685+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fdff40999a0 con 0x7fdff8105310 2026-03-10T06:25:30.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.688+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdff4062080 con 0x7fdff8105310 2026-03-10T06:25:30.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.688+0000 7fdffe56c700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 0x7fdfe4079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.690 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.689+0000 7fdffe56c700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 0x7fdfe4079e90 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fdfec00b5c0 tx=0x7fdfec005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:30.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.823+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdff81012b0 con 0x7fdfe40779e0 2026-03-10T06:25:30.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.829+0000 7fdfeb7fe700 1 -- 192.168.123.104:0/3199806938 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 
8+0+400 (secure 0 0 0) 0x7fdff81012b0 con 0x7fdfe40779e0 2026-03-10T06:25:30.833 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 msgr2=0x7fdfe4079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 0x7fdfe4079e90 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fdfec00b5c0 tx=0x7fdfec005fd0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 msgr2=0x7fdff8100250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fdff400d900 tx=0x7fdff400dcc0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 shutdown_connections 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fdfe40779e0 0x7fdfe4079e90 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 --2- 
192.168.123.104:0/3199806938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdff8068490 0x7fdff80ffd10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 --2- 192.168.123.104:0/3199806938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdff8105310 0x7fdff8100250 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.833+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 >> 192.168.123.104:0/3199806938 conn(0x7fdff8075240 msgr2=0x7fdff80febc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.834+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 shutdown_connections 2026-03-10T06:25:30.834 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.834+0000 7fdfff56e700 1 -- 192.168.123.104:0/3199806938 wait complete. 
2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 -- 192.168.123.104:0/1848927866 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc530071b60 msgr2=0x7fc530071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 --2- 192.168.123.104:0/1848927866 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc530071b60 0x7fc530071fd0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fc52c00b3a0 tx=0x7fc52c00b6b0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 -- 192.168.123.104:0/1848927866 shutdown_connections 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 --2- 192.168.123.104:0/1848927866 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc530071b60 0x7fc530071fd0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 --2- 192.168.123.104:0/1848927866 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc53010e9e0 0x7fc53010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 -- 192.168.123.104:0/1848927866 >> 192.168.123.104:0/1848927866 conn(0x7fc53006c6c0 msgr2=0x7fc53006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 -- 192.168.123.104:0/1848927866 shutdown_connections 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 -- 192.168.123.104:0/1848927866 
wait complete. 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.912+0000 7fc537668700 1 Processor -- start 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc537668700 1 -- start start 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc537668700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc537668700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc530114590 0x7fc530114a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc537668700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc530114f40 con 0x7fc53010e9e0 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc537668700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5301150b0 con 0x7fc530114590 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:56536/0 (socket says 192.168.123.104:56536) 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 -- 192.168.123.104:0/844164184 learned_addr learned my addr 192.168.123.104:0/844164184 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc535e65700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc530114590 0x7fc530114a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 -- 192.168.123.104:0/844164184 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc530114590 msgr2=0x7fc530114a00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc530114590 0x7fc530114a00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 -- 192.168.123.104:0/844164184 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc52c00b050 con 0x7fc53010e9e0 2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.913+0000 7fc536666700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fc52400d8d0 tx=0x7fc52400dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:25:30.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.914+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc524009880 con 0x7fc53010e9e0 2026-03-10T06:25:30.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.914+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc530115390 con 0x7fc53010e9e0 2026-03-10T06:25:30.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.914+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5301b7cb0 con 0x7fc53010e9e0 2026-03-10T06:25:30.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.915+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc524010460 con 0x7fc53010e9e0 2026-03-10T06:25:30.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.915+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc52400f5d0 con 0x7fc53010e9e0 2026-03-10T06:25:30.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.917+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc514005320 con 0x7fc53010e9e0 2026-03-10T06:25:30.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.918+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc524010ad0 con 0x7fc53010e9e0 2026-03-10T06:25:30.918 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.918+0000 
7fc5237fe700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 0x7fc51c079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:30.919 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.918+0000 7fc535e65700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 0x7fc51c079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:30.919 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.918+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fc524099810 con 0x7fc53010e9e0 2026-03-10T06:25:30.923 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.922+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc524061ef0 con 0x7fc53010e9e0 2026-03-10T06:25:30.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:30.925+0000 7fc535e65700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 0x7fc51c079fc0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc52c00bb30 tx=0x7fc52c00bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:31.059 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.057+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc514000bf0 con 0x7fc51c077b10 
2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.063+0000 7fc5237fe700 1 -- 192.168.123.104:0/844164184 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fc514000bf0 con 0x7fc51c077b10 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (2m) 12s ago 8m 24.2M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (8m) 12s ago 8m 9076k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (7m) 61s ago 7m 11.3M - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (65s) 12s ago 8m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (63s) 61s ago 7m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (2m) 12s ago 7m 88.8M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (5m) 12s ago 5m 180M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (5m) 12s ago 5m 18.0M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (5m) 61s ago 5m 17.7M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (5m) 61s ago 5m 268M - 
18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (3m) 12s ago 8m 608M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (3m) 61s ago 7m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (94s) 12s ago 8m 58.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (79s) 61s ago 7m 50.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (2m) 12s ago 8m 10.1M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 61s ago 7m 9416k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:25:31.064 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (52s) 12s ago 6m 180M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:25:31.065 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (14s) 12s ago 6m 11.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6bc3525fe6f5 2026-03-10T06:25:31.065 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (6m) 12s ago 6m 293M 4096M 18.2.0 dc2bc1663786 e5a533082c80 2026-03-10T06:25:31.065 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (6m) 61s ago 6m 385M 4096M 18.2.0 dc2bc1663786 62400287eca0 2026-03-10T06:25:31.065 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (6m) 61s ago 6m 322M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:25:31.065 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (6m) 61s ago 6m 294M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:25:31.065 
INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (2m) 12s ago 7m 53.0M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 msgr2=0x7fc51c079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 0x7fc51c079fc0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc52c00bb30 tx=0x7fc52c00bf90 comp rx=0 tx=0).stop 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 msgr2=0x7fc530119590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fc52400d8d0 tx=0x7fc52400dbe0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 shutdown_connections 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc51c077b10 0x7fc51c079fc0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.067 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc53010e9e0 0x7fc530119590 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.066+0000 7fc537668700 1 --2- 192.168.123.104:0/844164184 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc530114590 0x7fc530114a00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.067 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.067+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 >> 192.168.123.104:0/844164184 conn(0x7fc53006c6c0 msgr2=0x7fc53006cd20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:31.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.069+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 shutdown_connections 2026-03-10T06:25:31.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.069+0000 7fc537668700 1 -- 192.168.123.104:0/844164184 wait complete. 
2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 -- 192.168.123.104:0/3783560978 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb4071b60 msgr2=0x7f4eb4071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 --2- 192.168.123.104:0/3783560978 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb4071b60 0x7f4eb4071fd0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f4ea4009b00 tx=0x7f4ea4009e10 comp rx=0 tx=0).stop 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 -- 192.168.123.104:0/3783560978 shutdown_connections 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 --2- 192.168.123.104:0/3783560978 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb4071b60 0x7f4eb4071fd0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 --2- 192.168.123.104:0/3783560978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb410e9e0 0x7f4eb410edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 -- 192.168.123.104:0/3783560978 >> 192.168.123.104:0/3783560978 conn(0x7f4eb406c6c0 msgr2=0x7f4eb406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 -- 192.168.123.104:0/3783560978 shutdown_connections 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.152+0000 7f4eb8fda700 1 -- 192.168.123.104:0/3783560978 
wait complete. 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 Processor -- start 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 -- start start 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 0x7f4eb4119680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 0x7f4eb4114b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4eb4115160 con 0x7f4eb410e9e0 2026-03-10T06:25:31.153 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb8fda700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4eb41152d0 con 0x7f4eb4114720 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb37fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 0x7f4eb4119680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 0x7f4eb4114b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 0x7f4eb4114b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:48966/0 (socket says 192.168.123.104:48966) 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb2ffd700 1 -- 192.168.123.104:0/2679566783 learned_addr learned my addr 192.168.123.104:0/2679566783 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb37fe700 1 -- 192.168.123.104:0/2679566783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 msgr2=0x7f4eb4114b90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb37fe700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 0x7f4eb4114b90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.153+0000 7f4eb37fe700 1 -- 192.168.123.104:0/2679566783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ea40097e0 con 0x7f4eb410e9e0 2026-03-10T06:25:31.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.154+0000 7f4eb37fe700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 0x7f4eb4119680 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4eac00c390 tx=0x7f4eac00c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:31.154 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.154+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4eac00e050 con 0x7f4eb410e9e0 2026-03-10T06:25:31.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.154+0000 7f4eb8fda700 1 -- 192.168.123.104:0/2679566783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4eb4115550 con 0x7f4eb410e9e0 2026-03-10T06:25:31.155 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.154+0000 7f4eb8fda700 1 -- 192.168.123.104:0/2679566783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4eb41b7d60 con 0x7f4eb410e9e0 2026-03-10T06:25:31.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.155+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4eac00f040 con 0x7f4eb410e9e0 2026-03-10T06:25:31.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.155+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4eac013610 con 0x7f4eb410e9e0 2026-03-10T06:25:31.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.156+0000 7f4eb8fda700 1 -- 192.168.123.104:0/2679566783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ea0005320 con 0x7f4eb410e9e0 2026-03-10T06:25:31.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.156+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f4eac007500 con 0x7f4eb410e9e0 2026-03-10T06:25:31.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.157+0000 7f4eb0ff9700 1 --2- 
192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 0x7f4e9c079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.157+0000 7f4eb2ffd700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 0x7f4e9c079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:31.157 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.157+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f4eac099790 con 0x7f4eb410e9e0 2026-03-10T06:25:31.160 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.160+0000 7f4eb2ffd700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 0x7f4e9c079fb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4ea4000c00 tx=0x7f4ea4019040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:31.161 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.160+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4eac061e70 con 0x7f4eb410e9e0 2026-03-10T06:25:31.339 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.338+0000 7f4eb8fda700 1 -- 192.168.123.104:0/2679566783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4ea0005cc0 con 0x7f4eb410e9e0 2026-03-10T06:25:31.339 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.339+0000 7f4eb0ff9700 1 -- 192.168.123.104:0/2679566783 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f4eac0615c0 con 0x7f4eb410e9e0 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4, 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 8, 2026-03-10T06:25:31.340 
INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:25:31.340 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:25:31.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.342+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 msgr2=0x7f4e9c079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.342+0000 7f4e9a7fc700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 0x7f4e9c079fb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4ea4000c00 tx=0x7f4ea4019040 comp rx=0 tx=0).stop 2026-03-10T06:25:31.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 msgr2=0x7f4eb4119680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 0x7f4eb4119680 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4eac00c390 tx=0x7f4eac00c6a0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 shutdown_connections 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4e9c077b00 
0x7f4e9c079fb0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4eb410e9e0 0x7f4eb4119680 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 --2- 192.168.123.104:0/2679566783 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4eb4114720 0x7f4eb4114b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 >> 192.168.123.104:0/2679566783 conn(0x7f4eb406c6c0 msgr2=0x7f4eb406f680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.343+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 shutdown_connections 2026-03-10T06:25:31.344 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.344+0000 7f4e9a7fc700 1 -- 192.168.123.104:0/2679566783 wait complete. 
2026-03-10T06:25:31.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:31 vm04.local ceph-mon[115743]: pgmap v59: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 639 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering 2026-03-10T06:25:31.416 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:31 vm04.local ceph-mon[115743]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- 192.168.123.104:0/3077084039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c071b60 msgr2=0x7f5e6c071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/3077084039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c071b60 0x7f5e6c071fd0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f5e6401c580 tx=0x7f5e6401c890 comp rx=0 tx=0).stop 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- 192.168.123.104:0/3077084039 shutdown_connections 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/3077084039 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c071b60 0x7f5e6c071fd0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/3077084039 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c10edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.417 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- 192.168.123.104:0/3077084039 >> 192.168.123.104:0/3077084039 conn(0x7f5e6c06c6c0 msgr2=0x7f5e6c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- 192.168.123.104:0/3077084039 shutdown_connections 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- 192.168.123.104:0/3077084039 wait complete. 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 Processor -- start 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.414+0000 7f5e72b2c700 1 -- start start 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e72b2c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e72b2c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c19cb20 0x7f5e6c1a0f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e72b2c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e6c19d130 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e72b2c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e6c19d2a0 con 0x7f5e6c19cb20 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:56564/0 (socket says 192.168.123.104:56564) 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 -- 192.168.123.104:0/2573927833 learned_addr learned my addr 192.168.123.104:0/2573927833 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71329700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c19cb20 0x7f5e6c1a0f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 -- 192.168.123.104:0/2573927833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c19cb20 msgr2=0x7f5e6c1a0f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c19cb20 0x7f5e6c1a0f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 -- 
192.168.123.104:0/2573927833 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5e6401c060 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.415+0000 7f5e71b2a700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f5e6800baa0 tx=0x7f5e6800be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:31.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.416+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e6800c760 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.416+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5e6800cda0 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.416+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e68012550 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.418 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.417+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5e6c1a1590 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.419 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.418+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5e6c1a1ab0 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.419 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.419+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f5e68014440 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.420 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.419+0000 7f5e62ffd700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 0x7f5e58079ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:25:31.420 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.419+0000 7f5e71329700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 0x7f5e58079ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:25:31.420 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.420+0000 7f5e71329700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 0x7f5e58079ee0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f5e64007fd0 tx=0x7f5e64007f60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:25:31.420 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.420+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5e6c04f2a0 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.420+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f5e680654b0 con 0x7f5e6c10e9e0 
2026-03-10T06:25:31.424 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.423+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5e68060ed0 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.564+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f5e6c1a1d90 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.565+0000 7f5e62ffd700 1 -- 192.168.123.104:0/2573927833 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1944 (secure 0 0 0) 0x7f5e68019030 con 0x7f5e6c10e9e0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:e11 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9 2026-03-10T06:25:31.567 
INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:55.449951+0000 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508} 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:25:31.567 
INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:25:31.567 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members: 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:25:31.568 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:25:31.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 
msgr2=0x7f5e58079ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.569 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 0x7f5e58079ee0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f5e64007fd0 tx=0x7f5e64007f60 comp rx=0 tx=0).stop
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 msgr2=0x7f5e6c19c5e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f5e6800baa0 tx=0x7f5e6800be60 comp rx=0 tx=0).stop
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 shutdown_connections
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5e58077a30 0x7f5e58079ee0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5e6c10e9e0 0x7f5e6c19c5e0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 --2- 192.168.123.104:0/2573927833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5e6c19cb20 0x7f5e6c1a0f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 >> 192.168.123.104:0/2573927833 conn(0x7f5e6c06c6c0 msgr2=0x7f5e6c06f5b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 shutdown_connections
2026-03-10T06:25:31.570 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.569+0000 7f5e72b2c700 1 -- 192.168.123.104:0/2573927833 wait complete.
2026-03-10T06:25:31.571 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11
2026-03-10T06:25:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:31 vm06.local ceph-mon[98962]: pgmap v59: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 639 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering
2026-03-10T06:25:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:31 vm06.local ceph-mon[98962]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.646+0000 7fe3d695d700 1 -- 192.168.123.104:0/1443245986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d0101ff0 msgr2=0x7fe3d010a4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.646+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1443245986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d0101ff0 0x7fe3d010a4f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fe3c0009b00 tx=0x7fe3c0009e10 comp rx=0 tx=0).stop
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 -- 192.168.123.104:0/1443245986 shutdown_connections
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1443245986 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d0101ff0 0x7fe3d010a4f0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1443245986 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d01016e0 0x7fe3d0101ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 -- 192.168.123.104:0/1443245986 >> 192.168.123.104:0/1443245986 conn(0x7fe3d00faf00 msgr2=0x7fe3d00fd310 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 -- 192.168.123.104:0/1443245986 shutdown_connections
2026-03-10T06:25:31.647 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.647+0000 7fe3d695d700 1 -- 192.168.123.104:0/1443245986 wait complete.
2026-03-10T06:25:31.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 Processor -- start
2026-03-10T06:25:31.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 -- start start
2026-03-10T06:25:31.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 0x7fe3d00ff880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.648 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3d00ffdc0 con 0x7fe3d01016e0
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3d695d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3d00fff00 con 0x7fe3d0101ff0
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3cffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3cffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:56580/0 (socket says 192.168.123.104:56580)
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3cffff700 1 -- 192.168.123.104:0/1556037628 learned_addr learned my addr 192.168.123.104:0/1556037628 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3cf7fe700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 0x7fe3d00ff880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.648+0000 7fe3cffff700 1 -- 192.168.123.104:0/1556037628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 msgr2=0x7fe3d00ff880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3cffff700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 0x7fe3d00ff880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3cffff700 1 -- 192.168.123.104:0/1556037628 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe3c00097e0 con 0x7fe3d01016e0
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3cf7fe700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 0x7fe3d00ff880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T06:25:31.649 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3cffff700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fe3b800ba70 tx=0x7fe3b800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:25:31.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3b800c780 con 0x7fe3d01016e0
2026-03-10T06:25:31.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe3d01a2420 con 0x7fe3d01016e0
2026-03-10T06:25:31.650 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.649+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3d01a2830 con 0x7fe3d01016e0
2026-03-10T06:25:31.651 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.651+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe3b800cdc0 con 0x7fe3d01016e0
2026-03-10T06:25:31.652 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.651+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3b8012550 con 0x7fe3d01016e0
2026-03-10T06:25:31.652 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.651+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe3b0005320 con 0x7fe3d01016e0
2026-03-10T06:25:31.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.653+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe3b8012730 con 0x7fe3d01016e0
2026-03-10T06:25:31.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.653+0000 7fe3cd7fa700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 0x7fe3bc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.653+0000 7fe3cf7fe700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 0x7fe3bc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.654+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fe3b8099c90 con 0x7fe3d01016e0
2026-03-10T06:25:31.654 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.654+0000 7fe3cf7fe700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 0x7fe3bc079dd0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fe3c0009fd0 tx=0x7fe3c0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:25:31.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.655+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe3b8062530 con 0x7fe3d01016e0
2026-03-10T06:25:31.790 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.789+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe3b0000bf0 con 0x7fe3bc077920
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.790+0000 7fe3cd7fa700 1 -- 192.168.123.104:0/1556037628 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe3b0000bf0 con 0x7fe3bc077920
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout:{
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true,
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "mon",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "mgr",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "crash"
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: ],
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "8/23 daemons upgraded",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading osd daemons",
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false
2026-03-10T06:25:31.791 INFO:teuthology.orchestra.run.vm04.stdout:}
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 msgr2=0x7fe3bc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 0x7fe3bc079dd0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fe3c0009fd0 tx=0x7fe3c0005fb0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 msgr2=0x7fe3d00ff340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fe3b800ba70 tx=0x7fe3b800be30 comp rx=0 tx=0).stop
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 shutdown_connections
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe3bc077920 0x7fe3bc079dd0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.794 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe3d01016e0 0x7fe3d00ff340 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 --2- 192.168.123.104:0/1556037628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe3d0101ff0 0x7fe3d00ff880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.793+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 >> 192.168.123.104:0/1556037628 conn(0x7fe3d00faf00 msgr2=0x7fe3d0104ca0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:25:31.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.794+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 shutdown_connections
2026-03-10T06:25:31.795 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.794+0000 7fe3d695d700 1 -- 192.168.123.104:0/1556037628 wait complete.
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 -- 192.168.123.104:0/1425335796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 msgr2=0x7f630410a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 --2- 192.168.123.104:0/1425335796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f630410a590 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f6300009b00 tx=0x7f6300009e10 comp rx=0 tx=0).stop
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 -- 192.168.123.104:0/1425335796 shutdown_connections
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 --2- 192.168.123.104:0/1425335796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f630410a590 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 --2- 192.168.123.104:0/1425335796 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304101980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.863+0000 7f630ba6e700 1 -- 192.168.123.104:0/1425335796 >> 192.168.123.104:0/1425335796 conn(0x7f63040faf00 msgr2=0x7f63040fd310 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.864+0000 7f630ba6e700 1 -- 192.168.123.104:0/1425335796 shutdown_connections
2026-03-10T06:25:31.864 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.864+0000 7f630ba6e700 1 -- 192.168.123.104:0/1425335796 wait complete.
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.864+0000 7f630ba6e700 1 Processor -- start
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.864+0000 7f630ba6e700 1 -- start start
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630ba6e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630ba6e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f6304196690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630ba6e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6304196d70 con 0x7f6304101ec0
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630ba6e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f630419ab00 con 0x7f63041015b0
2026-03-10T06:25:31.865 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:49018/0 (socket says 192.168.123.104:49018)
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 -- 192.168.123.104:0/3949356044 learned_addr learned my addr 192.168.123.104:0/3949356044 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f6309009700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f6304196690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 -- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 msgr2=0x7f6304196690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f6304196690 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f630980a700 1 -- 192.168.123.104:0/3949356044 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63000097e0 con 0x7f63041015b0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.865+0000 7f6309009700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f6304196690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f630980a700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f62f400d8d0 tx=0x7f62f400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62f4009940 con 0x7f63041015b0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f630419ade0 con 0x7f63041015b0
2026-03-10T06:25:31.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f630419b330 con 0x7f63041015b0
2026-03-10T06:25:31.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f62f4010460 con 0x7f63041015b0
2026-03-10T06:25:31.867 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.866+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62f400f5d0 con 0x7f63041015b0
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.867+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f62f40105d0 con 0x7f63041015b0
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.868+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f62e8005320 con 0x7f63041015b0
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.868+0000 7f62faffd700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 0x7f62f0079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.868+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f62f4099b70 con 0x7f63041015b0
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.868+0000 7f6309009700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 0x7f62f0079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:25:31.869 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.869+0000 7f6309009700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 0x7f62f0079e50 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f6300009fd0 tx=0x7f6300005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T06:25:31.871 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:31.871+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f62f4061b00 con 0x7f63041015b0
2026-03-10T06:25:32.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.049+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f62e8005190 con 0x7f63041015b0
2026-03-10T06:25:32.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.050+0000 7f62faffd700 1 -- 192.168.123.104:0/3949356044 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f62f40160e0 con 0x7f63041015b0
2026-03-10T06:25:32.051 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-10T06:25:32.051 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-10T06:25:32.051 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-10T06:25:32.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 msgr2=0x7f62f0079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:32.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 0x7f62f0079e50 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f6300009fd0 tx=0x7f6300005fd0 comp rx=0 tx=0).stop
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 msgr2=0x7f6304196150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f62f400d8d0 tx=0x7f62f400dc90 comp rx=0 tx=0).stop
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 shutdown_connections
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f62f00779a0 0x7f62f0079e50 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63041015b0 0x7f6304196150 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 --2- 192.168.123.104:0/3949356044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304101ec0 0x7f6304196690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:25:32.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 >> 192.168.123.104:0/3949356044 conn(0x7f63040faf00 msgr2=0x7f6304104d70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T06:25:32.056 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 shutdown_connections
2026-03-10T06:25:32.056 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:25:32.054+0000 7f630ba6e700 1 -- 192.168.123.104:0/3949356044 wait complete.
2026-03-10T06:25:32.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:32 vm04.local ceph-mon[115743]: from='client.34186 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:32.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:32 vm04.local ceph-mon[115743]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:32.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:32 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2679566783' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:25:32.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:32 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2573927833' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T06:25:32.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:32 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/3949356044' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T06:25:32.477 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:32 vm06.local ceph-mon[98962]: from='client.34186 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:32.477 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:32 vm06.local ceph-mon[98962]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:32.477 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:32 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2679566783' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T06:25:32.478 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:32 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2573927833' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T06:25:32.478 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:32 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3949356044' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T06:25:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:33 vm06.local ceph-mon[98962]: from='client.34202 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:33 vm06.local ceph-mon[98962]: pgmap v60: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 0 B/s, 0 objects/s recovering
2026-03-10T06:25:33.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:33 vm04.local ceph-mon[115743]: from='client.34202 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T06:25:33.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:33 vm04.local ceph-mon[115743]: pgmap v60: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 0 B/s, 0 objects/s recovering
2026-03-10T06:25:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:35 vm06.local ceph-mon[98962]: pgmap v61: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1014 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering
2026-03-10T06:25:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:25:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:25:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:25:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-10T06:25:35.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:35 vm04.local ceph-mon[115743]: pgmap v61: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1014 B/s rd, 1 op/s; 0 B/s, 0 objects/s recovering
2026-03-10T06:25:35.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:25:35.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T06:25:35.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb'
2026-03-10T06:25:35.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-10T06:25:36.307 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local systemd[1]: Stopping Ceph osd.2 for 9c59102a-1c48-11f1-b618-035af535377d...
2026-03-10T06:25:36.307 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:25:36.139+0000 7ff459d52700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T06:25:36.307 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:25:36.139+0000 7ff459d52700 -1 osd.2 56 *** Got signal Terminated ***
2026-03-10T06:25:36.307 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[80471]: 2026-03-10T06:25:36.139+0000 7ff459d52700 -1 osd.2 56 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T06:25:36.571 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133724]: 2026-03-10 06:25:36.396936952 +0000 UTC m=+0.272470216 container died e5a533082c8079dc582a1d261d1b4f619beec16de01eea7776ba2c8ec4a5b51c (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212)
2026-03-10T06:25:36.571 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133724]: 2026-03-10 06:25:36.419551713 +0000 UTC m=+0.295084977 container remove e5a533082c8079dc582a1d261d1b4f619beec16de01eea7776ba2c8ec4a5b51c
(image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, ceph=True, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0) 2026-03-10T06:25:36.571 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local bash[133724]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: Upgrade: osd.2 is safe to restart 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: Upgrade: Updating osd.2 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local 
ceph-mon[115743]: Deploying daemon osd.2 on vm04 2026-03-10T06:25:36.571 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:36 vm04.local ceph-mon[115743]: osd.2 marked itself down and dead 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: Upgrade: osd.2 is safe to restart 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: Upgrade: Updating osd.2 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: Deploying daemon osd.2 on vm04 2026-03-10T06:25:36.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:36 vm06.local ceph-mon[98962]: osd.2 marked itself down and dead 2026-03-10T06:25:36.873 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133791]: 2026-03-10 06:25:36.570790561 +0000 UTC m=+0.018378290 container create 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:36.873 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133791]: 2026-03-10 06:25:36.6123232 +0000 UTC m=+0.059910929 container init 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133791]: 2026-03-10 06:25:36.615579006 +0000 UTC m=+0.063166725 container start 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133791]: 2026-03-10 06:25:36.6187963 +0000 UTC m=+0.066384019 container attach 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133791]: 2026-03-10 06:25:36.563598045 +0000 UTC 
m=+0.011185774 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133809]: 2026-03-10 06:25:36.76283042 +0000 UTC m=+0.011770379 container died 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default) 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local podman[133809]: 2026-03-10 06:25:36.77970865 +0000 UTC m=+0.028648590 container remove 8b384fd6eb2e37e8c248a2f4110d42a455af9893375efce4aa1a0aa7159b32a5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.2.service: Deactivated successfully. 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local systemd[1]: Stopped Ceph osd.2 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:25:36.874 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.2.service: Consumed 30.925s CPU time. 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:36 vm04.local systemd[1]: Starting Ceph osd.2 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local podman[133894]: 2026-03-10 06:25:37.073812469 +0000 UTC m=+0.016261798 container create 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local podman[133894]: 2026-03-10 06:25:37.116393232 +0000 UTC m=+0.058842560 container init 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local podman[133894]: 2026-03-10 06:25:37.120149564 +0000 UTC m=+0.062598892 container start 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local podman[133894]: 2026-03-10 06:25:37.125683245 +0000 UTC m=+0.068132573 container attach 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223) 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local podman[133894]: 2026-03-10 06:25:37.06689575 +0000 UTC m=+0.009345087 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:37.201 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:37 vm06.local ceph-mon[98962]: pgmap v62: 65 pgs: 65 active+clean; 252 
MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 2 op/s; 0 B/s, 0 objects/s recovering 2026-03-10T06:25:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:37 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:37 vm06.local ceph-mon[98962]: mgrmap e40: vm04.exdvdb(active, since 92s), standbys: vm06.wwotdr 2026-03-10T06:25:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:37 vm06.local ceph-mon[98962]: osdmap e57: 6 total, 5 up, 6 in 2026-03-10T06:25:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-mon[115743]: pgmap v62: 65 pgs: 65 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 2 op/s; 0 B/s, 0 objects/s recovering 2026-03-10T06:25:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-mon[115743]: mgrmap e40: vm04.exdvdb(active, since 92s), standbys: vm06.wwotdr 2026-03-10T06:25:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-mon[115743]: osdmap e57: 6 total, 5 up, 6 in 2026-03-10T06:25:37.677 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:37.677 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:37.677 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T06:25:38.054 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T06:25:38.055 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-27bf8308-eb6a-4ecf-af24-4bf3e0ac59df/osd-block-7fc62f1e-5fa4-44db-82a0-4b766c28a491 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T06:25:38.055 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:37 vm04.local bash[133894]: Running command: /usr/bin/ceph-bluestore-tool 
--cluster=ceph prime-osd-dir --dev /dev/ceph-27bf8308-eb6a-4ecf-af24-4bf3e0ac59df/osd-block-7fc62f1e-5fa4-44db-82a0-4b766c28a491 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T06:25:38.055 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/ln -snf /dev/ceph-27bf8308-eb6a-4ecf-af24-4bf3e0ac59df/osd-block-7fc62f1e-5fa4-44db-82a0-4b766c28a491 /var/lib/ceph/osd/ceph-2/block 2026-03-10T06:25:38.351 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-mon[115743]: osdmap e58: 6 total, 5 up, 6 in 2026-03-10T06:25:38.351 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:38.351 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:38.351 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[133894]: Running command: /usr/bin/ln -snf /dev/ceph-27bf8308-eb6a-4ecf-af24-4bf3e0ac59df/osd-block-7fc62f1e-5fa4-44db-82a0-4b766c28a491 /var/lib/ceph/osd/ceph-2/block 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[133894]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 
vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[133894]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[133894]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate[133904]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[133894]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local conmon[133904]: conmon 96fb63cdb545e0561bbe : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2.scope/container/memory.events 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[133894]: 2026-03-10 06:25:38.081385539 +0000 UTC m=+1.023834867 container died 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[133894]: 2026-03-10 06:25:38.167535053 +0000 UTC m=+1.109984381 container remove 96fb63cdb545e0561bbef14d5d9419292b2cab416f231c09ba9cfa05932f4cd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[134159]: 2026-03-10 06:25:38.257498639 +0000 UTC m=+0.016606242 container create 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[134159]: 2026-03-10 06:25:38.291996888 +0000 UTC m=+0.051104491 container init 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True) 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[134159]: 2026-03-10 06:25:38.295078608 +0000 UTC m=+0.054186211 container start 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, org.label-schema.vendor=CentOS, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local bash[134159]: 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local podman[134159]: 2026-03-10 06:25:38.250828169 +0000 UTC m=+0.009935772 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:38.351 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:38 vm04.local systemd[1]: Started Ceph osd.2 for 9c59102a-1c48-11f1-b618-035af535377d. 
2026-03-10T06:25:38.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:38 vm06.local ceph-mon[98962]: osdmap e58: 6 total, 5 up, 6 in 2026-03-10T06:25:38.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:38.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:38.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:39.167 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:39 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:25:39.120+0000 7f8c4fd19740 -1 Falling back to public interface 2026-03-10T06:25:39.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:39 vm04.local ceph-mon[115743]: pgmap v65: 65 pgs: 7 peering, 5 stale+active+clean, 53 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 383 B/s wr, 2 op/s 2026-03-10T06:25:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:39 vm06.local ceph-mon[98962]: pgmap v65: 65 pgs: 7 peering, 5 stale+active+clean, 53 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 383 B/s wr, 2 op/s 2026-03-10T06:25:40.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:40.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:40.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:40 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:40.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: pgmap v66: 65 pgs: 4 active+undersized, 7 peering, 2 stale+active+clean, 4 active+undersized+degraded, 48 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 767 B/s rd, 383 B/s wr, 2 op/s; 10/261 objects degraded (3.831%) 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 10/261 objects degraded (3.831%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local 
ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:41.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: pgmap v66: 65 pgs: 4 active+undersized, 7 peering, 2 stale+active+clean, 4 active+undersized+degraded, 48 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB 
/ 120 GiB avail; 767 B/s rd, 383 B/s wr, 2 op/s; 10/261 objects degraded (3.831%) 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 10/261 objects degraded (3.831%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:25:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:42.867 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:42 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:25:42.547+0000 7f8c4fd19740 -1 osd.2 0 read_superblock omap replica is missing. 2026-03-10T06:25:42.867 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:42 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:25:42.701+0000 7f8c4fd19740 -1 osd.2 56 log_to_monitors true 2026-03-10T06:25:42.867 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:25:42 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:25:42.793+0000 7f8c47ab3640 -1 osd.2 56 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:25:42.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:42 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:42.867 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:42 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T06:25:42.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:42 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:42.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:42 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T06:25:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:43 vm06.local ceph-mon[98962]: pgmap v67: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 383 B/s wr, 2 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:43 vm06.local ceph-mon[98962]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T06:25:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:43 vm06.local ceph-mon[98962]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T06:25:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:43 vm06.local ceph-mon[98962]: osdmap e59: 6 total, 5 up, 6 in 2026-03-10T06:25:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:43 vm06.local ceph-mon[98962]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:25:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:43 vm04.local ceph-mon[115743]: pgmap v67: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 383 B/s wr, 2 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:44.177 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:43 vm04.local ceph-mon[115743]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T06:25:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:43 vm04.local ceph-mon[115743]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T06:25:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:43 vm04.local ceph-mon[115743]: osdmap e59: 6 total, 5 up, 6 in 2026-03-10T06:25:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:43 vm04.local ceph-mon[115743]: from='osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T06:25:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:44 vm06.local ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:25:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:44 vm06.local ceph-mon[98962]: osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722] boot 2026-03-10T06:25:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:44 vm06.local ceph-mon[98962]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T06:25:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:44 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:25:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:44 vm06.local ceph-mon[98962]: pgmap v70: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 2.1 GiB 
used, 118 GiB / 120 GiB avail; 769 B/s rd, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:45.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:44 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:25:45.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:44 vm04.local ceph-mon[115743]: osd.2 [v2:192.168.123.104:6818/4015290722,v1:192.168.123.104:6819/4015290722] boot 2026-03-10T06:25:45.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:44 vm04.local ceph-mon[115743]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T06:25:45.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:44 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T06:25:45.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:44 vm04.local ceph-mon[115743]: pgmap v70: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 769 B/s rd, 1 op/s; 34/261 objects degraded (13.027%) 2026-03-10T06:25:46.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:45 vm06.local ceph-mon[98962]: osdmap e61: 6 total, 6 up, 6 in 2026-03-10T06:25:46.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:45 vm04.local ceph-mon[115743]: osdmap e61: 6 total, 6 up, 6 in 2026-03-10T06:25:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:46 vm06.local ceph-mon[98962]: pgmap v72: 65 pgs: 6 active+undersized, 6 active+undersized+degraded, 53 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.5 KiB/s rd, 2 op/s; 21/261 objects degraded (8.046%) 2026-03-10T06:25:47.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:46 vm04.local ceph-mon[115743]: pgmap v72: 65 pgs: 6 active+undersized, 6 active+undersized+degraded, 53 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.5 KiB/s rd, 2 op/s; 21/261 
objects degraded (8.046%) 2026-03-10T06:25:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:47 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 21/261 objects degraded (8.046%), 6 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:47 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 21/261 objects degraded (8.046%), 6 pgs degraded (PG_DEGRADED) 2026-03-10T06:25:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:48 vm06.local ceph-mon[98962]: pgmap v73: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:25:49.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:48 vm04.local ceph-mon[115743]: pgmap v73: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:25:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:49 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 21/261 objects degraded (8.046%), 6 pgs degraded) 2026-03-10T06:25:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:50.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:49 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 21/261 objects degraded (8.046%), 6 pgs degraded) 2026-03-10T06:25:50.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:25:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:50 vm06.local ceph-mon[98962]: pgmap v74: 65 pgs: 65 active+clean; 252 
MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 709 B/s rd, 1 op/s 2026-03-10T06:25:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:50 vm04.local ceph-mon[115743]: pgmap v74: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 709 B/s rd, 1 op/s 2026-03-10T06:25:53.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:53 vm06.local ceph-mon[98962]: pgmap v75: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:25:53.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:53 vm04.local ceph-mon[115743]: pgmap v75: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:25:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:55 vm04.local ceph-mon[115743]: pgmap v76: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-10T06:25:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:55 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:55 vm06.local ceph-mon[98962]: pgmap v76: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-10T06:25:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:55 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:56.475 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:56 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:56.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:56 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", 
"ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:57.257 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: Stopping Ceph osd.3 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:25:57.593 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: pgmap v77: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.3 KiB/s rd, 2 op/s 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: Upgrade: osd.3 is safe to restart 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: Upgrade: Updating osd.3 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: Deploying daemon osd.3 on vm06 2026-03-10T06:25:57.594 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-mon[98962]: osd.3 marked itself down and dead 2026-03-10T06:25:57.594 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 
2026-03-10T06:25:57.335+0000 7fa1b3109700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:25:57.594 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 2026-03-10T06:25:57.335+0000 7fa1b3109700 -1 osd.3 61 *** Got signal Terminated *** 2026-03-10T06:25:57.594 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[64871]: 2026-03-10T06:25:57.335+0000 7fa1b3109700 -1 osd.3 61 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106297]: 2026-03-10 06:25:57.593255851 +0000 UTC m=+0.272424770 container died 62400287eca02ab54a41f916a138f0cbe627122138b38b1047b37c3c215e3101 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, GIT_BRANCH=HEAD, GIT_CLEAN=True, ceph=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106297]: 2026-03-10 06:25:57.616415805 +0000 UTC m=+0.295584724 container remove 62400287eca02ab54a41f916a138f0cbe627122138b38b1047b37c3c215e3101 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, GIT_BRANCH=HEAD, 
GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local bash[106297]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.768984609 +0000 UTC m=+0.019298287 container create f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.806941532 +0000 UTC m=+0.057255220 container init f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.810052777 +0000 UTC m=+0.060366455 container start f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.814800734 +0000 UTC m=+0.065114412 container attach f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, ceph=True, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-10T06:25:57.860 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.761275047 +0000 UTC m=+0.011588725 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: pgmap v77: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.3 KiB/s rd, 2 op/s 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: Upgrade: osd.3 is safe to restart 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: Upgrade: Updating osd.3 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: Deploying daemon osd.3 on vm06 2026-03-10T06:25:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:57 vm04.local ceph-mon[115743]: osd.3 marked itself down and dead 2026-03-10T06:25:58.117 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.942068075 +0000 UTC m=+0.192381753 container died f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default) 2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local podman[106362]: 2026-03-10 06:25:57.961584198 +0000 UTC m=+0.211897876 container remove f32f71d78cecfa3aad937e280548563a75f2c02ce4b4f8181bd06a16739f4df0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223) 2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service: Deactivated successfully. 2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service: Unit process 106372 (conmon) remains running after unit stopped. 
2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service: Unit process 106380 (podman) remains running after unit stopped. 2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: Stopped Ceph osd.3 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:25:58.118 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:57 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service: Consumed 42.383s CPU time, 951.0M memory peak. 2026-03-10T06:25:58.407 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local systemd[1]: Starting Ceph osd.3 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:25:58.407 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local podman[106463]: 2026-03-10 06:25:58.378145535 +0000 UTC m=+0.031307768 container create 332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local podman[106463]: 2026-03-10 06:25:58.427516476 +0000 UTC m=+0.080678709 container init 
332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local podman[106463]: 2026-03-10 06:25:58.43042506 +0000 UTC m=+0.083587293 container start 332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS) 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local 
podman[106463]: 2026-03-10 06:25:58.439635282 +0000 UTC m=+0.092797525 container attach 332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local podman[106463]: 2026-03-10 06:25:58.364774716 +0000 UTC m=+0.017936958 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local bash[106463]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:58.688 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:58 vm06.local bash[106463]: Running command: 
/usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:58.688 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:58 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:58.688 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:58 vm06.local ceph-mon[98962]: osdmap e62: 6 total, 5 up, 6 in 2026-03-10T06:25:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:58 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:25:58.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:58 vm04.local ceph-mon[115743]: osdmap e62: 6 total, 5 up, 6 in 2026-03-10T06:25:59.314 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/chown -R ceph:ceph 
/var/lib/ceph/osd/ceph-3 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b3fcfe78-3985-481f-9525-5ab61ccd7397/osd-block-343c7178-1f64-4726-9c13-d1d348b25384 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T06:25:59.315 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b3fcfe78-3985-481f-9525-5ab61ccd7397/osd-block-343c7178-1f64-4726-9c13-d1d348b25384 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T06:25:59.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-mon[98962]: pgmap v79: 65 pgs: 4 peering, 15 stale+active+clean, 46 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-10T06:25:59.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-mon[98962]: osdmap e63: 6 total, 5 up, 6 in 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/ln -snf /dev/ceph-b3fcfe78-3985-481f-9525-5ab61ccd7397/osd-block-343c7178-1f64-4726-9c13-d1d348b25384 /var/lib/ceph/osd/ceph-3/block 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/ln -snf /dev/ceph-b3fcfe78-3985-481f-9525-5ab61ccd7397/osd-block-343c7178-1f64-4726-9c13-d1d348b25384 /var/lib/ceph/osd/ceph-3/block 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate[106476]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106463]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106463]: 2026-03-10 06:25:59.520258438 +0000 UTC m=+1.173420671 container died 332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:25:59.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106463]: 2026-03-10 06:25:59.539318097 +0000 UTC m=+1.192480330 container remove 332cd22984c5ce1de92d7cd1d3c2ba9c91e4112ab582ce066bda0afc4d808311 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-activate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid) 2026-03-10T06:25:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:59 vm04.local ceph-mon[115743]: pgmap v79: 65 pgs: 4 peering, 15 stale+active+clean, 46 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-10T06:25:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:25:59 vm04.local ceph-mon[115743]: osdmap e63: 6 total, 5 up, 6 in 
2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106746]: 2026-03-10 06:25:59.673807044 +0000 UTC m=+0.022391296 container create e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0) 2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106746]: 2026-03-10 06:25:59.713983613 +0000 UTC m=+0.062567876 container init e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106746]: 2026-03-10 06:25:59.716906625 +0000 UTC m=+0.065490877 container start e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local bash[106746]: e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local podman[106746]: 2026-03-10 06:25:59.66634716 +0000 UTC m=+0.014931412 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:00.069 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:25:59 vm06.local systemd[1]: Started Ceph osd.3 for 9c59102a-1c48-11f1-b618-035af535377d. 
2026-03-10T06:26:00.618 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:26:00 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:26:00.566+0000 7f2f70591740 -1 Falling back to public interface 2026-03-10T06:26:01.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:01.100 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:00 vm04.local ceph-mon[115743]: pgmap v81: 65 pgs: 4 active+undersized, 4 peering, 12 stale+active+clean, 3 active+undersized+degraded, 42 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 9/261 objects degraded (3.448%) 2026-03-10T06:26:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:00 vm06.local ceph-mon[98962]: pgmap v81: 65 pgs: 4 active+undersized, 4 peering, 12 stale+active+clean, 3 active+undersized+degraded, 42 active+clean; 252 MiB data, 2.1 
GiB used, 118 GiB / 120 GiB avail; 9/261 objects degraded (3.448%) 2026-03-10T06:26:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:01 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 9/261 objects degraded (3.448%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:01 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 9/261 objects degraded (3.448%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:02.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:02.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:02.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:02.133 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.131+0000 7f63cc609700 1 -- 192.168.123.104:0/4013144975 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 msgr2=0x7f63c4102880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.131+0000 7f63cc609700 1 --2- 192.168.123.104:0/4013144975 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 0x7f63c4102880 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f63c0009b00 tx=0x7f63c0009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:02.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 -- 192.168.123.104:0/4013144975 shutdown_connections 2026-03-10T06:26:02.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 --2- 192.168.123.104:0/4013144975 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c40ff700 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.133 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 --2- 192.168.123.104:0/4013144975 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 0x7f63c4102880 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 -- 192.168.123.104:0/4013144975 >> 192.168.123.104:0/4013144975 conn(0x7f63c40747e0 msgr2=0x7f63c4074be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 -- 192.168.123.104:0/4013144975 shutdown_connections 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 -- 192.168.123.104:0/4013144975 wait complete. 
2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 Processor -- start 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.133+0000 7f63cc609700 1 -- start start 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63cc609700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c41983d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63cc609700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 0x7f63c4198910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63cc609700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63c4198ff0 con 0x7f63c40ff2b0 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63cc609700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63c419cd80 con 0x7f63c41024b0 2026-03-10T06:26:02.134 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c41983d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c41983d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:41810/0 (socket says 192.168.123.104:41810) 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 -- 192.168.123.104:0/3925239926 learned_addr learned my addr 192.168.123.104:0/3925239926 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 -- 192.168.123.104:0/3925239926 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 msgr2=0x7f63c4198910 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 0x7f63c4198910 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 -- 192.168.123.104:0/3925239926 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63b4009710 con 0x7f63c40ff2b0 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.134+0000 7f63ca3a5700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c41983d0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f63c000bb70 tx=0x7f63c0003680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.135+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63c001d070 con 0x7f63c40ff2b0 2026-03-10T06:26:02.135 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.135+0000 7f63cc609700 1 -- 
192.168.123.104:0/3925239926 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f63c00097e0 con 0x7f63c40ff2b0 2026-03-10T06:26:02.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.135+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f63c419d420 con 0x7f63c40ff2b0 2026-03-10T06:26:02.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.135+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f63c000f460 con 0x7f63c40ff2b0 2026-03-10T06:26:02.136 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.135+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63c0021600 con 0x7f63c40ff2b0 2026-03-10T06:26:02.137 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.136+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f63c0003a80 con 0x7f63c40ff2b0 2026-03-10T06:26:02.137 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.137+0000 7f63bb7fe700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 0x7f63b0079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.137 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.137+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f63a8005320 con 0x7f63c40ff2b0 2026-03-10T06:26:02.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.137+0000 7f63c9ba4700 1 --2- 
192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 0x7f63b0079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.137+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f63c009aaa0 con 0x7f63c40ff2b0 2026-03-10T06:26:02.140 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.137+0000 7f63c9ba4700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 0x7f63b0079e40 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f63c41999f0 tx=0x7f63b4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.141 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.141+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f63c00633a0 con 0x7f63c40ff2b0 2026-03-10T06:26:02.267 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.267+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f63a8000bf0 con 0x7f63b0077990 2026-03-10T06:26:02.268 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.268+0000 7f63bb7fe700 1 -- 192.168.123.104:0/3925239926 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f63a8000bf0 con 0x7f63b0077990 2026-03-10T06:26:02.272 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.271+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 msgr2=0x7f63b0079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.271+0000 7f63cc609700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 0x7f63b0079e40 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f63c41999f0 tx=0x7f63b4009450 comp rx=0 tx=0).stop 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 msgr2=0x7f63c41983d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f63c40ff2b0 0x7f63c41983d0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f63c000bb70 tx=0x7f63c0003680 comp rx=0 tx=0).stop 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 shutdown_connections 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f63b0077990 0x7f63b0079e40 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f63c40ff2b0 0x7f63c41983d0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 --2- 192.168.123.104:0/3925239926 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63c41024b0 0x7f63c4198910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.272 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 >> 192.168.123.104:0/3925239926 conn(0x7f63c40747e0 msgr2=0x7f63c40fc200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 shutdown_connections 2026-03-10T06:26:02.273 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.272+0000 7f63cc609700 1 -- 192.168.123.104:0/3925239926 wait complete. 
2026-03-10T06:26:02.282 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 -- 192.168.123.104:0/986055325 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec108750 msgr2=0x7f3fec108b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/986055325 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec108750 0x7f3fec108b20 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3fe0009b50 tx=0x7f3fe0009e60 comp rx=0 tx=0).stop 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 -- 192.168.123.104:0/986055325 shutdown_connections 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/986055325 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec102750 0x7f3fec102bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/986055325 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec108750 0x7f3fec108b20 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.353+0000 7f3ff2d24700 1 -- 192.168.123.104:0/986055325 >> 192.168.123.104:0/986055325 conn(0x7f3fec0fe250 msgr2=0x7f3fec100660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.354 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.354+0000 7f3ff2d24700 1 -- 192.168.123.104:0/986055325 shutdown_connections 2026-03-10T06:26:02.354 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.354+0000 7f3ff2d24700 1 -- 192.168.123.104:0/986055325 wait complete. 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.354+0000 7f3ff2d24700 1 Processor -- start 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.354+0000 7f3ff2d24700 1 -- start start 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff2d24700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff0ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff0ac0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41832/0 (socket says 192.168.123.104:41832) 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff2d24700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec198b20 0x7f3fec19cf90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.355 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff2d24700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3fec1990a0 con 0x7f3fec102750 2026-03-10T06:26:02.355 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff2d24700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3fec199210 con 0x7f3fec198b20 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3ff0ac0700 1 -- 192.168.123.104:0/4279290629 learned_addr learned my addr 192.168.123.104:0/4279290629 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.355+0000 7f3febfff700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec198b20 0x7f3fec19cf90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff0ac0700 1 -- 192.168.123.104:0/4279290629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec198b20 msgr2=0x7f3fec19cf90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff0ac0700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec198b20 0x7f3fec19cf90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff0ac0700 1 -- 192.168.123.104:0/4279290629 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3fe00097e0 con 0x7f3fec102750 2026-03-10T06:26:02.356 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff0ac0700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 secure :-1 s=READY pgs=69 cs=0 
l=1 rev1=1 crypto rx=0x7f3fec103890 tx=0x7f3fe0005e30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3fe001d070 con 0x7f3fec102750 2026-03-10T06:26:02.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3fe0022470 con 0x7f3fec102750 2026-03-10T06:26:02.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3fe000f630 con 0x7f3fec102750 2026-03-10T06:26:02.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3fec19d530 con 0x7f3fec102750 2026-03-10T06:26:02.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.356+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3fec19d940 con 0x7f3fec102750 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.358+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3fe0022ae0 con 0x7f3fec102750 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.358+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} 
v 0) v1 -- 0x7f3fec10acc0 con 0x7f3fec102750 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.358+0000 7f3fe9ffb700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 0x7f3fd4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.358+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f3fe009af50 con 0x7f3fec102750 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.359+0000 7f3febfff700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 0x7f3fd4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.360+0000 7f3febfff700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 0x7f3fd4079d20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3fdc005ea0 tx=0x7f3fdc005e30 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.361+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3fe0063880 con 0x7f3fec102750 2026-03-10T06:26:02.509 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.508+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 --> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3fec199990 con 0x7f3fd4077870 2026-03-10T06:26:02.510 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.510+0000 7f3fe9ffb700 1 -- 192.168.123.104:0/4279290629 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f3fec199990 con 0x7f3fd4077870 2026-03-10T06:26:02.512 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 msgr2=0x7f3fd4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 0x7f3fd4079d20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3fdc005ea0 tx=0x7f3fdc005e30 comp rx=0 tx=0).stop 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 msgr2=0x7f3fec1985e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f3fec103890 tx=0x7f3fe0005e30 comp rx=0 tx=0).stop 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 shutdown_connections 2026-03-10T06:26:02.513 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3fd4077870 0x7f3fd4079d20 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fec102750 0x7f3fec1985e0 secure :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f3fec103890 tx=0x7f3fe0005e30 comp rx=0 tx=0).stop 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 --2- 192.168.123.104:0/4279290629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3fec198b20 0x7f3fec19cf90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 >> 192.168.123.104:0/4279290629 conn(0x7f3fec0fe250 msgr2=0x7f3fec0ffd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.512+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 shutdown_connections 2026-03-10T06:26:02.513 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.513+0000 7f3ff2d24700 1 -- 192.168.123.104:0/4279290629 wait complete. 
2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.591+0000 7fafc00de700 1 -- 192.168.123.104:0/653468541 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb8108760 msgr2=0x7fafb8108b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.591+0000 7fafc00de700 1 --2- 192.168.123.104:0/653468541 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb8108760 0x7fafb8108b30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fafac009b00 tx=0x7fafac009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.592+0000 7fafc00de700 1 -- 192.168.123.104:0/653468541 shutdown_connections 2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.592+0000 7fafc00de700 1 --2- 192.168.123.104:0/653468541 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8102bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.592+0000 7fafc00de700 1 --2- 192.168.123.104:0/653468541 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb8108760 0x7fafb8108b30 secure :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fafac009b00 tx=0x7fafac009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:02.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.592+0000 7fafc00de700 1 -- 192.168.123.104:0/653468541 >> 192.168.123.104:0/653468541 conn(0x7fafb80fe280 msgr2=0x7fafb8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.593+0000 7fafc00de700 1 -- 192.168.123.104:0/653468541 shutdown_connections 2026-03-10T06:26:02.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.593+0000 7fafc00de700 1 -- 
192.168.123.104:0/653468541 wait complete. 2026-03-10T06:26:02.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.593+0000 7fafc00de700 1 Processor -- start 2026-03-10T06:26:02.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.593+0000 7fafc00de700 1 -- start start 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafc00de700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafc00de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 0x7fafb8075c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafc00de700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafb8079250 con 0x7fafb80757a0 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafc00de700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafb80793c0 con 0x7fafb8102760 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafbde7a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafbde7a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:46510/0 (socket says 192.168.123.104:46510) 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.594+0000 7fafbde7a700 1 -- 192.168.123.104:0/555760967 learned_addr learned my addr 192.168.123.104:0/555760967 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbde7a700 1 -- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 msgr2=0x7fafb8075c10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbd679700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 0x7fafb8075c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbde7a700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 0x7fafb8075c10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbde7a700 1 -- 192.168.123.104:0/555760967 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafac0097e0 con 0x7fafb8102760 2026-03-10T06:26:02.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbd679700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 0x7fafb8075c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:26:02.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.595+0000 7fafbde7a700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fafb81038a0 tx=0x7fafac004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.596+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafac01d070 con 0x7fafb8102760 2026-03-10T06:26:02.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.596+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fafb8071980 con 0x7fafb8102760 2026-03-10T06:26:02.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.596+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fafac00bc50 con 0x7fafb8102760 2026-03-10T06:26:02.597 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.596+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fafb8071ed0 con 0x7fafb8102760 2026-03-10T06:26:02.597 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.596+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafac01d070 con 0x7fafb8102760 2026-03-10T06:26:02.597 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.597+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7fafb804ea50 con 0x7fafb8102760 2026-03-10T06:26:02.598 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.598+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fafac022470 con 0x7fafb8102760 2026-03-10T06:26:02.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.598+0000 7fafaaffd700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 0x7fafa4079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.598+0000 7fafbd679700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 0x7fafa4079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.599+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fafac09ab20 con 0x7fafb8102760 2026-03-10T06:26:02.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.599+0000 7fafbd679700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 0x7fafa4079e50 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fafb4005fd0 tx=0x7fafb4005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.600+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fafac0633a0 con 0x7fafb8102760 2026-03-10T06:26:02.727 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.727+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fafb80768a0 con 0x7fafa40779a0 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.732+0000 7fafaaffd700 1 -- 192.168.123.104:0/555760967 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fafb80768a0 con 0x7fafa40779a0 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (2m) 23s ago 8m 24.2M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (8m) 23s ago 8m 9336k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (7m) 1s ago 7m 11.4M - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (97s) 23s ago 8m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (95s) 1s ago 7m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (2m) 23s ago 8m 89.9M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (6m) 23s ago 6m 180M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:26:02.733 
INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (6m) 23s ago 6m 18.2M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (6m) 1s ago 6m 18.3M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (6m) 1s ago 6m 95.6M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (4m) 23s ago 9m 609M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (3m) 1s ago 7m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (2m) 23s ago 9m 61.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (110s) 1s ago 7m 54.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (3m) 23s ago 8m 10.4M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 1s ago 7m 9529k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (84s) 23s ago 7m 182M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (45s) 23s ago 7m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6bc3525fe6f5 2026-03-10T06:26:02.733 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (24s) 23s ago 7m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 38220ba83a3f 2026-03-10T06:26:02.733 
INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (3s) 1s ago 6m 33.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e91f44e1f660 2026-03-10T06:26:02.734 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (6m) 1s ago 6m 372M 4096M 18.2.0 dc2bc1663786 dcd395dfe220 2026-03-10T06:26:02.734 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (6m) 1s ago 6m 337M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:26:02.734 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (3m) 23s ago 8m 56.2M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.735+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 msgr2=0x7fafa4079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.735+0000 7fafc00de700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 0x7fafa4079e50 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fafb4005fd0 tx=0x7fafb4005e20 comp rx=0 tx=0).stop 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.735+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 msgr2=0x7fafb8075260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.735+0000 7fafc00de700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fafb81038a0 tx=0x7fafac004970 comp rx=0 tx=0).stop 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 
1 -- 192.168.123.104:0/555760967 shutdown_connections 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fafa40779a0 0x7fafa4079e50 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fafb8102760 0x7fafb8075260 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 --2- 192.168.123.104:0/555760967 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fafb80757a0 0x7fafb8075c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 >> 192.168.123.104:0/555760967 conn(0x7fafb80fe280 msgr2=0x7fafb80ffd00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 shutdown_connections 2026-03-10T06:26:02.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.736+0000 7fafc00de700 1 -- 192.168.123.104:0/555760967 wait complete. 
2026-03-10T06:26:02.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.812+0000 7f107a657700 1 -- 192.168.123.104:0/2227201229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 msgr2=0x7f1074108af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.812+0000 7f10737fe700 1 -- 192.168.123.104:0/2227201229 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f106400ba40 con 0x7f1074108720 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.812+0000 7f107a657700 1 --2- 192.168.123.104:0/2227201229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074108af0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f1064009b00 tx=0x7f1064009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 -- 192.168.123.104:0/2227201229 shutdown_connections 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 --2- 192.168.123.104:0/2227201229 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 0x7f1074102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 --2- 192.168.123.104:0/2227201229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074108af0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 -- 192.168.123.104:0/2227201229 >> 192.168.123.104:0/2227201229 conn(0x7f10740fe220 msgr2=0x7f1074100630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:02.814 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 -- 192.168.123.104:0/2227201229 shutdown_connections 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.813+0000 7f107a657700 1 -- 192.168.123.104:0/2227201229 wait complete. 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 Processor -- start 2026-03-10T06:26:02.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 -- start start 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 0x7f10741982d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1074198ef0 con 0x7f1074108720 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.814+0000 7f107a657700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f107419cc80 con 0x7f1074102720 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.815 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41864/0 (socket says 192.168.123.104:41864) 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 -- 192.168.123.104:0/1547279886 learned_addr learned my addr 192.168.123.104:0/1547279886 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:02.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f1073fff700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 0x7f10741982d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 -- 192.168.123.104:0/1547279886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 msgr2=0x7f10741982d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:02.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 0x7f10741982d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:02.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 -- 192.168.123.104:0/1547279886 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10640097e0 con 0x7f1074108720 2026-03-10T06:26:02.818 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.815+0000 7f106bfff700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f105c00d8d0 tx=0x7f105c00dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.816+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f105c009880 con 0x7f1074108720 2026-03-10T06:26:02.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.816+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f105c010460 con 0x7f1074108720 2026-03-10T06:26:02.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.816+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f105c00f5d0 con 0x7f1074108720 2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.816+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f107419cf60 con 0x7f1074108720 2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.816+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f107419d4b0 con 0x7f1074108720 2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.817+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f105c00f730 con 0x7f1074108720 
2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.817+0000 7f1071ffb700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 0x7f1060079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.818+0000 7f1073fff700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 0x7f1060079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:02.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.818+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f107419d0f0 con 0x7f1074108720 2026-03-10T06:26:02.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.820+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f105c09a7e0 con 0x7f1074108720 2026-03-10T06:26:02.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.820+0000 7f1073fff700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 0x7f1060079e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f1064005fd0 tx=0x7f106400b580 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:02.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.822+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f107419d0f0 con 0x7f1074108720
2026-03-10T06:26:03.001 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:02.996+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f107419d0f0 con 0x7f1074108720
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.001+0000 7f1071ffb700 1 -- 192.168.123.104:0/1547279886 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f107419d0f0 con 0x7f1074108720
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout:{
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "mon": {
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": {
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "osd": {
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2,
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "mds": {
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: },
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "overall": {
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6,
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout: }
2026-03-10T06:26:03.002 INFO:teuthology.orchestra.run.vm04.stdout:}
2026-03-10T06:26:03.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.005+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 msgr2=0x7f1060079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:26:03.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.005+0000 7f107a657700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 0x7f1060079e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f1064005fd0 tx=0x7f106400b580 comp rx=0 tx=0).stop
2026-03-10T06:26:03.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.005+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 msgr2=0x7f1074198810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:26:03.005 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.005+0000 7f107a657700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f105c00d8d0 tx=0x7f105c00dbe0 comp rx=0 tx=0).stop
2026-03-10T06:26:03.006
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.005+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 shutdown_connections 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f10600779e0 0x7f1060079e90 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1074102720 0x7f10741982d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 --2- 192.168.123.104:0/1547279886 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1074108720 0x7f1074198810 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 >> 192.168.123.104:0/1547279886 conn(0x7f10740fe220 msgr2=0x7f10740ffa40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 shutdown_connections 2026-03-10T06:26:03.006 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.006+0000 7f107a657700 1 -- 192.168.123.104:0/1547279886 wait complete. 
2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 -- 192.168.123.104:0/3840375496 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 msgr2=0x7f13d8068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 --2- 192.168.123.104:0/3840375496 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8068900 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f13d4009b00 tx=0x7f13d4009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 -- 192.168.123.104:0/3840375496 shutdown_connections 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 --2- 192.168.123.104:0/3840375496 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8068900 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 --2- 192.168.123.104:0/3840375496 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 0x7f13d81067a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.080+0000 7f13dee2e700 1 -- 192.168.123.104:0/3840375496 >> 192.168.123.104:0/3840375496 conn(0x7f13d8075240 msgr2=0x7f13d8075640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.081+0000 7f13dee2e700 1 -- 192.168.123.104:0/3840375496 shutdown_connections 2026-03-10T06:26:03.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.081+0000 7f13dee2e700 1 -- 192.168.123.104:0/3840375496 
wait complete. 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.081+0000 7f13dee2e700 1 Processor -- start 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.081+0000 7f13dee2e700 1 -- start start 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dee2e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dde2c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dde2c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41872/0 (socket says 192.168.123.104:41872) 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dee2e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 0x7f13d8196520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dee2e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13d8196c00 con 0x7f13d8068490 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dee2e700 1 -- --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13d819a990 con 0x7f13d81063d0 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dde2c700 1 -- 192.168.123.104:0/2061902055 learned_addr learned my addr 192.168.123.104:0/2061902055 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.082+0000 7f13dd62b700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 0x7f13d8196520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dde2c700 1 -- 192.168.123.104:0/2061902055 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 msgr2=0x7f13d8196520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dde2c700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 0x7f13d8196520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dde2c700 1 -- 192.168.123.104:0/2061902055 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13d40097e0 con 0x7f13d8068490 2026-03-10T06:26:03.083 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dde2c700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f13cc00dc40 tx=0x7f13cc00df50 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13cc0098e0 con 0x7f13d8068490 2026-03-10T06:26:03.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f13cc010460 con 0x7f13d8068490 2026-03-10T06:26:03.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13cc00f5d0 con 0x7f13d8068490 2026-03-10T06:26:03.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f13d819ac70 con 0x7f13d8068490 2026-03-10T06:26:03.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.083+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f13d819b1c0 con 0x7f13d8068490 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.085+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f13cc00f730 con 0x7f13d8068490 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.085+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f13d804f2a0 con 0x7f13d8068490 2026-03-10T06:26:03.089 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.085+0000 7f13caffd700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 0x7f13c4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.085+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f13cc099870 con 0x7f13d8068490 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.086+0000 7f13dd62b700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 0x7f13c4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.087+0000 7f13dd62b700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 0x7f13c4079d20 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f13d8197600 tx=0x7f13d400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.089+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f13cc061a20 con 0x7f13d8068490 2026-03-10T06:26:03.239 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.238+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f13d804ea50 con 
0x7f13d8068490
2026-03-10T06:26:03.239 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.239+0000 7f13caffd700 1 -- 192.168.123.104:0/2061902055 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1944 (secure 0 0 0) 0x7f13cc061840 con 0x7f13d8068490
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:e11
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1)
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:19:55.449951+0000
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:root 0
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {}
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:in 0
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508}
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:failed
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:damaged
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:stopped
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3]
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:balancer
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:26:03.240 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T06:26:03.242 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 msgr2=0x7f13c4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:26:03.242 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 0x7f13c4079d20 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f13d8197600 tx=0x7f13d400b540 comp rx=0 tx=0).stop
2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 >>
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 msgr2=0x7f13d8195fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f13cc00dc40 tx=0x7f13cc00df50 comp rx=0 tx=0).stop 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 shutdown_connections 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f13c4077870 0x7f13c4079d20 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13d8068490 0x7f13d8195fe0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 --2- 192.168.123.104:0/2061902055 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13d81063d0 0x7f13d8196520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.242+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 >> 192.168.123.104:0/2061902055 conn(0x7f13d8075240 msgr2=0x7f13d80fea90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.243+0000 7f13dee2e700 1 -- 
192.168.123.104:0/2061902055 shutdown_connections 2026-03-10T06:26:03.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.243+0000 7f13dee2e700 1 -- 192.168.123.104:0/2061902055 wait complete. 2026-03-10T06:26:03.244 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:26:03.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: pgmap v82: 65 pgs: 18 active+undersized, 4 peering, 16 active+undersized+degraded, 27 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 54/261 objects degraded (20.690%) 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.320+0000 7f8d5cb81700 1 -- 192.168.123.104:0/1721582273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 msgr2=0x7f8d58068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.320+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/1721582273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58068900 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f8d48009b50 tx=0x7f8d48009e60 comp rx=0 tx=0).stop 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 -- 192.168.123.104:0/1721582273 shutdown_connections 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/1721582273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58068900 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/1721582273 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d58106a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 -- 192.168.123.104:0/1721582273 >> 192.168.123.104:0/1721582273 conn(0x7f8d580754a0 msgr2=0x7f8d580758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 -- 192.168.123.104:0/1721582273 shutdown_connections 2026-03-10T06:26:03.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.321+0000 7f8d5cb81700 1 -- 192.168.123.104:0/1721582273 wait complete. 2026-03-10T06:26:03.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 Processor -- start 2026-03-10T06:26:03.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 -- start start 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d581988c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d58198fa0 con 0x7f8d58068490 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5cb81700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d5819cd30 con 0x7f8d581066c0 2026-03-10T06:26:03.323 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41886/0 (socket says 192.168.123.104:41886) 2026-03-10T06:26:03.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.322+0000 7f8d5659c700 1 -- 192.168.123.104:0/590503799 learned_addr learned my addr 192.168.123.104:0/590503799 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d5659c700 1 -- 192.168.123.104:0/590503799 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 msgr2=0x7f8d581988c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d55d9b700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d581988c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d5659c700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d581988c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.324 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d5659c700 1 -- 192.168.123.104:0/590503799 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d480097e0 con 0x7f8d58068490 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d55d9b700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d581988c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.323+0000 7f8d5659c700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f8d4000eb10 tx=0x7f8d4000eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.324+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d4000cca0 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.324+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8d4000ce00 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.324+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d40018910 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.324+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8d5819d010 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.324+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8d5819d560 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.325+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8d40018a70 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.325+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8d5804ea50 con 0x7f8d58068490 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.328+0000 7f8d4f7fe700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 0x7f8d44079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.328+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f8d40014070 con 0x7f8d58068490 2026-03-10T06:26:03.330 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.329+0000 7f8d55d9b700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 0x7f8d44079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.330 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.329+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8d40063d10 con 0x7f8d58068490 2026-03-10T06:26:03.330 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.330+0000 7f8d55d9b700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 0x7f8d44079e90 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f8d4800b5c0 tx=0x7f8d48005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.469+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8d5819d840 con 0x7f8d440779e0 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.471+0000 7f8d4f7fe700 1 -- 192.168.123.104:0/590503799 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f8d5819d840 con 0x7f8d440779e0 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "mon", 2026-03-10T06:26:03.471 
INFO:teuthology.orchestra.run.vm04.stdout: "mgr", 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: "crash" 2026-03-10T06:26:03.471 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:26:03.472 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "10/23 daemons upgraded", 2026-03-10T06:26:03.472 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T06:26:03.472 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:26:03.472 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.472+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 msgr2=0x7f8d44079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.472+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 0x7f8d44079e90 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f8d4800b5c0 tx=0x7f8d48005fb0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.472+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 msgr2=0x7f8d58198380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.472+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f8d4000eb10 tx=0x7f8d4000eed0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 -- 
192.168.123.104:0/590503799 shutdown_connections 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8d440779e0 0x7f8d44079e90 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8d58068490 0x7f8d58198380 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 --2- 192.168.123.104:0/590503799 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d581066c0 0x7f8d581988c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 >> 192.168.123.104:0/590503799 conn(0x7f8d580754a0 msgr2=0x7f8d580feb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 shutdown_connections 2026-03-10T06:26:03.474 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.473+0000 7f8d5cb81700 1 -- 192.168.123.104:0/590503799 wait complete. 
2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 -- 192.168.123.104:0/4251834581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 msgr2=0x7f5944068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 --2- 192.168.123.104:0/4251834581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f5944068900 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f5934009b00 tx=0x7f5934009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 -- 192.168.123.104:0/4251834581 shutdown_connections 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 --2- 192.168.123.104:0/4251834581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f5944068900 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 --2- 192.168.123.104:0/4251834581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944106a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 -- 192.168.123.104:0/4251834581 >> 192.168.123.104:0/4251834581 conn(0x7f59440754a0 msgr2=0x7f59440758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 -- 192.168.123.104:0/4251834581 shutdown_connections 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.542+0000 7f594a824700 1 -- 192.168.123.104:0/4251834581 
wait complete. 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 Processor -- start 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 -- start start 2026-03-10T06:26:03.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944196720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5944196e00 con 0x7f5944068490 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f594a824700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f594419ab90 con 0x7f59441066c0 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f5943fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f59437fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944196720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f59437fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944196720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:46576/0 (socket says 192.168.123.104:46576) 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f59437fe700 1 -- 192.168.123.104:0/1018052890 learned_addr learned my addr 192.168.123.104:0/1018052890 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.543+0000 7f5943fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41910/0 (socket says 192.168.123.104:41910) 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f5943fff700 1 -- 192.168.123.104:0/1018052890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 msgr2=0x7f5944196720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f5943fff700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944196720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f5943fff700 1 -- 192.168.123.104:0/1018052890 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59340097e0 con 0x7f5944068490 
2026-03-10T06:26:03.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f5943fff700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f592c00b840 tx=0x7f592c00bb50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f592c00d610 con 0x7f5944068490 2026-03-10T06:26:03.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f592c00dc50 con 0x7f5944068490 2026-03-10T06:26:03.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f592c017400 con 0x7f5944068490 2026-03-10T06:26:03.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f594419ae70 con 0x7f5944068490 2026-03-10T06:26:03.546 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.544+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f594419b3c0 con 0x7f5944068490 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.546+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f592c00d770 con 
0x7f5944068490 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.546+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f594404ea50 con 0x7f5944068490 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.546+0000 7f59417fa700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 0x7f5930079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.546+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f592c099240 con 0x7f5944068490 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.547+0000 7f59437fe700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 0x7f5930079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.548+0000 7f59437fe700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 0x7f5930079d20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f5944197800 tx=0x7f593400b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:03.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.550+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7f592c061ac0 con 0x7f5944068490 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: pgmap v82: 65 pgs: 18 active+undersized, 4 peering, 16 active+undersized+degraded, 27 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 54/261 objects degraded (20.690%) 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='client.34212 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-10T06:26:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='client.34216 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1547279886' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:03 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/2061902055' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='client.34212 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (18 PGs are or would become offline) 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='client.34216 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1547279886' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:03.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:03 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/2061902055' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.720+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f5944066e40 con 0x7f5944068490 2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.721+0000 7f59417fa700 1 -- 192.168.123.104:0/1018052890 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1346 (secure 0 0 0) 0x7f592c061ac0 con 0x7f5944068490 2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Degraded data redundancy: 54/261 objects degraded (20.690%), 16 pgs degraded 2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:26:03.721 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: osd.3 (root=default,host=vm06) is down 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 54/261 objects degraded (20.690%), 16 pgs degraded 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 1.0 is active+undersized+degraded, acting [0,1] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.0 is active+undersized+degraded, acting [1,0] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.5 is active+undersized+degraded, acting [0,4] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.6 is active+undersized+degraded, acting [1,4] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.7 is active+undersized+degraded, acting [4,2] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.8 is active+undersized+degraded, acting [5,0] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.a is active+undersized+degraded, acting [1,4] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.b is active+undersized+degraded, acting [4,5] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.d is active+undersized+degraded, acting [1,2] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.14 is active+undersized+degraded, acting [4,5] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.15 is active+undersized+degraded, acting [1,0] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.16 is active+undersized+degraded, acting [5,2] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.18 is active+undersized+degraded, acting [5,4] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.1a is 
active+undersized+degraded, acting [4,5] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.1d is active+undersized+degraded, acting [5,0] 2026-03-10T06:26:03.722 INFO:teuthology.orchestra.run.vm04.stdout: pg 2.1f is active+undersized+degraded, acting [0,4] 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.723+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 msgr2=0x7f5930079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.723+0000 7f594a824700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 0x7f5930079d20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f5944197800 tx=0x7f593400b560 comp rx=0 tx=0).stop 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.723+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 msgr2=0x7f59441961e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.723+0000 7f594a824700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f592c00b840 tx=0x7f592c00bb50 comp rx=0 tx=0).stop 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 shutdown_connections 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5930077870 
0x7f5930079d20 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5944068490 0x7f59441961e0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 --2- 192.168.123.104:0/1018052890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f59441066c0 0x7f5944196720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 >> 192.168.123.104:0/1018052890 conn(0x7f59440754a0 msgr2=0x7f59440fecb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:03.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 shutdown_connections 2026-03-10T06:26:03.725 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:03.724+0000 7f594a824700 1 -- 192.168.123.104:0/1018052890 wait complete. 2026-03-10T06:26:04.611 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-mon[98962]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:04.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-mon[98962]: from='client.34230 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:04.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1018052890' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:26:04.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:04.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:04.612 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:26:04.317+0000 7f2f70591740 -1 osd.3 0 read_superblock omap replica is missing. 2026-03-10T06:26:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:04 vm04.local ceph-mon[115743]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:04 vm04.local ceph-mon[115743]: from='client.34230 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:04 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/1018052890' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:26:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:04.867 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:26:04 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:26:04.611+0000 7f2f70591740 -1 osd.3 61 log_to_monitors true 2026-03-10T06:26:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:05 vm06.local ceph-mon[98962]: pgmap v83: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T06:26:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:05 vm06.local ceph-mon[98962]: from='osd.3 [v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:26:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:05 vm06.local ceph-mon[98962]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:26:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:05 vm04.local ceph-mon[115743]: pgmap v83: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T06:26:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:05 vm04.local ceph-mon[115743]: from='osd.3 
[v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:26:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:05 vm04.local ceph-mon[115743]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T06:26:06.617 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:26:06.191+0000 7f2f67b2a640 -1 osd.3 61 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:26:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-mon[98962]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T06:26:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-mon[98962]: osdmap e64: 6 total, 5 up, 6 in 2026-03-10T06:26:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-mon[98962]: from='osd.3 [v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-mon[98962]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:06.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:06 vm06.local ceph-mon[98962]: from='osd.3 ' entity='osd.3' 2026-03-10T06:26:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:06 vm04.local ceph-mon[115743]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": 
"hdd", "ids": ["3"]}]': finished 2026-03-10T06:26:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:06 vm04.local ceph-mon[115743]: osdmap e64: 6 total, 5 up, 6 in 2026-03-10T06:26:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:06 vm04.local ceph-mon[115743]: from='osd.3 [v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:06 vm04.local ceph-mon[115743]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:06.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:06 vm04.local ceph-mon[115743]: from='osd.3 ' entity='osd.3' 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 vm06.local ceph-mon[98962]: pgmap v85: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 58/261 objects degraded (22.222%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 vm06.local ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 vm06.local ceph-mon[98962]: osd.3 [v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012] boot 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 vm06.local ceph-mon[98962]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T06:26:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:07 
vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:26:07.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: pgmap v85: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T06:26:07.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 58/261 objects degraded (22.222%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:07.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: osd.3 [v2:192.168.123.106:6800/3185320012,v1:192.168.123.106:6801/3185320012] boot 2026-03-10T06:26:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T06:26:07.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T06:26:09.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:09 vm06.local ceph-mon[98962]: pgmap v87: 65 pgs: 3 peering, 19 active+undersized, 16 active+undersized+degraded, 27 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 54/261 objects degraded (20.690%) 2026-03-10T06:26:09.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:09 vm06.local ceph-mon[98962]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T06:26:09.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:09 vm04.local ceph-mon[115743]: pgmap v87: 65 pgs: 3 peering, 19 active+undersized, 16 
active+undersized+degraded, 27 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 54/261 objects degraded (20.690%) 2026-03-10T06:26:09.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:09 vm04.local ceph-mon[115743]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T06:26:11.574 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:11 vm04.local ceph-mon[115743]: pgmap v89: 65 pgs: 3 peering, 16 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 2 op/s; 51/261 objects degraded (19.540%) 2026-03-10T06:26:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:11 vm06.local ceph-mon[98962]: pgmap v89: 65 pgs: 3 peering, 16 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 2 op/s; 51/261 objects degraded (19.540%) 2026-03-10T06:26:12.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:12 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 51/261 objects degraded (19.540%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:12.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:12 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 51/261 objects degraded (19.540%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:13.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:13 vm04.local ceph-mon[115743]: pgmap v90: 65 pgs: 3 peering, 62 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 612 B/s rd, 1 op/s 2026-03-10T06:26:13.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:13 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 51/261 objects degraded (19.540%), 15 pgs degraded) 2026-03-10T06:26:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:13 vm06.local ceph-mon[98962]: pgmap v90: 65 pgs: 3 peering, 62 active+clean; 252 MiB data, 2.1 GiB used, 
118 GiB / 120 GiB avail; 612 B/s rd, 1 op/s 2026-03-10T06:26:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:13 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 51/261 objects degraded (19.540%), 15 pgs degraded) 2026-03-10T06:26:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:15 vm06.local ceph-mon[98962]: pgmap v91: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 511 B/s rd, 0 op/s 2026-03-10T06:26:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:15 vm04.local ceph-mon[115743]: pgmap v91: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 511 B/s rd, 0 op/s 2026-03-10T06:26:17.609 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:17 vm06.local ceph-mon[98962]: pgmap v92: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.0 KiB/s rd, 2 op/s 2026-03-10T06:26:17.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:17 vm04.local ceph-mon[115743]: pgmap v92: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 1.0 KiB/s rd, 2 op/s 2026-03-10T06:26:18.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:18.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:18.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: Upgrade: osd.4 is safe to restart 2026-03-10T06:26:18.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:18.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:26:18.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:18.509 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:18.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T06:26:18.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: Upgrade: osd.4 is safe to restart 2026-03-10T06:26:18.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:18.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T06:26:18.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:18.867 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:18 vm06.local systemd[1]: Stopping Ceph osd.4 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:26:18.868 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:26:18.625+0000 7f9b7d505700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:26:18.868 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:26:18.625+0000 7f9b7d505700 -1 osd.4 66 *** Got signal Terminated *** 2026-03-10T06:26:18.868 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:18 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[70536]: 2026-03-10T06:26:18.625+0000 7f9b7d505700 -1 osd.4 66 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:26:19.529 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110749]: 2026-03-10 06:26:19.366315411 +0000 UTC m=+0.759495960 container died dcd395dfe2206f9a24dded22dd93b67536a3236b7a18ad9f7484849c01f5edef (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, io.buildah.version=1.29.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD) 2026-03-10T06:26:19.529 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110749]: 2026-03-10 06:26:19.38301964 +0000 UTC m=+0.776200189 container remove dcd395dfe2206f9a24dded22dd93b67536a3236b7a18ad9f7484849c01f5edef 
(image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T06:26:19.529 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local bash[110749]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: Upgrade: Updating osd.4 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: Deploying daemon osd.4 on vm06 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: pgmap v93: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: osd.4 marked itself down and dead 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:19.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: Upgrade: Updating osd.4 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: Deploying daemon osd.4 on vm06 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: pgmap v93: 65 pgs: 65 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: osd.4 marked itself down and dead 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.529219931 +0000 UTC m=+0.016363782 container create 79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, 
org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.57036804 +0000 UTC m=+0.057511891 container init 79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.573849877 +0000 UTC m=+0.060993728 container start 79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, 
OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.576665327 +0000 UTC m=+0.063809178 container attach 79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.522173191 +0000 UTC m=+0.009317051 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.700221992 +0000 UTC m=+0.187365852 container died 
79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local podman[110814]: 2026-03-10 06:26:19.763997035 +0000 UTC m=+0.251140876 container remove 79a93e18f608e83fd9a9d7026dece740151648c4d41bc8804c39d4444ef20302 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 
vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service: Deactivated successfully. 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service: Unit process 110824 (conmon) remains running after unit stopped. 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service: Unit process 110833 (podman) remains running after unit stopped. 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local systemd[1]: Stopped Ceph osd.4 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:26:19.793 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service: Consumed 34.438s CPU time, 742.5M memory peak. 2026-03-10T06:26:20.073 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:19 vm06.local systemd[1]: Starting Ceph osd.4 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:26:20.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:26:20.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-mon[98962]: osdmap e67: 6 total, 5 up, 6 in 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local podman[110915]: 2026-03-10 06:26:20.072867519 +0000 UTC m=+0.017110441 container create 8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local podman[110915]: 2026-03-10 06:26:20.113884181 +0000 UTC m=+0.058127113 container init 8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS) 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local podman[110915]: 2026-03-10 06:26:20.117015943 +0000 UTC m=+0.061258865 container start 8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223) 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local podman[110915]: 2026-03-10 06:26:20.119642099 +0000 UTC m=+0.063885021 container attach 8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local podman[110915]: 2026-03-10 06:26:20.066098658 +0000 UTC m=+0.010341590 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:20.367 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:20.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:20 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:26:20.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:20 vm04.local ceph-mon[115743]: osdmap e67: 6 total, 5 up, 6 in 2026-03-10T06:26:21.016 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6f2d3201-1917-45b8-8139-cca9d564f876/osd-block-0e68b450-e783-4ec6-99f7-0610ac3453d1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:20 vm06.local bash[110915]: Running command: /usr/bin/ceph-bluestore-tool 
--cluster=ceph prime-osd-dir --dev /dev/ceph-6f2d3201-1917-45b8-8139-cca9d564f876/osd-block-0e68b450-e783-4ec6-99f7-0610ac3453d1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T06:26:21.017 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/ln -snf /dev/ceph-6f2d3201-1917-45b8-8139-cca9d564f876/osd-block-0e68b450-e783-4ec6-99f7-0610ac3453d1 /var/lib/ceph/osd/ceph-4/block 2026-03-10T06:26:21.278 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: pgmap v95: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-10T06:26:21.278 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[110915]: Running command: /usr/bin/ln -snf /dev/ceph-6f2d3201-1917-45b8-8139-cca9d564f876/osd-block-0e68b450-e783-4ec6-99f7-0610ac3453d1 /var/lib/ceph/osd/ceph-4/block 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[110915]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[110915]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 
2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[110915]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate[110926]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[110915]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local podman[110915]: 2026-03-10 06:26:21.043582752 +0000 UTC m=+0.987825664 container died 8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T06:26:21.279 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local podman[110915]: 2026-03-10 06:26:21.067217505 +0000 UTC m=+1.011460417 container remove 
8d2829d94022c93ad1df8dda42b3479d9c902e19f7ca38c37821bb11e20efa2c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local podman[111169]: 2026-03-10 06:26:21.154162863 +0000 UTC m=+0.015631622 container create fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid) 2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local 
podman[111169]: 2026-03-10 06:26:21.193935066 +0000 UTC m=+0.055403834 container init fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local podman[111169]: 2026-03-10 06:26:21.19660819 +0000 UTC m=+0.058076947 container start fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 
2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local bash[111169]: fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local podman[111169]: 2026-03-10 06:26:21.14795931 +0000 UTC m=+0.009428089 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:21.280 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:21 vm06.local systemd[1]: Started Ceph osd.4 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:26:21.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: osdmap e68: 6 total, 5 up, 6 in 2026-03-10T06:26:21.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:21.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:21.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:21 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: pgmap v95: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 252 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: osdmap e68: 6 
total, 5 up, 6 in 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:21 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:22.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:22 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:26:22.031+0000 7f719ad01740 -1 Falling back to public interface 2026-03-10T06:26:23.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: pgmap v97: 65 pgs: 13 active+undersized, 8 peering, 1 stale+active+clean, 11 active+undersized+degraded, 32 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 35/261 objects degraded (13.410%) 2026-03-10T06:26:23.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 35/261 objects degraded (13.410%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:23.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:23 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: pgmap v97: 65 pgs: 13 active+undersized, 8 peering, 1 stale+active+clean, 11 active+undersized+degraded, 32 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 35/261 objects degraded (13.410%) 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 35/261 objects degraded (13.410%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:23.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:23 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:24 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:25.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:24 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:26.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:25 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 
2026-03-10T06:26:25.739+0000 7f719ad01740 -1 osd.4 0 read_superblock omap replica is missing. 2026-03-10T06:26:26.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:25 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:26:25.947+0000 7f719ad01740 -1 osd.4 66 log_to_monitors true 2026-03-10T06:26:26.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:25 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:26.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:25 vm06.local ceph-mon[98962]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline) 2026-03-10T06:26:26.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:25 vm06.local ceph-mon[98962]: pgmap v98: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/261 objects degraded (13.410%) 2026-03-10T06:26:26.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:25 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:26.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:25 vm04.local ceph-mon[115743]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline) 2026-03-10T06:26:26.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:25 vm04.local ceph-mon[115743]: pgmap v98: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/261 objects degraded (13.410%) 2026-03-10T06:26:27.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:26:26 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:26:26.757+0000 7f7192a9b640 -1 osd.4 66 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:26:27.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:26 vm06.local ceph-mon[98962]: from='osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T06:26:27.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:26 vm06.local ceph-mon[98962]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T06:26:27.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:26 vm06.local ceph-mon[98962]: pgmap v99: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 45/261 objects degraded (17.241%) 2026-03-10T06:26:27.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:26 vm04.local ceph-mon[115743]: from='osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 
2026-03-10T06:26:27.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:26 vm04.local ceph-mon[115743]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T06:26:27.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:26 vm04.local ceph-mon[115743]: pgmap v99: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 45/261 objects degraded (17.241%) 2026-03-10T06:26:28.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:27 vm06.local ceph-mon[98962]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-10T06:26:28.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:27 vm06.local ceph-mon[98962]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T06:26:28.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:27 vm06.local ceph-mon[98962]: osdmap e69: 6 total, 5 up, 6 in 2026-03-10T06:26:28.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:27 vm06.local ceph-mon[98962]: from='osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:28.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:27 vm06.local ceph-mon[98962]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:28.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:27 vm04.local ceph-mon[115743]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-10T06:26:28.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:27 
vm04.local ceph-mon[115743]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T06:26:28.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:27 vm04.local ceph-mon[115743]: osdmap e69: 6 total, 5 up, 6 in 2026-03-10T06:26:28.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:27 vm04.local ceph-mon[115743]: from='osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:28.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:27 vm04.local ceph-mon[115743]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:28.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:28 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:28.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:28 vm04.local ceph-mon[115743]: osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193] boot 2026-03-10T06:26:28.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:28 vm04.local ceph-mon[115743]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T06:26:28.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:28 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:26:28.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:28 vm04.local ceph-mon[115743]: pgmap v102: 65 pgs: 1 peering, 18 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 45/261 objects degraded (17.241%) 2026-03-10T06:26:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
10 06:26:28 vm06.local ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:28 vm06.local ceph-mon[98962]: osd.4 [v2:192.168.123.106:6808/2382479193,v1:192.168.123.106:6809/2382479193] boot 2026-03-10T06:26:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:28 vm06.local ceph-mon[98962]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T06:26:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:28 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T06:26:29.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:28 vm06.local ceph-mon[98962]: pgmap v102: 65 pgs: 1 peering, 18 active+undersized, 15 active+undersized+degraded, 31 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 45/261 objects degraded (17.241%) 2026-03-10T06:26:30.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:29 vm06.local ceph-mon[98962]: osdmap e71: 6 total, 6 up, 6 in 2026-03-10T06:26:30.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:29 vm04.local ceph-mon[115743]: osdmap e71: 6 total, 6 up, 6 in 2026-03-10T06:26:31.078 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:30 vm04.local ceph-mon[115743]: pgmap v104: 65 pgs: 1 peering, 15 active+undersized, 12 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s; 38/261 objects degraded (14.559%) 2026-03-10T06:26:31.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:30 vm06.local ceph-mon[98962]: pgmap v104: 65 pgs: 1 peering, 15 active+undersized, 12 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s; 38/261 objects degraded (14.559%) 2026-03-10T06:26:32.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:31 vm06.local 
ceph-mon[98962]: Health check update: Degraded data redundancy: 38/261 objects degraded (14.559%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:32.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:31 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 38/261 objects degraded (14.559%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:33.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:32 vm06.local ceph-mon[98962]: pgmap v105: 65 pgs: 1 peering, 64 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 1 op/s 2026-03-10T06:26:33.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:32 vm04.local ceph-mon[115743]: pgmap v105: 65 pgs: 1 peering, 64 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 1 op/s 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.802+0000 7f77ab263700 1 -- 192.168.123.104:0/4232259151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 msgr2=0x7f77a410c940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.802+0000 7f77ab263700 1 --2- 192.168.123.104:0/4232259151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a410c940 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f7798009b50 tx=0x7f7798009e60 comp rx=0 tx=0).stop 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 -- 192.168.123.104:0/4232259151 shutdown_connections 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 --2- 192.168.123.104:0/4232259151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a410c940 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.804 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 --2- 192.168.123.104:0/4232259151 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 0x7f77a40ff730 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 -- 192.168.123.104:0/4232259151 >> 192.168.123.104:0/4232259151 conn(0x7f77a40762b0 msgr2=0x7f77a40766b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 -- 192.168.123.104:0/4232259151 shutdown_connections 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.803+0000 7f77ab263700 1 -- 192.168.123.104:0/4232259151 wait complete. 2026-03-10T06:26:33.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.804+0000 7f77ab263700 1 Processor -- start 2026-03-10T06:26:33.805 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.804+0000 7f77ab263700 1 -- start start 2026-03-10T06:26:33.805 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77ab263700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 0x7f77a4198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:33.805 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77ab263700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:33.805 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77a8fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 0x7f77a4198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77a3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77a3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:40940/0 (socket says 192.168.123.104:40940) 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77a3fff700 1 -- 192.168.123.104:0/2493812817 learned_addr learned my addr 192.168.123.104:0/2493812817 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77a4199180 con 0x7f77a40ffd00 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.805+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77a419cf10 con 0x7f77a40ff360 2026-03-10T06:26:33.806 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.806+0000 7f77a3fff700 1 -- 192.168.123.104:0/2493812817 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 msgr2=0x7f77a4198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:33.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.806+0000 7f77a3fff700 1 --2- 
192.168.123.104:0/2493812817 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 0x7f77a4198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.806+0000 7f77a3fff700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77980097e0 con 0x7f77a40ffd00 2026-03-10T06:26:33.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77a3fff700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7798009b20 tx=0x7f7798004c10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:33.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f779801d070 con 0x7f77a40ffd00 2026-03-10T06:26:33.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f779800bc90 con 0x7f77a40ffd00 2026-03-10T06:26:33.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f77980217b0 con 0x7f77a40ffd00 2026-03-10T06:26:33.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77a419d190 con 0x7f77a40ffd00 2026-03-10T06:26:33.809 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.807+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77a419d680 con 0x7f77a40ffd00 2026-03-10T06:26:33.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.810+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7798021910 con 0x7f77a40ffd00 2026-03-10T06:26:33.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.810+0000 7f77a1ffb700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 0x7f7794079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:33.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.810+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f779809be30 con 0x7f77a40ffd00 2026-03-10T06:26:33.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.811+0000 7f77a8fff700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 0x7f7794079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:33.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.811+0000 7f77a8fff700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 0x7f7794079d70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f7790006fd0 tx=0x7f7790009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:33.812 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.812+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f77a410a0b0 con 0x7f77a40ffd00 2026-03-10T06:26:33.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.815+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7798064760 con 0x7f77a40ffd00 2026-03-10T06:26:33.942 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.941+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f77a4066e40 con 0x7f77940778c0 2026-03-10T06:26:33.943 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.943+0000 7f77a1ffb700 1 -- 192.168.123.104:0/2493812817 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f77a4066e40 con 0x7f77940778c0 2026-03-10T06:26:33.946 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 msgr2=0x7f7794079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:33.946 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 0x7f7794079d70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f7790006fd0 tx=0x7f7790009380 comp rx=0 tx=0).stop 2026-03-10T06:26:33.946 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 
7f77ab263700 1 -- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 msgr2=0x7f77a4198aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:33.946 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7798009b20 tx=0x7f7798004c10 comp rx=0 tx=0).stop 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 shutdown_connections 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f77940778c0 0x7f7794079d70 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f77a40ff360 0x7f77a4198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 --2- 192.168.123.104:0/2493812817 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f77a40ffd00 0x7f77a4198aa0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.946+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 >> 192.168.123.104:0/2493812817 conn(0x7f77a40762b0 msgr2=0x7f77a40fd970 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:33.947 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.947+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 shutdown_connections 2026-03-10T06:26:33.947 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:33.947+0000 7f77ab263700 1 -- 192.168.123.104:0/2493812817 wait complete. 2026-03-10T06:26:33.956 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.019+0000 7f566a5df700 1 -- 192.168.123.104:0/2357263862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 msgr2=0x7f56641051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.019+0000 7f566a5df700 1 --2- 192.168.123.104:0/2357263862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f56641051e0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f5654009b50 tx=0x7f5654009e60 comp rx=0 tx=0).stop 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 -- 192.168.123.104:0/2357263862 shutdown_connections 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 --2- 192.168.123.104:0/2357263862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f56641051e0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 --2- 192.168.123.104:0/2357263862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 -- 192.168.123.104:0/2357263862 >> 192.168.123.104:0/2357263862 conn(0x7f56640754a0 
msgr2=0x7f56640758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 -- 192.168.123.104:0/2357263862 shutdown_connections 2026-03-10T06:26:34.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.020+0000 7f566a5df700 1 -- 192.168.123.104:0/2357263862 wait complete. 2026-03-10T06:26:34.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 Processor -- start 2026-03-10T06:26:34.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 -- start start 2026-03-10T06:26:34.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f5664198980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5664199060 con 0x7f5664069000 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f566a5df700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f566419cdf0 con 0x7f56640686f0 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f5663fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f5663fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:43014/0 (socket says 192.168.123.104:43014) 2026-03-10T06:26:34.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.021+0000 7f5663fff700 1 -- 192.168.123.104:0/1312409965 learned_addr learned my addr 192.168.123.104:0/1312409965 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f5663fff700 1 -- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 msgr2=0x7f5664198980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f56637fe700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f5664198980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f5663fff700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f5664198980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f5663fff700 1 -- 192.168.123.104:0/1312409965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56540097e0 con 0x7f56640686f0 
2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f5663fff700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f564c00b700 tx=0x7f564c00bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.022+0000 7f56637fe700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f5664198980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.023+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f564c010840 con 0x7f56640686f0 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.023+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f566419d0d0 con 0x7f56640686f0 2026-03-10T06:26:34.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.023+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f566419d620 con 0x7f56640686f0 2026-03-10T06:26:34.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.023+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f564c010e80 con 0x7f56640686f0 2026-03-10T06:26:34.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.023+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 
v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f564c00d590 con 0x7f56640686f0 2026-03-10T06:26:34.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.024+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f566404ea50 con 0x7f56640686f0 2026-03-10T06:26:34.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.025+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f564c00f3e0 con 0x7f56640686f0 2026-03-10T06:26:34.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.025+0000 7f56617fa700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 0x7f5650079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.025+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f564c099400 con 0x7f56640686f0 2026-03-10T06:26:34.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.026+0000 7f56637fe700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 0x7f5650079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.027 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.026+0000 7f56637fe700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 0x7f5650079d90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto 
rx=0x7f565400b5c0 tx=0x7f5654005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.028 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.028+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f564c061c00 con 0x7f56640686f0 2026-03-10T06:26:34.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:33 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 38/261 objects degraded (14.559%), 12 pgs degraded) 2026-03-10T06:26:34.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:33 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 38/261 objects degraded (14.559%), 12 pgs degraded) 2026-03-10T06:26:34.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.178+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f566419d900 con 0x7f56500778e0 2026-03-10T06:26:34.183 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.182+0000 7f56617fa700 1 -- 192.168.123.104:0/1312409965 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f566419d900 con 0x7f56500778e0 2026-03-10T06:26:34.185 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 msgr2=0x7f5650079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.185 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 --2- 192.168.123.104:0/1312409965 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 0x7f5650079d90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f565400b5c0 tx=0x7f5654005fb0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.185 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 msgr2=0x7f5664198440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.185 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f564c00b700 tx=0x7f564c00bac0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 shutdown_connections 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f56500778e0 0x7f5650079d90 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56640686f0 0x7f5664198440 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 --2- 192.168.123.104:0/1312409965 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5664069000 0x7f5664198980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.185+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 >> 192.168.123.104:0/1312409965 conn(0x7f56640754a0 msgr2=0x7f56641037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.186+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 shutdown_connections 2026-03-10T06:26:34.186 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.186+0000 7f566a5df700 1 -- 192.168.123.104:0/1312409965 wait complete. 2026-03-10T06:26:34.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 -- 192.168.123.104:0/1023716726 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd820068490 msgr2=0x7fd820068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 --2- 192.168.123.104:0/1023716726 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd820068490 0x7fd820068900 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fd808009b00 tx=0x7fd808009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 -- 192.168.123.104:0/1023716726 shutdown_connections 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 --2- 192.168.123.104:0/1023716726 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd820068490 0x7fd820068900 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 --2- 192.168.123.104:0/1023716726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8201013a0 0x7fd820101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 -- 192.168.123.104:0/1023716726 >> 192.168.123.104:0/1023716726 conn(0x7fd8200754a0 msgr2=0x7fd8200758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 -- 192.168.123.104:0/1023716726 shutdown_connections 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.257+0000 7fd8267d2700 1 -- 192.168.123.104:0/1023716726 wait complete. 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 Processor -- start 2026-03-10T06:26:34.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 -- start start 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd820068490 0x7fd820198410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 0x7fd820198950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd820199030 con 0x7fd8201013a0 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd8267d2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd82019cdc0 con 0x7fd820068490 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd817fff700 
1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 0x7fd820198950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd817fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 0x7fd820198950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:40984/0 (socket says 192.168.123.104:40984) 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.258+0000 7fd817fff700 1 -- 192.168.123.104:0/2687779192 learned_addr learned my addr 192.168.123.104:0/2687779192 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd81ffff700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd820068490 0x7fd820198410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd817fff700 1 -- 192.168.123.104:0/2687779192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd820068490 msgr2=0x7fd820198410 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd817fff700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd820068490 0x7fd820198410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd817fff700 1 
-- 192.168.123.104:0/2687779192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8080097e0 con 0x7fd8201013a0 2026-03-10T06:26:34.260 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd817fff700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 0x7fd820198950 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fd8080048c0 tx=0x7fd8080049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.260 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd80801d070 con 0x7fd8201013a0 2026-03-10T06:26:34.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd80800bc50 con 0x7fd8201013a0 2026-03-10T06:26:34.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd80800f670 con 0x7fd8201013a0 2026-03-10T06:26:34.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd82019d040 con 0x7fd8201013a0 2026-03-10T06:26:34.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.259+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd82019d5b0 con 0x7fd8201013a0 2026-03-10T06:26:34.263 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.261+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd82004ea50 con 0x7fd8201013a0 2026-03-10T06:26:34.264 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.263+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd8080229e0 con 0x7fd8201013a0 2026-03-10T06:26:34.264 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.264+0000 7fd81dffb700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 0x7fd800079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.264 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.264+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fd80809bcd0 con 0x7fd8201013a0 2026-03-10T06:26:34.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.264+0000 7fd81ffff700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 0x7fd800079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.265+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd8080645d0 con 0x7fd8201013a0 2026-03-10T06:26:34.265 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.265+0000 7fd81ffff700 1 --2- 
192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 0x7fd800079e40 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fd810006fd0 tx=0x7fd810008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.384+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd82019d900 con 0x7fd800077990 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.389+0000 7fd81dffb700 1 -- 192.168.123.104:0/2687779192 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fd82019d900 con 0x7fd800077990 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (3m) 54s ago 9m 24.2M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (9m) 54s ago 9m 9336k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (8m) 12s ago 8m 11.7M - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (2m) 54s ago 9m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (2m) 12s ago 8m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (3m) 54s ago 8m 
89.9M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (6m) 54s ago 6m 180M - 18.2.0 dc2bc1663786 342935a5b39a 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (6m) 54s ago 6m 18.2M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (6m) 12s ago 6m 18.5M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:26:34.390 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (6m) 12s ago 6m 95.4M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (4m) 54s ago 9m 609M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (4m) 12s ago 8m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (2m) 54s ago 9m 61.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (2m) 12s ago 8m 55.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (3m) 54s ago 9m 10.4M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 12s ago 8m 9726k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (116s) 54s ago 7m 182M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (77s) 54s ago 7m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 
6bc3525fe6f5 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (56s) 54s ago 7m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 38220ba83a3f 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (34s) 12s ago 7m 157M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e91f44e1f660 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (13s) 12s ago 7m 14.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fea6c31251ba 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (7m) 12s ago 7m 336M 4096M 18.2.0 dc2bc1663786 862da087fc06 2026-03-10T06:26:34.391 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (3m) 54s ago 8m 56.2M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 msgr2=0x7fd800079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 0x7fd800079e40 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fd810006fd0 tx=0x7fd810008040 comp rx=0 tx=0).stop 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 msgr2=0x7fd820198950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 
0x7fd820198950 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fd8080048c0 tx=0x7fd8080049a0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 shutdown_connections 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fd800077990 0x7fd800079e40 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd820068490 0x7fd820198410 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 --2- 192.168.123.104:0/2687779192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd8201013a0 0x7fd820198950 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 >> 192.168.123.104:0/2687779192 conn(0x7fd8200754a0 msgr2=0x7fd8200fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.393+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 shutdown_connections 2026-03-10T06:26:34.394 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.394+0000 7fd8267d2700 1 -- 192.168.123.104:0/2687779192 wait complete. 
2026-03-10T06:26:34.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.466+0000 7f542d1f0700 1 -- 192.168.123.104:0/2993651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 msgr2=0x7f5428068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.466+0000 7f542d1f0700 1 --2- 192.168.123.104:0/2993651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428068ac0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f5410009b50 tx=0x7f5410009e60 comp rx=0 tx=0).stop 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.469+0000 7f542d1f0700 1 -- 192.168.123.104:0/2993651434 shutdown_connections 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.469+0000 7f542d1f0700 1 --2- 192.168.123.104:0/2993651434 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.469+0000 7f542d1f0700 1 --2- 192.168.123.104:0/2993651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428068ac0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.469+0000 7f542d1f0700 1 -- 192.168.123.104:0/2993651434 >> 192.168.123.104:0/2993651434 conn(0x7f54280754a0 msgr2=0x7f54280758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.470+0000 7f542d1f0700 1 -- 192.168.123.104:0/2993651434 shutdown_connections 2026-03-10T06:26:34.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.470+0000 7f542d1f0700 1 -- 192.168.123.104:0/2993651434 
wait complete. 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.470+0000 7f542d1f0700 1 Processor -- start 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.470+0000 7f542d1f0700 1 -- start start 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542d1f0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428194060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542d1f0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542d1f0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5428194c80 con 0x7f54280686f0 2026-03-10T06:26:34.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542d1f0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5428198a10 con 0x7f5428069000 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f5426d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428194060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:43062/0 (socket says 192.168.123.104:43062) 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.471+0000 7f542659c700 1 -- 192.168.123.104:0/129659230 learned_addr learned my addr 192.168.123.104:0/129659230 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542659c700 1 -- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 msgr2=0x7f5428194060 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542659c700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428194060 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542659c700 1 -- 192.168.123.104:0/129659230 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54100097e0 con 0x7f5428069000 2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f5426d9d700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428194060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T06:26:34.472 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542659c700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f541800d8d0 tx=0x7f541800dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5418009880 con 0x7f5428069000 2026-03-10T06:26:34.473 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5428198cf0 con 0x7f5428069000 2026-03-10T06:26:34.474 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.472+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5428199240 con 0x7f5428069000 2026-03-10T06:26:34.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.473+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5418010460 con 0x7f5428069000 2026-03-10T06:26:34.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.473+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f541800f5d0 con 0x7f5428069000 2026-03-10T06:26:34.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.474+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f541800f730 con 
0x7f5428069000 2026-03-10T06:26:34.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.474+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5428108b30 con 0x7f5428069000 2026-03-10T06:26:34.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.474+0000 7f541ffff700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 0x7f541407e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.474+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f54180997e0 con 0x7f5428069000 2026-03-10T06:26:34.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.475+0000 7f5426d9d700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 0x7f541407e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.476 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.475+0000 7f5426d9d700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 0x7f541407e130 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f5410005b40 tx=0x7f5410005a90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.478 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.478+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f5418063120 con 0x7f5428069000 2026-03-10T06:26:34.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.652+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f542804ea50 con 0x7f5428069000 2026-03-10T06:26:34.653 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.653+0000 7f541ffff700 1 -- 192.168.123.104:0/129659230 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f5418062870 con 0x7f5428069000 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) 
reef (stable)": 4 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 9 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:26:34.654 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.655+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 msgr2=0x7f541407e130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.655+0000 7f542d1f0700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 0x7f541407e130 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f5410005b40 tx=0x7f5410005a90 comp rx=0 tx=0).stop 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.655+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 msgr2=0x7f54281945a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f541800d8d0 tx=0x7f541800dbe0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.656 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 shutdown_connections 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f541407bc80 0x7f541407e130 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54280686f0 0x7f5428194060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 --2- 192.168.123.104:0/129659230 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5428069000 0x7f54281945a0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 >> 192.168.123.104:0/129659230 conn(0x7f54280754a0 msgr2=0x7f54280fe9b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 shutdown_connections 2026-03-10T06:26:34.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.656+0000 7f542d1f0700 1 -- 192.168.123.104:0/129659230 wait complete. 
2026-03-10T06:26:34.730 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.729+0000 7fcc329cf700 1 -- 192.168.123.104:0/3758865634 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c102750 msgr2=0x7fcc2c102bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.731 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.729+0000 7fcc329cf700 1 --2- 192.168.123.104:0/3758865634 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c102750 0x7fcc2c102bc0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fcc1c009b00 tx=0x7fcc1c009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:34.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.731+0000 7fcc329cf700 1 -- 192.168.123.104:0/3758865634 shutdown_connections 2026-03-10T06:26:34.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.731+0000 7fcc329cf700 1 --2- 192.168.123.104:0/3758865634 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c102750 0x7fcc2c102bc0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.731+0000 7fcc329cf700 1 --2- 192.168.123.104:0/3758865634 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c108750 0x7fcc2c108b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.731+0000 7fcc329cf700 1 -- 192.168.123.104:0/3758865634 >> 192.168.123.104:0/3758865634 conn(0x7fcc2c0fe250 msgr2=0x7fcc2c100660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:34.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.733+0000 7fcc329cf700 1 -- 192.168.123.104:0/3758865634 shutdown_connections 2026-03-10T06:26:34.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.733+0000 7fcc329cf700 1 -- 192.168.123.104:0/3758865634 
wait complete. 2026-03-10T06:26:34.734 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 Processor -- start 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 -- start start 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 0x7fcc2c19d2e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc2c0782e0 con 0x7fcc2c108750 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc329cf700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc2c078450 con 0x7fcc2c102750 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc2bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc2bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.104:43082/0 (socket says 192.168.123.104:43082) 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.734+0000 7fcc2bfff700 1 -- 192.168.123.104:0/428620293 learned_addr learned my addr 192.168.123.104:0/428620293 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2b7fe700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 0x7fcc2c19d2e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2bfff700 1 -- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 msgr2=0x7fcc2c19d2e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:34.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2bfff700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 0x7fcc2c19d2e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:34.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2bfff700 1 -- 192.168.123.104:0/428620293 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc1c0097e0 con 0x7fcc2c102750 2026-03-10T06:26:34.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2b7fe700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 0x7fcc2c19d2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:26:34.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.735+0000 7fcc2bfff700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fcc1400d900 tx=0x7fcc1400dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.736+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc140041d0 con 0x7fcc2c102750 2026-03-10T06:26:34.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.736+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc2c078730 con 0x7fcc2c102750 2026-03-10T06:26:34.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.736+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc2c078c80 con 0x7fcc2c102750 2026-03-10T06:26:34.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.736+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcc14004330 con 0x7fcc2c102750 2026-03-10T06:26:34.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.736+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc14003da0 con 0x7fcc2c102750 2026-03-10T06:26:34.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.737+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7fcc0c005320 con 0x7fcc2c102750 2026-03-10T06:26:34.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.738+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcc14039940 con 0x7fcc2c102750 2026-03-10T06:26:34.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.738+0000 7fcc297fa700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 0x7fcc18079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:34.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.738+0000 7fcc2b7fe700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 0x7fcc18079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:34.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.738+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7fcc14021050 con 0x7fcc2c102750 2026-03-10T06:26:34.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.739+0000 7fcc2b7fe700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 0x7fcc18079d70 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fcc1c00b5c0 tx=0x7fcc1c005960 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:34.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.741+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fcc14061c20 con 0x7fcc2c102750 2026-03-10T06:26:34.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:34.886+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fcc0c005cc0 con 0x7fcc2c102750 2026-03-10T06:26:35.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.013+0000 7fcc297fa700 1 -- 192.168.123.104:0/428620293 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1944 (secure 0 0 0) 0x7fcc14061370 con 0x7fcc2c102750 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:e11 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:epoch 9 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:modified 
2026-03-10T06:19:55.449951+0000 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 0 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14508} 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 
2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:14508} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6826/2274683007,v1:192.168.123.104:6827/2274683007] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{0:24299} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.106:6824/3071631026,v1:192.168.123.106:6825/3071631026] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{-1:14518} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:26:35.017 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:14526} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:26:35.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 msgr2=0x7fcc18079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 
0x7fcc18079d70 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fcc1c00b5c0 tx=0x7fcc1c005960 comp rx=0 tx=0).stop 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 msgr2=0x7fcc2c19cd80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fcc1400d900 tx=0x7fcc1400dcc0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 shutdown_connections 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fcc180778c0 0x7fcc18079d70 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc2c102750 0x7fcc2c19cd80 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 --2- 192.168.123.104:0/428620293 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc2c108750 0x7fcc2c19d2e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 
-- 192.168.123.104:0/428620293 >> 192.168.123.104:0/428620293 conn(0x7fcc2c0fe250 msgr2=0x7fcc2c0ffa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 shutdown_connections 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.020+0000 7fcc329cf700 1 -- 192.168.123.104:0/428620293 wait complete. 2026-03-10T06:26:35.021 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 -- 192.168.123.104:0/334329891 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c106560 msgr2=0x7f1a2c106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 --2- 192.168.123.104:0/334329891 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c106560 0x7f1a2c106930 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f1a1c009b00 tx=0x7f1a1c009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 -- 192.168.123.104:0/334329891 shutdown_connections 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 --2- 192.168.123.104:0/334329891 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a2c100540 0x7f1a2c1009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 --2- 192.168.123.104:0/334329891 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c106560 0x7f1a2c106930 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 -- 192.168.123.104:0/334329891 >> 192.168.123.104:0/334329891 conn(0x7f1a2c0fbfc0 msgr2=0x7f1a2c0fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.091+0000 7f1a3399b700 1 -- 192.168.123.104:0/334329891 shutdown_connections 2026-03-10T06:26:35.092 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 -- 192.168.123.104:0/334329891 wait complete. 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 Processor -- start 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 -- start start 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 0x7f1a2c1983f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a2c106560 0x7f1a2c198930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a2c199010 con 0x7f1a2c100540 2026-03-10T06:26:35.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.092+0000 7f1a3399b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a2c19cda0 con 0x7f1a2c106560 2026-03-10T06:26:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 0x7f1a2c1983f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 0x7f1a2c1983f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:41042/0 (socket says 192.168.123.104:41042) 2026-03-10T06:26:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 -- 192.168.123.104:0/923441236 learned_addr learned my addr 192.168.123.104:0/923441236 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 -- 192.168.123.104:0/923441236 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a2c106560 msgr2=0x7f1a2c198930 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:26:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a2c106560 0x7f1a2c198930 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 -- 192.168.123.104:0/923441236 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1a1c0097e0 con 0x7f1a2c100540 2026-03-10T06:26:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a31737700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 
0x7f1a2c1983f0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f1a1c0048c0 tx=0x7f1a1c0049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1a1c01d070 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1a1c00bc50 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1a2c19d020 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1a2c19d510 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.093+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1a1c00f780 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.095+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1a1c00f8e0 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.096+0000 7f1a227fc700 1 --2- 192.168.123.104:0/923441236 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 0x7f1a18082580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.096+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f1a1c09c110 con 0x7f1a2c100540 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.096+0000 7f1a30f36700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 0x7f1a18082580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.097+0000 7f1a30f36700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 0x7f1a18082580 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f1a2c199a10 tx=0x7f1a28005c10 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:35.100 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.097+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1a10005320 con 0x7f1a2c100540 2026-03-10T06:26:35.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.100+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1a1c064a40 con 0x7f1a2c100540 2026-03-10T06:26:35.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.228+0000 
7f1a3399b700 1 -- 192.168.123.104:0/923441236 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1a10000bf0 con 0x7f1a180800d0 2026-03-10T06:26:35.230 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.230+0000 7f1a227fc700 1 -- 192.168.123.104:0/923441236 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f1a10000bf0 con 0x7f1a180800d0 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "mon", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "mgr", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "crash" 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "11/23 daemons upgraded", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:26:35.231 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:26:35.233 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.233+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 
msgr2=0x7f1a18082580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.233+0000 7f1a3399b700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 0x7f1a18082580 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f1a2c199a10 tx=0x7f1a28005c10 comp rx=0 tx=0).stop 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.233+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 msgr2=0x7f1a2c1983f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.233+0000 7f1a3399b700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 0x7f1a2c1983f0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f1a1c0048c0 tx=0x7f1a1c0049a0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 shutdown_connections 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1a180800d0 0x7f1a18082580 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 --2- 192.168.123.104:0/923441236 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1a2c100540 0x7f1a2c1983f0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 
--2- 192.168.123.104:0/923441236 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a2c106560 0x7f1a2c198930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 >> 192.168.123.104:0/923441236 conn(0x7f1a2c0fbfc0 msgr2=0x7f1a2c0fda40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:35.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 shutdown_connections 2026-03-10T06:26:35.235 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.234+0000 7f1a3399b700 1 -- 192.168.123.104:0/923441236 wait complete. 2026-03-10T06:26:35.311 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:35 vm04.local ceph-mon[115743]: from='client.34240 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.312 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:35 vm04.local ceph-mon[115743]: pgmap v106: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 422 B/s rd, 0 op/s 2026-03-10T06:26:35.312 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:35.312 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:35 vm04.local ceph-mon[115743]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.312 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:35 vm04.local ceph-mon[115743]: from='client.34246 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.312 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:26:35 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/129659230' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:35.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.310+0000 7f82f5a11700 1 -- 192.168.123.104:0/2089473699 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 msgr2=0x7f82f00734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.310+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2089473699 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 0x7f82f00734c0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f82e0009b00 tx=0x7f82e0009e10 comp rx=0 tx=0).stop 2026-03-10T06:26:35.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.314+0000 7f82f5a11700 1 -- 192.168.123.104:0/2089473699 shutdown_connections 2026-03-10T06:26:35.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.314+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2089473699 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f0110ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.314+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2089473699 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 0x7f82f00734c0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.314+0000 7f82f5a11700 1 -- 192.168.123.104:0/2089473699 >> 192.168.123.104:0/2089473699 conn(0x7f82f00fbef0 msgr2=0x7f82f00fe300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:35.316 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.315+0000 7f82f5a11700 1 -- 192.168.123.104:0/2089473699 
shutdown_connections 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.315+0000 7f82f5a11700 1 -- 192.168.123.104:0/2089473699 wait complete. 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 Processor -- start 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 -- start start 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 0x7f82f0072880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f006d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82f006ddc0 con 0x7f82f00730f0 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.316+0000 7f82f5a11700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82f006df30 con 0x7f82f0073a00 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.317+0000 7f82e7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f006d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.317+0000 7f82e7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f82f0073a00 0x7f82f006d880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:43110/0 (socket says 192.168.123.104:43110) 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.317+0000 7f82e7fff700 1 -- 192.168.123.104:0/2287097979 learned_addr learned my addr 192.168.123.104:0/2287097979 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.317+0000 7f82e7fff700 1 -- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 msgr2=0x7f82f0072880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.317+0000 7f82e7fff700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 0x7f82f0072880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82e7fff700 1 -- 192.168.123.104:0/2287097979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82e00097e0 con 0x7f82f0073a00 2026-03-10T06:26:35.318 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82e7fff700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f006d880 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f82d800eb10 tx=0x7f82d800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:35.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 
==== 327+0+0 (secure 0 0 0) 0x7f82d800cca0 con 0x7f82f0073a00 2026-03-10T06:26:35.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82f006e1c0 con 0x7f82f0073a00 2026-03-10T06:26:35.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f82d800ce00 con 0x7f82f0073a00 2026-03-10T06:26:35.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.318+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82d80189c0 con 0x7f82f0073a00 2026-03-10T06:26:35.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.319+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82f01046b0 con 0x7f82f0073a00 2026-03-10T06:26:35.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.320+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82f004f310 con 0x7f82f0073a00 2026-03-10T06:26:35.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.321+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f82d8018b20 con 0x7f82f0073a00 2026-03-10T06:26:35.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.321+0000 7f82f4a0f700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 0x7f82d0079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:26:35.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.322+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6050+0+0 (secure 0 0 0) 0x7f82d8014070 con 0x7f82f0073a00 2026-03-10T06:26:35.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.322+0000 7f82eeffd700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 0x7f82d0079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:26:35.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.322+0000 7f82eeffd700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 0x7f82d0079d20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f82e000b5c0 tx=0x7f82e0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:26:35.324 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.323+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f82d80633f0 con 0x7f82f0073a00 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 vm06.local ceph-mon[98962]: from='client.34240 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 vm06.local ceph-mon[98962]: pgmap v106: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 422 B/s rd, 0 op/s 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 
vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 vm06.local ceph-mon[98962]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 vm06.local ceph-mon[98962]: from='client.34246 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:35.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:35 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/129659230' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:35.502 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.501+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f82f006ee50 con 0x7f82f0073a00 2026-03-10T06:26:35.503 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.502+0000 7f82f4a0f700 1 -- 192.168.123.104:0/2287097979 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f82d8062b40 con 0x7f82f0073a00 2026-03-10T06:26:35.503 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:26:35.503 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:26:35.503 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:26:35.505 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 msgr2=0x7f82d0079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 0x7f82d0079d20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f82e000b5c0 tx=0x7f82e0005fb0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 msgr2=0x7f82f006d880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f006d880 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f82d800eb10 tx=0x7f82d800eed0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 shutdown_connections 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f82d0077870 0x7f82d0079d20 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2287097979 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82f00730f0 0x7f82f0072880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 --2- 192.168.123.104:0/2287097979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82f0073a00 0x7f82f006d880 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 >> 192.168.123.104:0/2287097979 conn(0x7f82f00fbef0 msgr2=0x7f82f01029f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 shutdown_connections 2026-03-10T06:26:35.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:26:35.505+0000 7f82f5a11700 1 -- 192.168.123.104:0/2287097979 wait complete. 2026-03-10T06:26:36.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:36 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/428620293' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:26:36.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:36 vm06.local ceph-mon[98962]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:36.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:36 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2287097979' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:26:36.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:36 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/428620293' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:26:36.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:36 vm04.local ceph-mon[115743]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:26:36.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:36 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2287097979' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:26:37.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:37 vm06.local ceph-mon[98962]: pgmap v107: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 991 B/s rd, 2 op/s 2026-03-10T06:26:37.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:37 vm04.local ceph-mon[115743]: pgmap v107: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 991 B/s rd, 2 op/s 2026-03-10T06:26:39.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:39 vm06.local ceph-mon[98962]: pgmap v108: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 819 B/s rd, 1 op/s 2026-03-10T06:26:39.309 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:39.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:39 vm04.local ceph-mon[115743]: pgmap v108: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 819 B/s rd, 1 op/s 2026-03-10T06:26:39.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:40.107 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:39 vm06.local systemd[1]: Stopping Ceph osd.5 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:26:40.107 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:39 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:26:39.959+0000 7f33dd98e700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:26:40.107 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:39 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:26:39.959+0000 7f33dd98e700 -1 osd.5 71 *** Got signal Terminated *** 2026-03-10T06:26:40.107 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:39 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[76033]: 2026-03-10T06:26:39.959+0000 7f33dd98e700 -1 osd.5 71 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:26:40.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115187]: 2026-03-10 06:26:40.254929682 +0000 UTC m=+0.310980185 container died 862da087fc06d298812881fa05859f74757681e6498ac59bad4658cb710212af (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, ceph=True, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, io.buildah.version=1.29.1) 2026-03-10T06:26:40.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115187]: 2026-03-10 
06:26:40.27415967 +0000 UTC m=+0.330210162 container remove 862da087fc06d298812881fa05859f74757681e6498ac59bad4658cb710212af (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, maintainer=Guillaume Abrioux , io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD) 2026-03-10T06:26:40.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local bash[115187]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: Upgrade: osd.5 is safe to restart 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: Upgrade: Updating osd.5 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: Deploying daemon osd.5 on vm06 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:40.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:40 vm06.local ceph-mon[98962]: osd.5 marked itself down and dead 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: Upgrade: osd.5 is safe to restart 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: Upgrade: Updating osd.5 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: Deploying daemon osd.5 on vm06 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:40.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:40 vm04.local ceph-mon[115743]: osd.5 marked itself down and dead 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.413953469 +0000 UTC m=+0.015656329 container create 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, CEPH_REF=squid, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.452012324 +0000 UTC m=+0.053715194 container init 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.455068405 +0000 UTC m=+0.056771266 container start 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.460602985 +0000 UTC m=+0.062305855 container attach 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.406507261 +0000 UTC m=+0.008210130 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local conmon[115262]: conmon 4b864505231fed1abf94 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0.scope/container/memory.events 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.586817704 +0000 UTC m=+0.188520564 container died 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115251]: 2026-03-10 06:26:40.635322203 +0000 UTC m=+0.237025063 container remove 4b864505231fed1abf943b700321b2119f18beebb29059281fd6aae6cd698ef0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.5.service: Deactivated successfully. 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local systemd[1]: Stopped Ceph osd.5 for 9c59102a-1c48-11f1-b618-035af535377d. 2026-03-10T06:26:40.726 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local systemd[1]: ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.5.service: Consumed 34.574s CPU time. 2026-03-10T06:26:41.109 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-mon[98962]: pgmap v109: 65 pgs: 65 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 727 B/s rd, 1 op/s 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local systemd[1]: Starting Ceph osd.5 for 9c59102a-1c48-11f1-b618-035af535377d... 
2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115352]: 2026-03-10 06:26:40.935414224 +0000 UTC m=+0.019257099 container create 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115352]: 2026-03-10 06:26:40.972188163 +0000 UTC m=+0.056031038 container init 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115352]: 2026-03-10 06:26:40.975280051 +0000 UTC m=+0.059122926 container start 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:40 vm06.local podman[115352]: 2026-03-10 06:26:40.976201066 +0000 UTC m=+0.060043941 container attach 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS) 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local podman[115352]: 2026-03-10 06:26:40.92666226 +0000 UTC m=+0.010505135 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.110 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-mon[98962]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:26:41.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-mon[98962]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T06:26:41.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-mon[98962]: osdmap e72: 6 total, 5 up, 6 in 2026-03-10T06:26:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:41 vm04.local ceph-mon[115743]: pgmap v109: 65 pgs: 65 
active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 727 B/s rd, 1 op/s 2026-03-10T06:26:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:41 vm04.local ceph-mon[115743]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T06:26:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:41 vm04.local ceph-mon[115743]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T06:26:41.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:41 vm04.local ceph-mon[115743]: osdmap e72: 6 total, 5 up, 6 in 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 
2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6a46040a-58d4-4055-9890-32d8b670db05/osd-block-014751cc-c4ec-4d41-86ea-4b72c6f87e86 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T06:26:41.831 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6a46040a-58d4-4055-9890-32d8b670db05/osd-block-014751cc-c4ec-4d41-86ea-4b72c6f87e86 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/ln -snf /dev/ceph-6a46040a-58d4-4055-9890-32d8b670db05/osd-block-014751cc-c4ec-4d41-86ea-4b72c6f87e86 /var/lib/ceph/osd/ceph-5/block 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/ln -snf /dev/ceph-6a46040a-58d4-4055-9890-32d8b670db05/osd-block-014751cc-c4ec-4d41-86ea-4b72c6f87e86 /var/lib/ceph/osd/ceph-5/block 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local 
ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate[115363]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local bash[115352]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local podman[115352]: 2026-03-10 06:26:41.886170884 +0000 UTC m=+0.970013759 container died 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:41 vm06.local podman[115352]: 2026-03-10 06:26:41.906788811 +0000 UTC m=+0.990631686 container remove 0550820fd2a1ef449b2ecf097e79fa2f81b5c4aa17a5aa84e798843fd9299972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local podman[115612]: 2026-03-10 06:26:42.002194026 +0000 UTC m=+0.017238571 container create 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local podman[115612]: 2026-03-10 06:26:42.0408831 +0000 UTC m=+0.055927656 container init 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local podman[115612]: 2026-03-10 06:26:42.045110854 +0000 UTC m=+0.060155399 container start 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, 
org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local bash[115612]: 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local podman[115612]: 2026-03-10 06:26:41.995882581 +0000 UTC m=+0.010927137 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:26:42.118 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local systemd[1]: Started Ceph osd.5 for 9c59102a-1c48-11f1-b618-035af535377d. 
2026-03-10T06:26:42.482 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:42 vm06.local ceph-mon[98962]: osdmap e73: 6 total, 5 up, 6 in 2026-03-10T06:26:42.483 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:42 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:42.483 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:42 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:42.483 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:42 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:42.483 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:42 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:26:42.389+0000 7f942c374740 -1 Falling back to public interface 2026-03-10T06:26:42.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:42 vm04.local ceph-mon[115743]: osdmap e73: 6 total, 5 up, 6 in 2026-03-10T06:26:42.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:42 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:42.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:42 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:42.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:42 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:43.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:43 vm06.local ceph-mon[98962]: pgmap v112: 65 pgs: 5 active+undersized, 10 peering, 5 stale+active+clean, 5 active+undersized+degraded, 40 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB 
avail; 767 B/s rd, 1 op/s; 12/261 objects degraded (4.598%) 2026-03-10T06:26:43.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:43 vm06.local ceph-mon[98962]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T06:26:43.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:43 vm06.local ceph-mon[98962]: Health check failed: Degraded data redundancy: 12/261 objects degraded (4.598%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:43.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:43.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:43 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:43.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:43 vm04.local ceph-mon[115743]: pgmap v112: 65 pgs: 5 active+undersized, 10 peering, 5 stale+active+clean, 5 active+undersized+degraded, 40 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s; 12/261 objects degraded (4.598%) 2026-03-10T06:26:43.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:43 vm04.local ceph-mon[115743]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T06:26:43.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:43 vm04.local ceph-mon[115743]: Health check failed: Degraded data redundancy: 12/261 objects degraded (4.598%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:43.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:43.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:43 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:44.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:44 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:44.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:44 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:44 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:44.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:44 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: pgmap v113: 65 pgs: 8 active+undersized, 10 peering, 2 stale+active+clean, 7 active+undersized+degraded, 38 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 127 B/s rd, 0 op/s; 16/261 objects degraded (6.130%) 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all osd 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config 
rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T06:26:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 
vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T06:26:45.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: pgmap v113: 65 pgs: 8 active+undersized, 10 peering, 2 stale+active+clean, 7 active+undersized+degraded, 38 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 127 B/s rd, 0 op/s; 16/261 objects degraded (6.130%) 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all osd 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T06:26:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T06:26:45.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:45 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T06:26:46.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:45 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:26:45.900+0000 7f942c374740 -1 osd.5 0 read_superblock omap replica is missing. 
2026-03-10T06:26:46.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:26:46.057+0000 7f942c374740 -1 osd.5 71 log_to_monitors true 2026-03-10T06:26:46.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: from='osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T06:26:46.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:46 vm06.local ceph-mon[98962]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: 
dispatch 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: from='osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T06:26:46.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:46 vm04.local ceph-mon[115743]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:26:47.608+0000 7f942410e640 -1 osd.5 71 
set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: pgmap v115: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 40/261 objects degraded (15.326%) 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]': finished 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 2 up:standby 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:26:47.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:47 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: pgmap v115: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 40/261 objects degraded (15.326%) 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]': finished 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 2 up:standby 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hdxbzv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:26:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:47 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:26:47.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:26:47 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[115739]: 2026-03-10T06:26:47.650+0000 7f1d3a3f0640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Upgrade: Updating mds.cephfs.vm04.hdxbzv 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Deploying daemon mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: from='osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:48.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 
2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715] boot 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: osdmap e77: 6 total, 6 up, 6 in 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.106:6824/1147710747,v1:192.168.123.106:6825/1147710747] up:boot 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Standby daemon mds.cephfs.vm04.hsrsig assigned to filesystem cephfs as rank 0 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:26:48.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:48 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:replay} 2 up:standby 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Upgrade: Updating mds.cephfs.vm04.hdxbzv 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Deploying daemon 
mds.cephfs.vm04.hdxbzv on vm04 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: from='osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: osd.5 [v2:192.168.123.106:6816/2399417715,v1:192.168.123.106:6817/2399417715] boot 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: osdmap e77: 6 total, 6 up, 6 in 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.106:6824/1147710747,v1:192.168.123.106:6825/1147710747] up:boot 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Standby daemon mds.cephfs.vm04.hsrsig assigned to filesystem cephfs as rank 0 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:26:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:26:48.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:26:48.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T06:26:48.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:48 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:replay} 2 up:standby 2026-03-10T06:26:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:49 vm04.local ceph-mon[115743]: pgmap v119: 65 pgs: 2 peering, 14 active+undersized, 12 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 37/261 objects degraded (14.176%) 2026-03-10T06:26:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:49 vm04.local ceph-mon[115743]: osdmap e78: 6 total, 6 up, 6 in 2026-03-10T06:26:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:49 vm04.local ceph-mon[115743]: Health check update: Degraded data redundancy: 37/261 objects degraded (14.176%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:49.927 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:49 vm06.local ceph-mon[98962]: pgmap v119: 65 pgs: 2 peering, 14 active+undersized, 12 active+undersized+degraded, 37 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 37/261 objects degraded (14.176%) 2026-03-10T06:26:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:49 vm06.local ceph-mon[98962]: osdmap e78: 6 total, 6 up, 6 in 2026-03-10T06:26:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:49 vm06.local ceph-mon[98962]: Health check update: Degraded data redundancy: 37/261 objects degraded (14.176%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T06:26:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:26:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:50 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:26:51.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:50 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": 
"cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:26:52.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:51 vm04.local ceph-mon[115743]: pgmap v121: 65 pgs: 2 peering, 13 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 1 op/s; 37/261 objects degraded (14.176%) 2026-03-10T06:26:52.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:51 vm06.local ceph-mon[98962]: pgmap v121: 65 pgs: 2 peering, 13 active+undersized, 12 active+undersized+degraded, 38 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 1 op/s; 37/261 objects degraded (14.176%) 2026-03-10T06:26:53.357 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:53 vm04.local ceph-mon[115743]: pgmap v122: 65 pgs: 2 peering, 9 active+undersized, 4 active+undersized+degraded, 50 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 8 op/s; 11/261 objects degraded (4.215%) 2026-03-10T06:26:53.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:53 vm06.local ceph-mon[98962]: pgmap v122: 65 pgs: 2 peering, 9 active+undersized, 4 active+undersized+degraded, 50 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 8 op/s; 11/261 objects degraded (4.215%) 2026-03-10T06:26:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:54 vm04.local ceph-mon[115743]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/261 objects degraded (4.215%), 4 pgs degraded) 2026-03-10T06:26:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:54 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:reconnect 2026-03-10T06:26:54.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:54 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:reconnect} 2 up:standby 2026-03-10T06:26:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:54 vm06.local ceph-mon[98962]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/261 objects degraded (4.215%), 4 pgs degraded) 2026-03-10T06:26:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:54 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:reconnect 2026-03-10T06:26:54.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:54 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:reconnect} 2 up:standby 2026-03-10T06:26:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:55 vm04.local ceph-mon[115743]: pgmap v123: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 34 MiB/s rd, 10 op/s 2026-03-10T06:26:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:55 vm04.local ceph-mon[115743]: reconnect by client.24325 192.168.123.104:0/950847804 after 0 2026-03-10T06:26:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:55 vm04.local ceph-mon[115743]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.001 2026-03-10T06:26:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:55 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:rejoin 2026-03-10T06:26:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:55 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:rejoin} 2 up:standby 2026-03-10T06:26:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:55 vm06.local ceph-mon[98962]: pgmap v123: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 34 MiB/s rd, 10 op/s 2026-03-10T06:26:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:55 vm06.local ceph-mon[98962]: reconnect by client.24325 192.168.123.104:0/950847804 after 0 2026-03-10T06:26:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:55 vm06.local ceph-mon[98962]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.001 2026-03-10T06:26:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:55 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:rejoin 2026-03-10T06:26:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:55 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hsrsig=up:rejoin} 2 up:standby 2026-03-10T06:26:56.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:56 vm04.local ceph-mon[115743]: daemon mds.cephfs.vm04.hsrsig is now active in filesystem cephfs as rank 0 2026-03-10T06:26:56.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:56 vm06.local ceph-mon[98962]: daemon mds.cephfs.vm04.hsrsig is now active in filesystem cephfs as rank 0 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: pgmap v124: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 29 MiB/s rd, 122 B/s wr, 9 op/s 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:26:57.428 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: mds.? [v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:active 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hsrsig=up:active} 2 up:standby 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:57.428 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:57 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: pgmap v124: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 29 MiB/s rd, 122 B/s wr, 9 op/s 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.104:6828/2419696492,v1:192.168.123.104:6829/2419696492] up:active 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hsrsig=up:active} 2 up:standby 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:57 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local ceph-mon[115743]: mds.? [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:boot 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hsrsig=up:active} 3 up:standby 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local ceph-mon[115743]: pgmap v125: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 409 B/s wr, 10 op/s 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:59.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:26:59 vm04.local 
ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:boot 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hsrsig=up:active} 3 up:standby 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: pgmap v125: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 409 B/s wr, 10 op/s 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:26:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:26:59 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:00.575 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:00 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:00.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:00 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: pgmap v126: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.9 KiB/s wr, 11 op/s 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: osdmap e79: 6 total, 6 up, 6 in 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: Standby daemon mds.cephfs.vm06.afscws assigned to filesystem cephfs as rank 0 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:27:01.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:01 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:replay} 2 up:standby 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[115739]: 2026-03-10T06:27:01.254+0000 7f1d3a3f0640 -1 
log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: pgmap v126: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.9 KiB/s wr, 11 op/s 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.hsrsig", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 
2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: osdmap e79: 6 total, 6 up, 6 in 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: Standby daemon mds.cephfs.vm06.afscws assigned to filesystem cephfs as rank 0 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:27:01.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:01 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:replay} 2 up:standby 2026-03-10T06:27:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:02 vm06.local ceph-mon[98962]: Upgrade: Updating mds.cephfs.vm04.hsrsig 2026-03-10T06:27:02.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:02 vm06.local ceph-mon[98962]: Deploying daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:27:02.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:02 vm04.local ceph-mon[115743]: Upgrade: Updating mds.cephfs.vm04.hsrsig 2026-03-10T06:27:02.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:02 vm04.local ceph-mon[115743]: Deploying 
daemon mds.cephfs.vm04.hsrsig on vm04 2026-03-10T06:27:03.451 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:03 vm04.local ceph-mon[115743]: pgmap v128: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5.6 KiB/s wr, 10 op/s 2026-03-10T06:27:03.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:03 vm06.local ceph-mon[98962]: pgmap v128: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5.6 KiB/s wr, 10 op/s 2026-03-10T06:27:05.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:05 vm04.local ceph-mon[115743]: pgmap v129: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 7.3 MiB/s rd, 5.6 KiB/s wr, 9 op/s 2026-03-10T06:27:05.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:05 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:05.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:05 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:05.587 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.586+0000 7fbc6adb7700 1 -- 192.168.123.104:0/623175581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 msgr2=0x7fbc64102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.586+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/623175581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc64102bf0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fbc58009b50 tx=0x7fbc58009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.587+0000 7fbc6adb7700 1 -- 192.168.123.104:0/623175581 shutdown_connections 2026-03-10T06:27:05.588 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.587+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/623175581 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc64102bf0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.587+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/623175581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbc64108780 0x7fbc64108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.587+0000 7fbc6adb7700 1 -- 192.168.123.104:0/623175581 >> 192.168.123.104:0/623175581 conn(0x7fbc640fe280 msgr2=0x7fbc64100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.587+0000 7fbc6adb7700 1 -- 192.168.123.104:0/623175581 shutdown_connections 2026-03-10T06:27:05.588 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.588+0000 7fbc6adb7700 1 -- 192.168.123.104:0/623175581 wait complete. 
2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.588+0000 7fbc6adb7700 1 Processor -- start 2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.588+0000 7fbc6adb7700 1 -- start start 2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc6adb7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc6adb7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbc64108780 0x7fbc641988e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc6adb7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc64198fc0 con 0x7fbc64102780 2026-03-10T06:27:05.589 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc6adb7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc6419cd50 con 0x7fbc64108780 2026-03-10T06:27:05.590 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:05.590 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:48450/0 (socket says 192.168.123.104:48450) 2026-03-10T06:27:05.590 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 -- 192.168.123.104:0/2955677900 learned_addr learned my addr 192.168.123.104:0/2955677900 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:05.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 -- 192.168.123.104:0/2955677900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbc64108780 msgr2=0x7fbc641988e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:27:05.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbc64108780 0x7fbc641988e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.589+0000 7fbc68b53700 1 -- 192.168.123.104:0/2955677900 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc580097e0 con 0x7fbc64102780 2026-03-10T06:27:05.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc68b53700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fbc54009fd0 tx=0x7fbc5400eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:05.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc54009980 con 0x7fbc64102780 2026-03-10T06:27:05.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc61ffb700 1 -- 
192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbc5400c7c0 con 0x7fbc64102780 2026-03-10T06:27:05.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc54003c90 con 0x7fbc64102780 2026-03-10T06:27:05.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc6419cfd0 con 0x7fbc64102780 2026-03-10T06:27:05.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.590+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc6419d520 con 0x7fbc64102780 2026-03-10T06:27:05.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.592+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbc5401f440 con 0x7fbc64102780 2026-03-10T06:27:05.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.592+0000 7fbc61ffb700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 0x7fbc4c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.592+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fbc540a33a0 con 0x7fbc64102780 2026-03-10T06:27:05.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.593+0000 7fbc63fff700 1 --2- 192.168.123.104:0/2955677900 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 0x7fbc4c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:05.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.593+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc6419d160 con 0x7fbc64102780 2026-03-10T06:27:05.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.593+0000 7fbc63fff700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 0x7fbc4c079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fbc5800b5c0 tx=0x7fbc58005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:05.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.596+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbc6419d160 con 0x7fbc64102780 2026-03-10T06:27:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:05 vm06.local ceph-mon[98962]: pgmap v129: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 7.3 MiB/s rd, 5.6 KiB/s wr, 9 op/s 2026-03-10T06:27:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:05 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:05.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:05 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:05.731 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.730+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbc6419d160 con 0x7fbc4c0778c0 2026-03-10T06:27:05.732 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.732+0000 7fbc61ffb700 1 -- 192.168.123.104:0/2955677900 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fbc6419d160 con 0x7fbc4c0778c0 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.734+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 msgr2=0x7fbc4c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.734+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 0x7fbc4c079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fbc5800b5c0 tx=0x7fbc58005fb0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 msgr2=0x7fbc641983a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fbc54009fd0 tx=0x7fbc5400eea0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.735 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 shutdown_connections 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbc4c0778c0 0x7fbc4c079d70 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.735 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc64102780 0x7fbc641983a0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 --2- 192.168.123.104:0/2955677900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbc64108780 0x7fbc641988e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 >> 192.168.123.104:0/2955677900 conn(0x7fbc640fe280 msgr2=0x7fbc640ffca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:05.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 shutdown_connections 2026-03-10T06:27:05.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.735+0000 7fbc6adb7700 1 -- 192.168.123.104:0/2955677900 wait complete. 
2026-03-10T06:27:05.747 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 -- 192.168.123.104:0/619933160 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 msgr2=0x7f8b401042c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 --2- 192.168.123.104:0/619933160 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401042c0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f8b34009b50 tx=0x7f8b34009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 -- 192.168.123.104:0/619933160 shutdown_connections 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 --2- 192.168.123.104:0/619933160 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401042c0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 --2- 192.168.123.104:0/619933160 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b400ff4d0 0x7f8b400ff8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.813+0000 7f8b47092700 1 -- 192.168.123.104:0/619933160 >> 192.168.123.104:0/619933160 conn(0x7f8b400faf00 msgr2=0x7f8b400fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:05.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.814+0000 7f8b47092700 1 -- 192.168.123.104:0/619933160 shutdown_connections 2026-03-10T06:27:05.814 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.814+0000 7f8b47092700 1 -- 192.168.123.104:0/619933160 wait complete. 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.814+0000 7f8b47092700 1 Processor -- start 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b47092700 1 -- start start 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b47092700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b400ff4d0 0x7f8b401a25a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b47092700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b47092700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b401a3170 con 0x7f8b400ffde0 2026-03-10T06:27:05.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b47092700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b4019c620 con 0x7f8b400ff4d0 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b3ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b3ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:48460/0 (socket says 192.168.123.104:48460) 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b3ffff700 1 -- 192.168.123.104:0/1170661775 learned_addr learned my addr 192.168.123.104:0/1170661775 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b3ffff700 1 -- 192.168.123.104:0/1170661775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b400ff4d0 msgr2=0x7f8b401a25a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.815+0000 7f8b3ffff700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b400ff4d0 0x7f8b401a25a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.816+0000 7f8b3ffff700 1 -- 192.168.123.104:0/1170661775 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b340097e0 con 0x7f8b400ffde0 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.816+0000 7f8b3ffff700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f8b34000c00 tx=0x7f8b3400b870 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:05.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.816+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b3401d070 con 0x7f8b400ffde0 
2026-03-10T06:27:05.817 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.816+0000 7f8b47092700 1 -- 192.168.123.104:0/1170661775 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b4019c8a0 con 0x7f8b400ffde0 2026-03-10T06:27:05.817 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.816+0000 7f8b47092700 1 -- 192.168.123.104:0/1170661775 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b4019cd90 con 0x7f8b400ffde0 2026-03-10T06:27:05.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.818+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8b3400bcb0 con 0x7f8b400ffde0 2026-03-10T06:27:05.818 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.818+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b34021880 con 0x7f8b400ffde0 2026-03-10T06:27:05.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.818+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b24005320 con 0x7f8b400ffde0 2026-03-10T06:27:05.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.820+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8b34021aa0 con 0x7f8b400ffde0 2026-03-10T06:27:05.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.820+0000 7f8b3dffb700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 0x7f8b30079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:05.821 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.820+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f8b3409bf80 con 0x7f8b400ffde0 2026-03-10T06:27:05.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.821+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8b34064890 con 0x7f8b400ffde0 2026-03-10T06:27:05.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.822+0000 7f8b44e2e700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 0x7f8b30079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:05.822 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.822+0000 7f8b44e2e700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 0x7f8b30079c10 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f8b2c009710 tx=0x7f8b2c006c60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:05.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.959+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8b24000bf0 con 0x7f8b30077760 2026-03-10T06:27:05.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.961+0000 7f8b3dffb700 1 -- 192.168.123.104:0/1170661775 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 
0x7f8b24000bf0 con 0x7f8b30077760 2026-03-10T06:27:05.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 msgr2=0x7f8b30079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 0x7f8b30079c10 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f8b2c009710 tx=0x7f8b2c006c60 comp rx=0 tx=0).stop 2026-03-10T06:27:05.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 msgr2=0x7f8b401a2ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:05.963 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f8b34000c00 tx=0x7f8b3400b870 comp rx=0 tx=0).stop 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 shutdown_connections 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f8b30077760 0x7f8b30079c10 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 --2- 192.168.123.104:0/1170661775 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8b400ff4d0 0x7f8b401a25a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 --2- 192.168.123.104:0/1170661775 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b400ffde0 0x7f8b401a2ae0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 >> 192.168.123.104:0/1170661775 conn(0x7f8b400faf00 msgr2=0x7f8b40110940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 shutdown_connections 2026-03-10T06:27:05.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:05.963+0000 7f8b2b7fe700 1 -- 192.168.123.104:0/1170661775 wait complete. 
2026-03-10T06:27:06.040 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 -- 192.168.123.104:0/2711086249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 msgr2=0x7f2b8c108b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 --2- 192.168.123.104:0/2711086249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c108b20 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c009b50 tx=0x7f2b7c009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 -- 192.168.123.104:0/2711086249 shutdown_connections 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 --2- 192.168.123.104:0/2711086249 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 0x7f2b8c102bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 --2- 192.168.123.104:0/2711086249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c108b20 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 -- 192.168.123.104:0/2711086249 >> 192.168.123.104:0/2711086249 conn(0x7f2b8c0fe250 msgr2=0x7f2b8c100660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 -- 192.168.123.104:0/2711086249 shutdown_connections 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.040+0000 7f2b93803700 1 -- 192.168.123.104:0/2711086249 
wait complete. 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 Processor -- start 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 -- start start 2026-03-10T06:27:06.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 0x7f2b8c198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b8c198fb0 con 0x7f2b8c108750 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b93803700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b8c19cd40 con 0x7f2b8c102750 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b90d9e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b90d9e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:48486/0 (socket says 192.168.123.104:48486) 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.041+0000 7f2b90d9e700 1 -- 192.168.123.104:0/2974118448 learned_addr learned my addr 192.168.123.104:0/2974118448 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.042+0000 7f2b9159f700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 0x7f2b8c198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.042+0000 7f2b90d9e700 1 -- 192.168.123.104:0/2974118448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 msgr2=0x7f2b8c198390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.042+0000 7f2b90d9e700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 0x7f2b8c198390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.042+0000 7f2b90d9e700 1 -- 192.168.123.104:0/2974118448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b7c0097e0 con 0x7f2b8c108750 2026-03-10T06:27:06.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.042+0000 7f2b90d9e700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2b8800ed70 tx=0x7f2b8800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:27:06.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.043+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b8800cd70 con 0x7f2b8c108750 2026-03-10T06:27:06.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.043+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2b8800eec0 con 0x7f2b8c108750 2026-03-10T06:27:06.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.043+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b8c19d020 con 0x7f2b8c108750 2026-03-10T06:27:06.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.043+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b880188b0 con 0x7f2b8c108750 2026-03-10T06:27:06.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.043+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b8c19d570 con 0x7f2b8c108750 2026-03-10T06:27:06.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.045+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2b88018a10 con 0x7f2b8c108750 2026-03-10T06:27:06.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.045+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b8c04ea50 con 0x7f2b8c108750 2026-03-10T06:27:06.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.045+0000 
7f2b827fc700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 0x7f2b78079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.045+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f2b88014070 con 0x7f2b8c108750 2026-03-10T06:27:06.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.048+0000 7f2b9159f700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 0x7f2b78079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.048+0000 7f2b9159f700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 0x7f2b78079d20 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c006010 tx=0x7f2b7c005a90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:06.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.049+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2b88062b80 con 0x7f2b8c108750 2026-03-10T06:27:06.184 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.183+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2b8c19d850 con 0x7f2b78077870 
2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.190+0000 7f2b827fc700 1 -- 192.168.123.104:0/2974118448 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f2b8c19d850 con 0x7f2b78077870 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (3m) 7s ago 9m 24.4M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (9m) 7s ago 9m 9857k - 18.2.0 dc2bc1663786 019b79596e39 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (8m) 23s ago 8m 12.1M - 18.2.0 dc2bc1663786 02ba67f7b99e 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (2m) 7s ago 9m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (2m) 23s ago 8m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (3m) 7s ago 9m 93.5M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (9s) 7s ago 7m 15.9M - 19.2.3-678-ge911bdeb 654f31e6858e 481d5dbc696e 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (7m) 7s ago 7m 92.0M - 18.2.0 dc2bc1663786 9bbaa4df4333 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (7m) 23s ago 7m 18.6M - 18.2.0 dc2bc1663786 dc29bd0a94dd 2026-03-10T06:27:06.190 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (7m) 23s ago 7m 
95.4M - 18.2.0 dc2bc1663786 5f7b9f10b346 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (5m) 7s ago 10m 614M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (4m) 23s ago 8m 496M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (3m) 7s ago 10m 64.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (2m) 23s ago 8m 56.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (4m) 7s ago 9m 10.6M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 23s ago 8m 9755k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (2m) 7s ago 8m 185M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (109s) 7s ago 8m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6bc3525fe6f5 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (87s) 7s ago 8m 90.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 38220ba83a3f 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (66s) 23s ago 8m 158M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e91f44e1f660 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (45s) 23s ago 7m 98.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fea6c31251ba 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (24s) 23s ago 7m 13.3M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 
15b85a82c8ae 2026-03-10T06:27:06.191 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (4m) 7s ago 9m 56.9M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.192+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 msgr2=0x7f2b78079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 0x7f2b78079d20 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c006010 tx=0x7f2b7c005a90 comp rx=0 tx=0).stop 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 msgr2=0x7f2b8c1988d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2b8800ed70 tx=0x7f2b8800c5b0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 shutdown_connections 2026-03-10T06:27:06.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2b78077870 0x7f2b78079d20 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:27:06.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2b8c102750 0x7f2b8c198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 --2- 192.168.123.104:0/2974118448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2b8c108750 0x7f2b8c1988d0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 >> 192.168.123.104:0/2974118448 conn(0x7f2b8c0fe250 msgr2=0x7f2b8c0ffb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:06.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 shutdown_connections 2026-03-10T06:27:06.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.193+0000 7f2b93803700 1 -- 192.168.123.104:0/2974118448 wait complete. 
2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.274+0000 7fda8e2be700 1 -- 192.168.123.104:0/1386441076 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 msgr2=0x7fda88108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.274+0000 7fda8e2be700 1 --2- 192.168.123.104:0/1386441076 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda88108b50 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fda78009b00 tx=0x7fda78009e10 comp rx=0 tx=0).stop 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 -- 192.168.123.104:0/1386441076 shutdown_connections 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 --2- 192.168.123.104:0/1386441076 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 --2- 192.168.123.104:0/1386441076 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda88108b50 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 -- 192.168.123.104:0/1386441076 >> 192.168.123.104:0/1386441076 conn(0x7fda880fe280 msgr2=0x7fda88100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:06.275 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 -- 192.168.123.104:0/1386441076 shutdown_connections 2026-03-10T06:27:06.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.275+0000 7fda8e2be700 1 -- 192.168.123.104:0/1386441076 
wait complete. 2026-03-10T06:27:06.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 Processor -- start 2026-03-10T06:27:06.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 -- start start 2026-03-10T06:27:06.276 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda881988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda88198fb0 con 0x7fda88108780 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.276+0000 7fda8e2be700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda8819cd40 con 0x7fda88102780 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda877fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda881988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda877fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda881988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:48506/0 (socket says 192.168.123.104:48506) 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda877fe700 1 -- 192.168.123.104:0/2437309443 learned_addr learned my addr 192.168.123.104:0/2437309443 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:06.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda87fff700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda87fff700 1 -- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 msgr2=0x7fda881988d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:06.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda87fff700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda881988d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:06.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda87fff700 1 -- 192.168.123.104:0/2437309443 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda780097e0 con 0x7fda88102780 2026-03-10T06:27:06.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.277+0000 7fda87fff700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88198390 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fda78000c00 tx=0x7fda7800ba00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:27:06.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.537+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda7801d070 con 0x7fda88102780 2026-03-10T06:27:06.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.538+0000 7fda8e2be700 1 -- 192.168.123.104:0/2437309443 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda8819cfc0 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.540+0000 7fda8e2be700 1 -- 192.168.123.104:0/2437309443 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda8819d4b0 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.541+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fda7800f460 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.541+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda78021600 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.541+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fda78017400 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.541+0000 7fda857fa700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda70077920 0x7fda70079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:06.543 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.542+0000 7fda8e2be700 1 -- 192.168.123.104:0/2437309443 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fda74005320 con 0x7fda88102780 2026-03-10T06:27:06.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.543+0000 7fda877fe700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda70077920 0x7fda70079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:06.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.543+0000 7fda877fe700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda70077920 0x7fda70079dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fda881999b0 tx=0x7fda7c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:06.859 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.858+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fda780689a0 con 0x7fda88102780 2026-03-10T06:27:06.866 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:06.865+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fda78026080 con 0x7fda88102780 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.082+0000 7fda8e2be700 1 -- 192.168.123.104:0/2437309443 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fda74005cc0 con 0x7fda88102780 
2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.084+0000 7fda857fa700 1 -- 192.168.123.104:0/2437309443 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+815 (secure 0 0 0) 0x7fda780643c0 con 0x7fda88102780 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 
2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 11 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:27:07.085 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda70077920 msgr2=0x7fda70079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda70077920 0x7fda70079dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fda881999b0 tx=0x7fda7c009450 comp rx=0 tx=0).stop 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 msgr2=0x7fda88198390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88198390 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fda78000c00 tx=0x7fda7800ba00 comp rx=0 tx=0).stop 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 shutdown_connections 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] 
conn(0x7fda70077920 0x7fda70079dd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda88102780 0x7fda88198390 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 --2- 192.168.123.104:0/2437309443 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda88108780 0x7fda881988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 >> 192.168.123.104:0/2437309443 conn(0x7fda880fe280 msgr2=0x7fda880ffae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 shutdown_connections 2026-03-10T06:27:07.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.088+0000 7fda6effd700 1 -- 192.168.123.104:0/2437309443 wait complete. 
2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 -- 192.168.123.104:0/1073865594 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391410e960 msgr2=0x7f391410ed30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 --2- 192.168.123.104:0/1073865594 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391410e960 0x7f391410ed30 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f3904009b00 tx=0x7f3904009e10 comp rx=0 tx=0).stop 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 -- 192.168.123.104:0/1073865594 shutdown_connections 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 --2- 192.168.123.104:0/1073865594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f391406ce40 0x7f391406d2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 --2- 192.168.123.104:0/1073865594 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391410e960 0x7f391410ed30 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 -- 192.168.123.104:0/1073865594 >> 192.168.123.104:0/1073865594 conn(0x7f391406c370 msgr2=0x7f3914071540 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 -- 192.168.123.104:0/1073865594 shutdown_connections 2026-03-10T06:27:07.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.170+0000 7f391af5f700 1 -- 192.168.123.104:0/1073865594 
wait complete. 2026-03-10T06:27:07.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 Processor -- start 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 -- start start 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39141ab000 0x7f39141a4ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f39141ab5a0 con 0x7f391406ce40 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391af5f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f39141a4ff0 con 0x7f39141ab000 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:48542/0 (socket says 192.168.123.104:48542) 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 -- 192.168.123.104:0/1320485135 learned_addr learned my addr 192.168.123.104:0/1320485135 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f391975c700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39141ab000 0x7f39141a4ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 -- 192.168.123.104:0/1320485135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39141ab000 msgr2=0x7f39141a4ab0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39141ab000 0x7f39141a4ab0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.171+0000 7f3919f5d700 1 -- 192.168.123.104:0/1320485135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f39040097e0 con 0x7f391406ce40 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.172+0000 7f3919f5d700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f390400f690 tx=0x7f390400f6c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T06:27:07.173 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.172+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f390401c070 con 0x7f391406ce40 2026-03-10T06:27:07.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.172+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3904003680 con 0x7f391406ce40 2026-03-10T06:27:07.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.172+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f39141a5130 con 0x7f391406ce40 2026-03-10T06:27:07.174 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.172+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f39141a5680 con 0x7f391406ce40 2026-03-10T06:27:07.177 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.174+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38f8005320 con 0x7f391406ce40 2026-03-10T06:27:07.177 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.174+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f39040176c0 con 0x7f391406ce40 2026-03-10T06:27:07.177 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.174+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f39040178e0 con 0x7f391406ce40 2026-03-10T06:27:07.178 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.177+0000 7f390affd700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 0x7f3900079bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.177+0000 7f391975c700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 0x7f3900079bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.178+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f390400fac0 con 0x7f391406ce40 2026-03-10T06:27:07.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.178+0000 7f391975c700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 0x7f3900079bd0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f390c003eb0 tx=0x7f390c00b040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:07.178 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.178+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f39040a0310 con 0x7f391406ce40 2026-03-10T06:27:07.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.333+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f38f8006200 con 
0x7f391406ce40 2026-03-10T06:27:07.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.336+0000 7f390affd700 1 -- 192.168.123.104:0/1320485135 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 21 v21) v1 ==== 76+0+1748 (secure 0 0 0) 0x7f3904064430 con 0x7f391406ce40 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:e21 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:btime 2026-03-10T06:27:06:484565+0000 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:epoch 21 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:27:06.434345+0000 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 
300 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 79 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:up {0=14526} 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:27:07.339 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 0 members: 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{0:14526} state up:reconnect seq 110 join_fscid=1 addr 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{-1:34266} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T06:27:07.340 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{-1:44217} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1147710747,v1:192.168.123.106:6825/1147710747] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T06:27:07.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 msgr2=0x7f3900079bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 0x7f3900079bd0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f390c003eb0 tx=0x7f390c00b040 comp rx=0 tx=0).stop 2026-03-10T06:27:07.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 msgr2=0x7f39141aaac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 --2- 192.168.123.104:0/1320485135 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f390400f690 tx=0x7f390400f6c0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 shutdown_connections 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3900077720 0x7f3900079bd0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.341+0000 7f391af5f700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f391406ce40 0x7f39141aaac0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.342+0000 7f391af5f700 1 --2- 192.168.123.104:0/1320485135 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39141ab000 0x7f39141a4ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.342+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 >> 192.168.123.104:0/1320485135 conn(0x7f391406c370 msgr2=0x7f391406f530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.343+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 shutdown_connections 2026-03-10T06:27:07.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.343+0000 7f391af5f700 1 -- 192.168.123.104:0/1320485135 wait complete. 
2026-03-10T06:27:07.347 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 21 2026-03-10T06:27:07.442 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 -- 192.168.123.104:0/1146057485 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac790 msgr2=0x7f08b40a5210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 --2- 192.168.123.104:0/1146057485 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac790 0x7f08b40a5210 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f08bc0669f0 tx=0x7f08bc0699f0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 -- 192.168.123.104:0/1146057485 shutdown_connections 2026-03-10T06:27:07.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 --2- 192.168.123.104:0/1146057485 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac790 0x7f08b40a5210 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 --2- 192.168.123.104:0/1146057485 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40ac3c0 0x7f08b40a4cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.441+0000 7f08c2884700 1 -- 192.168.123.104:0/1146057485 >> 192.168.123.104:0/1146057485 conn(0x7f08b401a290 msgr2=0x7f08b401a690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 -- 192.168.123.104:0/1146057485 shutdown_connections 2026-03-10T06:27:07.445 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 -- 192.168.123.104:0/1146057485 wait complete. 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 Processor -- start 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 -- start start 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40b4d80 0x7f08b40b51f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f08b40b5730 con 0x7f08b40ac3c0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.442+0000 7f08c2884700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f08b40b58a0 con 0x7f08b40b4d80 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:48554/0 (socket says 192.168.123.104:48554) 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 -- 192.168.123.104:0/1430699409 learned_addr learned my addr 192.168.123.104:0/1430699409 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1081700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40b4d80 0x7f08b40b51f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 -- 192.168.123.104:0/1430699409 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40b4d80 msgr2=0x7f08b40b51f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40b4d80 0x7f08b40b51f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 -- 192.168.123.104:0/1430699409 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f08bc067050 con 0x7f08b40ac3c0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.443+0000 7f08c1882700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto 
rx=0x7f08b800ba70 tx=0x7f08b800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.444+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f08b800c760 con 0x7f08b40ac3c0 2026-03-10T06:27:07.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.444+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f08b800cda0 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.444+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f08b8012550 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.446+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f08b40b5b80 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.446+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f08b4166590 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.447+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f08b4004500 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.448+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f08b800c8c0 con 0x7f08b40ac3c0 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.448+0000 7f08b2ffd700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 0x7f08a8079c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.452 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.448+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f08b8098600 con 0x7f08b40ac3c0 2026-03-10T06:27:07.460 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.451+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f08b8060ef0 con 0x7f08b40ac3c0 2026-03-10T06:27:07.460 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.456+0000 7f08c1081700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 0x7f08a8079c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.460 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.457+0000 7f08c1081700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 0x7f08a8079c80 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f08bc066ae0 tx=0x7f08bc064010 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='client.34270 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='client.34274 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: pgmap v130: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 5.5 KiB/s wr, 11 op/s 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.001 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: reconnect by client.24325 192.168.123.104:0/950847804 after 0.00500001 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:reconnect 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:reconnect} 2 up:standby 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2437309443' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.604 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:07 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/1320485135' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:27:07.604 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.603+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f08b40a2800 con 0x7f08a80777d0 2026-03-10T06:27:07.608 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.606+0000 7f08b2ffd700 1 -- 192.168.123.104:0/1430699409 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f08b40a2800 con 0x7f08a80777d0 2026-03-10T06:27:07.608 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": true, 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [ 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "osd", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "mon", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "mgr", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "crash" 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: ], 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "progress": "13/23 daemons upgraded", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T06:27:07.609 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:27:07.609 
INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:27:07.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.614+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 msgr2=0x7f08a8079c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.614+0000 7f08c2884700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 0x7f08a8079c80 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f08bc066ae0 tx=0x7f08bc064010 comp rx=0 tx=0).stop 2026-03-10T06:27:07.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.614+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 msgr2=0x7f08b40b6d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.615 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.614+0000 7f08c2884700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f08b800ba70 tx=0x7f08b800be30 comp rx=0 tx=0).stop 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.615+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 shutdown_connections 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.615+0000 7f08c2884700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f08a80777d0 0x7f08a8079c80 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.615+0000 7f08c2884700 1 --2- 
192.168.123.104:0/1430699409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08b40ac3c0 0x7f08b40b6d00 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.615+0000 7f08c2884700 1 --2- 192.168.123.104:0/1430699409 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08b40b4d80 0x7f08b40b51f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.615+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 >> 192.168.123.104:0/1430699409 conn(0x7f08b401a290 msgr2=0x7f08b40a20f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.616+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 shutdown_connections 2026-03-10T06:27:07.617 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.616+0000 7f08c2884700 1 -- 192.168.123.104:0/1430699409 wait complete. 
2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.711+0000 7fbf98dab700 1 -- 192.168.123.104:0/3231168498 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf9410eab0 msgr2=0x7fbf9410ee80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.711+0000 7fbf98dab700 1 --2- 192.168.123.104:0/3231168498 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf9410eab0 0x7fbf9410ee80 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fbf84009b00 tx=0x7fbf84009e10 comp rx=0 tx=0).stop 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 -- 192.168.123.104:0/3231168498 shutdown_connections 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 --2- 192.168.123.104:0/3231168498 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf94071b60 0x7fbf94071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 --2- 192.168.123.104:0/3231168498 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf9410eab0 0x7fbf9410ee80 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 -- 192.168.123.104:0/3231168498 >> 192.168.123.104:0/3231168498 conn(0x7fbf9406c6c0 msgr2=0x7fbf9406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 -- 192.168.123.104:0/3231168498 shutdown_connections 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.712+0000 7fbf98dab700 1 -- 192.168.123.104:0/3231168498 
wait complete. 2026-03-10T06:27:07.713 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 Processor -- start 2026-03-10T06:27:07.714 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 -- start start 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 0x7fbf94117580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf94112ac0 con 0x7fbf9410eab0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.713+0000 7fbf98dab700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf94112c00 con 0x7fbf94071b60 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.104:48566/0 (socket says 192.168.123.104:48566) 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 -- 192.168.123.104:0/3144130054 learned_addr learned my addr 192.168.123.104:0/3144130054 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf9259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 0x7fbf94117580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 -- 192.168.123.104:0/3144130054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 msgr2=0x7fbf94117580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 0x7fbf94117580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf8bfff700 1 -- 192.168.123.104:0/3144130054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf7c009710 con 0x7fbf9410eab0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.715+0000 7fbf9259c700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 0x7fbf94117580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf8bfff700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fbf7c00ec80 tx=0x7fbf7c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf7c00cd50 con 0x7fbf9410eab0 2026-03-10T06:27:07.717 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbf7c00ceb0 con 0x7fbf9410eab0 2026-03-10T06:27:07.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf7c005320 con 0x7fbf9410eab0 2026-03-10T06:27:07.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf98dab700 1 -- 192.168.123.104:0/3144130054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf840097e0 con 0x7fbf9410eab0 2026-03-10T06:27:07.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.716+0000 7fbf98dab700 1 -- 192.168.123.104:0/3144130054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf94113270 con 0x7fbf9410eab0 2026-03-10T06:27:07.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.718+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbf7c01e030 con 
0x7fbf9410eab0 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.718+0000 7fbf8b7fe700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 0x7fbf80079c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.718+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fbf7c014070 con 0x7fbf9410eab0 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.719+0000 7fbf98dab700 1 -- 192.168.123.104:0/3144130054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf9404f2a0 con 0x7fbf9410eab0 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.722+0000 7fbf9259c700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 0x7fbf80079c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.722+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbf7c062090 con 0x7fbf9410eab0 2026-03-10T06:27:07.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.722+0000 7fbf9259c700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 0x7fbf80079c00 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fbf8400b5c0 tx=0x7fbf84005fb0 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='client.34270 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='client.34274 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: pgmap v130: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 5.5 KiB/s wr, 11 op/s 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.001 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: reconnect by client.24325 192.168.123.104:0/950847804 after 0.00500001 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:reconnect 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:reconnect} 2 up:standby 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:07.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2437309443' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:07.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:07.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:07 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1320485135' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.920+0000 7fbf98dab700 1 -- 192.168.123.104:0/3144130054 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fbf9404ea50 con 0x7fbf9410eab0 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.921+0000 7fbf8b7fe700 1 -- 192.168.123.104:0/3144130054 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+297 (secure 0 0 0) 0x7fbf7c061eb0 con 0x7fbf9410eab0 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem is degraded; 1 filesystem with deprecated feature inline_data 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_DEGRADED: 1 filesystem is degraded 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs is degraded 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:27:07.921 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:27:07.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 msgr2=0x7fbf80079c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.927 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 0x7fbf80079c00 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fbf8400b5c0 tx=0x7fbf84005fb0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 msgr2=0x7fbf94112580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fbf7c00ec80 tx=0x7fbf7c00c5b0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 shutdown_connections 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbf80077750 0x7fbf80079c00 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 --2- 192.168.123.104:0/3144130054 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf94071b60 0x7fbf94117580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.927+0000 7fbf897fa700 1 --2- 192.168.123.104:0/3144130054 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf9410eab0 0x7fbf94112580 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:07.928 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.928+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 >> 192.168.123.104:0/3144130054 conn(0x7fbf9406c6c0 msgr2=0x7fbf94070280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:07.929 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.929+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 shutdown_connections 2026-03-10T06:27:07.930 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:07.929+0000 7fbf897fa700 1 -- 192.168.123.104:0/3144130054 wait complete. 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: from='client.34294 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:rejoin 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.104:6828/142010479,v1:192.168.123.104:6829/142010479] up:boot 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:rejoin} 3 up:standby 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: daemon mds.cephfs.vm06.afscws is now active in filesystem cephfs as rank 0 2026-03-10T06:27:08.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:08 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3144130054' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: from='client.34294 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: mds.? [v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:rejoin 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.104:6828/142010479,v1:192.168.123.104:6829/142010479] up:boot 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm06.afscws=up:rejoin} 3 up:standby 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: daemon mds.cephfs.vm06.afscws is now active in filesystem cephfs as rank 0 2026-03-10T06:27:08.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:08 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/3144130054' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: pgmap v131: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 5.2 KiB/s wr, 11 op/s 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:active 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 3 up:standby 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:09 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: pgmap v131: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 5.2 KiB/s wr, 11 op/s 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.106:6826/3120742985,v1:192.168.123.106:6827/3120742985] up:active 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 3 up:standby 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:09.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:09 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: pgmap v132: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 102 B/s wr, 9 op/s 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: Upgrade: Updating mds.cephfs.vm06.wzhqon 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:27:11.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:11 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:11.677 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: pgmap v132: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 102 B/s wr, 9 op/s 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: Upgrade: Updating mds.cephfs.vm06.wzhqon 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.wzhqon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:27:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:11 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:12.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:12 vm06.local ceph-mon[98962]: Deploying daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:27:12.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:12 vm04.local ceph-mon[115743]: Deploying daemon mds.cephfs.vm06.wzhqon on vm06 2026-03-10T06:27:13.532 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:13 vm04.local ceph-mon[115743]: pgmap v133: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 190 B/s wr, 9 op/s 2026-03-10T06:27:13.533 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:13 vm04.local ceph-mon[115743]: osdmap e80: 6 total, 6 up, 6 in 
2026-03-10T06:27:13.533 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:13 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 2 up:standby 2026-03-10T06:27:13.533 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:13 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:13 vm06.local ceph-mon[98962]: pgmap v133: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 190 B/s wr, 9 op/s 2026-03-10T06:27:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:13 vm06.local ceph-mon[98962]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T06:27:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:13 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 2 up:standby 2026-03-10T06:27:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:13 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:15.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:15 vm06.local ceph-mon[98962]: pgmap v135: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-10T06:27:15.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:15.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:15.369 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:15 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:15 vm04.local ceph-mon[115743]: pgmap v135: 65 pgs: 65 active+clean; 252 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-10T06:27:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:15 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:16.275 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:16 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.106:6824/2972586913,v1:192.168.123.106:6825/2972586913] up:boot 2026-03-10T06:27:16.275 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:16 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 3 up:standby 2026-03-10T06:27:16.275 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:16.275 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:16.275 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:16 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:16.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:16 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.106:6824/2972586913,v1:192.168.123.106:6825/2972586913] up:boot 2026-03-10T06:27:16.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:16 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm06.afscws=up:active} 3 up:standby 2026-03-10T06:27:16.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:16.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:16.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:16 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:17.819 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:17 vm06.local ceph-mon[98962]: pgmap v136: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 5.1 KiB/s wr, 9 op/s 2026-03-10T06:27:17.819 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:17.819 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:17 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:17 vm04.local ceph-mon[115743]: pgmap v136: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 5.1 KiB/s wr, 9 op/s 2026-03-10T06:27:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:17 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:17 vm04.local ceph-mon[115743]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:18.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:18.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:18 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-mon-vm04[115739]: 2026-03-10T06:27:18.713+0000 7f1d3a3f0640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:19.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 
vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.afscws", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 
2026-03-10T06:27:19.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:18 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: pgmap v137: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 542 KiB/s rd, 5.1 KiB/s wr, 6 op/s 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Upgrade: Updating mds.cephfs.vm06.afscws 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Deploying daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Standby daemon mds.cephfs.vm04.hdxbzv assigned to filesystem cephfs as rank 0 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:27:20.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:replay} 2 up:standby 2026-03-10T06:27:20.117 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: pgmap v137: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 542 KiB/s rd, 5.1 KiB/s wr, 6 op/s 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Upgrade: Updating mds.cephfs.vm06.afscws 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Deploying daemon mds.cephfs.vm06.afscws on vm06 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Standby daemon mds.cephfs.vm04.hdxbzv assigned to filesystem cephfs as rank 0 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T06:27:20.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:replay} 2 up:standby 2026-03-10T06:27:20.177 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:22.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:21 vm06.local ceph-mon[98962]: pgmap v139: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 6.1 KiB/s wr, 5 op/s 2026-03-10T06:27:22.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:21 vm04.local ceph-mon[115743]: pgmap v139: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 6.1 KiB/s wr, 5 op/s 2026-03-10T06:27:23.594 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:23 vm04.local ceph-mon[115743]: pgmap v140: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5.0 KiB/s wr, 8 op/s 2026-03-10T06:27:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:23 vm06.local ceph-mon[98962]: pgmap v140: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 5.0 KiB/s wr, 8 op/s 2026-03-10T06:27:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:24 vm06.local ceph-mon[98962]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.00400001 2026-03-10T06:27:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:24 vm06.local ceph-mon[98962]: reconnect by client.24325 192.168.123.104:0/950847804 after 0.00400001 2026-03-10T06:27:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:24 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:reconnect 2026-03-10T06:27:24.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:24 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:reconnect} 2 up:standby 2026-03-10T06:27:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:24 vm04.local ceph-mon[115743]: reconnect by client.24331 192.168.144.1:0/2593838473 after 0.00400001 2026-03-10T06:27:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:24 vm04.local ceph-mon[115743]: reconnect by client.24325 192.168.123.104:0/950847804 after 0.00400001 2026-03-10T06:27:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:24 vm04.local ceph-mon[115743]: mds.? [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:reconnect 2026-03-10T06:27:24.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:24 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:reconnect} 2 up:standby 2026-03-10T06:27:25.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: pgmap v141: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:rejoin 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:rejoin} 2 up:standby 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: daemon mds.cephfs.vm04.hdxbzv is now active in filesystem cephfs as rank 0 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:25.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:25 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: pgmap v141: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:rejoin 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: fsmap cephfs:1/1 {0=cephfs.vm04.hdxbzv=up:rejoin} 2 up:standby 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: daemon mds.cephfs.vm04.hdxbzv is now active in filesystem cephfs as rank 0 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:25 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: mds.? [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:active 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: mds.? 
[v2:192.168.123.106:6826/3728010036,v1:192.168.123.106:6827/3728010036] up:boot 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:26 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: mds.? [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] up:active 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: mds.? 
[v2:192.168.123.106:6826/3728010036,v1:192.168.123.106:6827/3728010036] up:boot 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:26.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:26 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: pgmap v142: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 8 op/s 2026-03-10T06:27:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hdxbzv"}]': finished 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hsrsig"}]': finished 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.afscws"}]': finished 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": 
"mds.cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.wzhqon"}]': finished 2026-03-10T06:27:27.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:27 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T06:27:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: pgmap v142: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 8 op/s 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hdxbzv"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hdxbzv"}]': finished 2026-03-10T06:27:27.678 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hsrsig"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.hsrsig"}]': finished 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.afscws"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.afscws"}]': finished 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.wzhqon"}]: dispatch 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.wzhqon"}]': finished 2026-03-10T06:27:27.678 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:27 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T06:27:28.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:28 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all mds 2026-03-10T06:27:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:28 vm06.local ceph-mon[98962]: Upgrade: Setting filesystem cephfs Joinable 2026-03-10T06:27:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:28 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T06:27:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:28 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:28.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:28 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-10T06:27:28.629 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:28 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all mds 2026-03-10T06:27:28.629 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:28 vm04.local ceph-mon[115743]: Upgrade: Setting filesystem cephfs Joinable 2026-03-10T06:27:28.629 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:28 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T06:27:28.629 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:28 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:28.629 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:28 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": 
"cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: pgmap v143: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 10 op/s 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]': finished 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:29.579 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.580 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:29 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: pgmap v143: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 10 op/s 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]': finished 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 3 up:standby 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:29 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all rgw 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: Upgrade: Updating ceph-exporter.vm04 (1/2) 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:31.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:30 vm04.local ceph-mon[115743]: Deploying daemon 
ceph-exporter.vm04 on vm04 2026-03-10T06:27:31.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all rgw 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: Upgrade: Updating ceph-exporter.vm04 (1/2) 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:31.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:30 vm06.local ceph-mon[98962]: Deploying daemon ceph-exporter.vm04 on vm04 2026-03-10T06:27:32.120 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:31 vm06.local ceph-mon[98962]: pgmap v144: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.5 KiB/s wr, 12 op/s 2026-03-10T06:27:32.120 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:31 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:32.120 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:31 vm06.local ceph-mon[98962]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:32.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:31 vm04.local ceph-mon[115743]: pgmap v144: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.5 KiB/s wr, 12 op/s 2026-03-10T06:27:32.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:32.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:31 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: Upgrade: Updating ceph-exporter.vm06 (2/2) 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: Deploying daemon ceph-exporter.vm06 on vm06 2026-03-10T06:27:33.118 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:32 vm06.local ceph-mon[98962]: pgmap v145: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 4.2 KiB/s wr, 13 op/s 
2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: Upgrade: Updating ceph-exporter.vm06 (2/2) 2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: Deploying daemon ceph-exporter.vm06 on vm06 2026-03-10T06:27:33.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:32 vm04.local ceph-mon[115743]: pgmap v145: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 4.2 KiB/s wr, 13 op/s 2026-03-10T06:27:34.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.178 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:34.179 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.179 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:33 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:34.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:34.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:33 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: pgmap v146: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:35.368 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:35 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: pgmap v146: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:35.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:35 
vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.846 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.846 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.846 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.846 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:36 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:36.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:36 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: pgmap v147: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.868 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:37 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: pgmap v147: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:37.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:37 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.012+0000 7f98bcfbf700 1 -- 192.168.123.104:0/3463442159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 msgr2=0x7f98b8102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.012+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/3463442159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b8102bf0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f98a0009b50 tx=0x7f98a0009e60 
comp rx=0 tx=0).stop 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 -- 192.168.123.104:0/3463442159 shutdown_connections 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/3463442159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b8102bf0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/3463442159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98b8108780 0x7f98b8108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 -- 192.168.123.104:0/3463442159 >> 192.168.123.104:0/3463442159 conn(0x7f98b80fe280 msgr2=0x7f98b8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:38.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 -- 192.168.123.104:0/3463442159 shutdown_connections 2026-03-10T06:27:38.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.013+0000 7f98bcfbf700 1 -- 192.168.123.104:0/3463442159 wait complete. 
2026-03-10T06:27:38.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 Processor -- start 2026-03-10T06:27:38.014 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 -- start start 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98b659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98b659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:55346/0 (socket says 192.168.123.104:55346) 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98b8108780 0x7f98b81988f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98b8198fd0 con 0x7f98b8102780 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98bcfbf700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f98b819cd60 con 0x7f98b8108780 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.014+0000 7f98b659c700 1 -- 192.168.123.104:0/404243471 learned_addr learned my addr 192.168.123.104:0/404243471 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98b659c700 1 -- 192.168.123.104:0/404243471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98b8108780 msgr2=0x7f98b81988f0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98b659c700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98b8108780 0x7f98b81988f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98b659c700 1 -- 192.168.123.104:0/404243471 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98a00097e0 con 0x7f98b8102780 2026-03-10T06:27:38.015 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98b659c700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f98a800ed20 tx=0x7f98a800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:38.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98a800cd70 con 0x7f98b8102780 2026-03-10T06:27:38.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98af7fe700 1 -- 
192.168.123.104:0/404243471 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f98a8004510 con 0x7f98b8102780 2026-03-10T06:27:38.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f98b819d040 con 0x7f98b8102780 2026-03-10T06:27:38.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.015+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f98b819d590 con 0x7f98b8102780 2026-03-10T06:27:38.017 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.016+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98a8003ea0 con 0x7f98b8102780 2026-03-10T06:27:38.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.017+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f98a8010640 con 0x7f98b8102780 2026-03-10T06:27:38.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.017+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9898005320 con 0x7f98b8102780 2026-03-10T06:27:38.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.017+0000 7f98af7fe700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 0x7f98a4079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.017+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== 
mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f98a8014070 con 0x7f98b8102780 2026-03-10T06:27:38.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.020+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98a8062b20 con 0x7f98b8102780 2026-03-10T06:27:38.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.021+0000 7f98affff700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 0x7f98a4079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:38.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.021+0000 7f98affff700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 0x7f98a4079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f98a0000c00 tx=0x7f98a0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:38.187 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.185+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9898000bf0 con 0x7f98a4077990 2026-03-10T06:27:38.187 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.186+0000 7f98af7fe700 1 -- 192.168.123.104:0/404243471 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f9898000bf0 con 0x7f98a4077990 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.191+0000 
7f98bcfbf700 1 -- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 msgr2=0x7f98a4079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.191+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 0x7f98a4079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f98a0000c00 tx=0x7f98a0005fb0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.191+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 msgr2=0x7f98b81983b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f98a800ed20 tx=0x7f98a800c5b0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 shutdown_connections 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f98a4077990 0x7f98a4079e40 secure :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f98a0000c00 tx=0x7f98a0005fb0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98b8102780 0x7f98b81983b0 unknown :-1 s=CLOSED 
pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 --2- 192.168.123.104:0/404243471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98b8108780 0x7f98b81988f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.192+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 >> 192.168.123.104:0/404243471 conn(0x7f98b80fe280 msgr2=0x7f98b80ffc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:38.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.193+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 shutdown_connections 2026-03-10T06:27:38.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.194+0000 7f98bcfbf700 1 -- 192.168.123.104:0/404243471 wait complete. 2026-03-10T06:27:38.276 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T06:27:38.449 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: Upgrade: Setting filesystem cephfs Joinable 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: fsmap cephfs:1 
{0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]': finished 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 
2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.540 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 
vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config 
rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 
2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:38.541 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.741+0000 7f9fb04a0700 1 -- 192.168.123.104:0/503339074 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8108780 msgr2=0x7f9fa8108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.741+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/503339074 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8108780 0x7f9fa8108b50 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f9f98009b50 tx=0x7f9f98009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.742+0000 7f9fb04a0700 1 -- 192.168.123.104:0/503339074 shutdown_connections 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.742+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/503339074 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8102780 0x7f9fa8102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.742+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/503339074 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8108780 0x7f9fa8108b50 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.742+0000 7f9fb04a0700 1 -- 192.168.123.104:0/503339074 >> 192.168.123.104:0/503339074 conn(0x7f9fa80fe280 msgr2=0x7f9fa8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.743+0000 7f9fb04a0700 1 -- 192.168.123.104:0/503339074 shutdown_connections 2026-03-10T06:27:38.743 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.743+0000 7f9fb04a0700 1 -- 192.168.123.104:0/503339074 wait complete. 
2026-03-10T06:27:38.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.743+0000 7f9fb04a0700 1 Processor -- start 2026-03-10T06:27:38.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fb04a0700 1 -- start start 2026-03-10T06:27:38.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fb04a0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fae23c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fae23c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:55368/0 (socket says 192.168.123.104:55368) 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fb04a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8108780 0x7f9fa8198950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fae23c700 1 -- 192.168.123.104:0/3065563728 learned_addr learned my addr 192.168.123.104:0/3065563728 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fb04a0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
-- mon_getmap magic: 0 v1 -- 0x7f9fa8199030 con 0x7f9fa8102780 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.744+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fa819cd70 con 0x7f9fa8108780 2026-03-10T06:27:38.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9fada3b700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8108780 0x7f9fa8198950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:38.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9fae23c700 1 -- 192.168.123.104:0/3065563728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8108780 msgr2=0x7f9fa8198950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9fae23c700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8108780 0x7f9fa8198950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9fae23c700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9f980097e0 con 0x7f9fa8102780 2026-03-10T06:27:38.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9fae23c700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9f980048f0 tx=0x7f9f98004920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9f9801d070 con 0x7f9fa8102780 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9f98004b70 con 0x7f9fa8102780 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.745+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9f9800f670 con 0x7f9fa8102780 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.746+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9fa819cff0 con 0x7f9fa8102780 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.746+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9fa819d400 con 0x7f9fa8102780 2026-03-10T06:27:38.747 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.747+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9fa810acf0 con 0x7f9fa8102780 2026-03-10T06:27:38.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.752+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9f9800bc30 con 0x7f9fa8102780 2026-03-10T06:27:38.753 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.752+0000 7f9f9f7fe700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 0x7f9f94079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:38.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.752+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f9f9809b9a0 con 0x7f9fa8102780 2026-03-10T06:27:38.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.752+0000 7f9fada3b700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 0x7f9f94079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:38.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.753+0000 7f9fada3b700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 0x7f9f94079e40 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f9fa81999e0 tx=0x7f9fa400b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:38.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.753+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9f980cba90 con 0x7f9fa8102780 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: Upgrade: Setting filesystem cephfs Joinable 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: fsmap cephfs:1 {0=cephfs.vm04.hdxbzv=up:active} 1 up:standby-replay 2 up:standby 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]': finished 
2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T06:27:38.868 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: 
from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": 
"client.rgw"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T06:27:38.869 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 
06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:27:38.869 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:38.879 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.875+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f9fa8199830 con 0x7f9f94077990 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.882+0000 7f9f9f7fe700 1 -- 192.168.123.104:0/3065563728 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f9fa8199830 con 0x7f9f94077990 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM 
LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (4m) 4s ago 10m 24.4M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (7s) 4s ago 10m 10.0M - 19.2.3-678-ge911bdeb 654f31e6858e 1f8c6c628bc5 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (5s) 5s ago 9m 9709k - 19.2.3-678-ge911bdeb 654f31e6858e 4b8b93e98e4c 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (3m) 4s ago 10m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (3m) 5s ago 9m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (4m) 4s ago 9m 94.6M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (42s) 4s ago 7m 99.7M - 19.2.3-678-ge911bdeb 654f31e6858e 481d5dbc696e 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (32s) 4s ago 7m 265M - 19.2.3-678-ge911bdeb 654f31e6858e 053559c6b509 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (14s) 5s ago 7m 17.6M - 19.2.3-678-ge911bdeb 654f31e6858e b155e769016e 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (23s) 5s ago 7m 19.1M - 19.2.3-678-ge911bdeb 654f31e6858e 366d0632406e 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (5m) 4s ago 10m 617M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running 
(5m) 5s ago 9m 496M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (3m) 4s ago 10m 67.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (3m) 5s ago 9m 58.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (5m) 4s ago 10m 10.6M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 5s ago 9m 9847k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (3m) 4s ago 9m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (2m) 4s ago 8m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6bc3525fe6f5 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (2m) 4s ago 8m 93.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 38220ba83a3f 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (99s) 5s ago 8m 170M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e91f44e1f660 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (77s) 5s ago 8m 107M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fea6c31251ba 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (56s) 5s ago 8m 96.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 15b85a82c8ae 2026-03-10T06:27:38.883 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (4m) 4s ago 9m 59.7M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.885+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 msgr2=0x7f9f94079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.885+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 0x7f9f94079e40 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f9fa81999e0 tx=0x7f9fa400b410 comp rx=0 tx=0).stop 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.885+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 msgr2=0x7f9fa8198410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.885+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9f980048f0 tx=0x7f9f98004920 comp rx=0 tx=0).stop 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 shutdown_connections 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9f94077990 0x7f9f94079e40 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9fa8102780 0x7f9fa8198410 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 --2- 192.168.123.104:0/3065563728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fa8108780 0x7f9fa8198950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 >> 192.168.123.104:0/3065563728 conn(0x7f9fa80fe280 msgr2=0x7f9fa80ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 shutdown_connections 2026-03-10T06:27:38.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:38.886+0000 7f9fb04a0700 1 -- 192.168.123.104:0/3065563728 wait complete. 2026-03-10T06:27:38.936 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-10T06:27:39.096 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 -- 192.168.123.104:0/2895196642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c068490 msgr2=0x7f163c068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 --2- 192.168.123.104:0/2895196642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c068490 0x7f163c068900 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f162c009b00 tx=0x7f162c009e10 comp 
rx=0 tx=0).stop 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 -- 192.168.123.104:0/2895196642 shutdown_connections 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 --2- 192.168.123.104:0/2895196642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c068490 0x7f163c068900 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 --2- 192.168.123.104:0/2895196642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f163c1013a0 0x7f163c101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.357+0000 7f1641129700 1 -- 192.168.123.104:0/2895196642 >> 192.168.123.104:0/2895196642 conn(0x7f163c0754a0 msgr2=0x7f163c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:39.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.358+0000 7f1641129700 1 -- 192.168.123.104:0/2895196642 shutdown_connections 2026-03-10T06:27:39.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.358+0000 7f1641129700 1 -- 192.168.123.104:0/2895196642 wait complete. 
2026-03-10T06:27:39.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 Processor -- start 2026-03-10T06:27:39.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 -- start start 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f163c068490 0x7f163c198400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f163c199020 con 0x7f163c1013a0 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f163a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f163a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:55380/0 (socket says 192.168.123.104:55380) 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f163a59c700 1 -- 192.168.123.104:0/2544187721 learned_addr learned my addr 
192.168.123.104:0/2544187721 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.359+0000 7f1641129700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f163c19cd40 con 0x7f163c068490 2026-03-10T06:27:39.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.360+0000 7f163a59c700 1 -- 192.168.123.104:0/2544187721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f163c068490 msgr2=0x7f163c198400 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:27:39.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.360+0000 7f163a59c700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f163c068490 0x7f163c198400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.360+0000 7f163a59c700 1 -- 192.168.123.104:0/2544187721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f162c0097e0 con 0x7f163c1013a0 2026-03-10T06:27:39.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.360+0000 7f163a59c700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f162c004930 tx=0x7f162c004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:39.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.360+0000 7f1633fff700 1 -- 192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f162c01d070 con 0x7f163c1013a0 2026-03-10T06:27:39.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.361+0000 7f1633fff700 1 -- 
192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f162c00bc50 con 0x7f163c1013a0 2026-03-10T06:27:39.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.361+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f163c19cfc0 con 0x7f163c1013a0 2026-03-10T06:27:39.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.361+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f163c19d4b0 con 0x7f163c1013a0 2026-03-10T06:27:39.362 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.361+0000 7f1633fff700 1 -- 192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f162c022620 con 0x7f163c1013a0 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.362+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f163c04ea50 con 0x7f163c1013a0 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.362+0000 7f1633fff700 1 -- 192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f162c00f580 con 0x7f163c1013a0 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.363+0000 7f1633fff700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 0x7f1628079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.363+0000 7f1633fff700 1 -- 
192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f162c09b470 con 0x7f163c1013a0 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.366+0000 7f1633fff700 1 -- 192.168.123.104:0/2544187721 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f162c063b70 con 0x7f163c1013a0 2026-03-10T06:27:39.366 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.366+0000 7f163ad9d700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 0x7f1628079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:39.367 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.366+0000 7f163ad9d700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 0x7f1628079d70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f16240097b0 tx=0x7f1624006d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.502+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f163c19d790 con 0x7f16280778c0 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.503+0000 7f1633fff700 1 -- 192.168.123.104:0/2544187721 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f163c19d790 con 0x7f16280778c0 2026-03-10T06:27:39.504 
INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "target_image": null, 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "in_progress": false, 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "which": "", 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "services_complete": [], 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "progress": null, 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "message": "", 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout: "is_paused": false 2026-03-10T06:27:39.504 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:27:39.506 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.506+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 msgr2=0x7f1628079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:39.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.506+0000 7f1641129700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 0x7f1628079d70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f16240097b0 tx=0x7f1624006d20 comp rx=0 tx=0).stop 2026-03-10T06:27:39.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.506+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 msgr2=0x7f163c198940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:39.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f162c004930 
tx=0x7f162c004a10 comp rx=0 tx=0).stop 2026-03-10T06:27:39.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 shutdown_connections 2026-03-10T06:27:39.507 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f16280778c0 0x7f1628079d70 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.508 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f163c068490 0x7f163c198400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.508 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 --2- 192.168.123.104:0/2544187721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f163c1013a0 0x7f163c198940 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:39.508 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.507+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 >> 192.168.123.104:0/2544187721 conn(0x7f163c0754a0 msgr2=0x7f163c0fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:39.508 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.508+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 shutdown_connections 2026-03-10T06:27:39.508 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:39.508+0000 7f1641129700 1 -- 192.168.123.104:0/2544187721 wait complete. 
2026-03-10T06:27:39.575 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-10T06:27:39.736 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: pgmap v148: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.1 KiB/s wr, 10 op/s 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all iscsi 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all nfs 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Setting container_image for all nvmeof 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Finalizing container_image settings 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: Upgrade: Complete! 
2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:39.765 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: pgmap v148: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.1 KiB/s wr, 10 op/s 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all iscsi 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all nfs 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Setting container_image for all nvmeof 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Finalizing container_image settings 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: Upgrade: Complete! 
2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:27:40.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.016+0000 7f15a500d700 1 -- 192.168.123.104:0/242622173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a0108810 msgr2=0x7f15a0108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.018 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.016+0000 7f15a500d700 1 --2- 192.168.123.104:0/242622173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a0108810 0x7f15a0108be0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1588009b50 tx=0x7f1588009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 -- 192.168.123.104:0/242622173 shutdown_connections 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 --2- 192.168.123.104:0/242622173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a0102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 --2- 192.168.123.104:0/242622173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a0108810 0x7f15a0108be0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 -- 192.168.123.104:0/242622173 >> 
192.168.123.104:0/242622173 conn(0x7f15a00fe330 msgr2=0x7f15a0100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 -- 192.168.123.104:0/242622173 shutdown_connections 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.018+0000 7f15a500d700 1 -- 192.168.123.104:0/242622173 wait complete. 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.019+0000 7f15a500d700 1 Processor -- start 2026-03-10T06:27:40.019 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.019+0000 7f15a500d700 1 -- start start 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.019+0000 7f15a500d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a019e420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.019+0000 7f15a500d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:55384/0 (socket says 
192.168.123.104:55384) 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 -- 192.168.123.104:0/1831027959 learned_addr learned my addr 192.168.123.104:0/1831027959 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.019+0000 7f15a500d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15a019ef70 con 0x7f15a019e960 2026-03-10T06:27:40.020 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15a0198a40 con 0x7f15a0102810 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 -- 192.168.123.104:0/1831027959 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 msgr2=0x7f15a019e420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159ed9d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a019e420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a019e420 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159e59c700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) 
v3 -- 0x7f15880097e0 con 0x7f15a019e960 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.020+0000 7f159ed9d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a019e420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.021+0000 7f159e59c700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f159000ebf0 tx=0x7f159000c2d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:40.021 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.021+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f159000cd00 con 0x7f15a019e960 2026-03-10T06:27:40.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.021+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f159000ce60 con 0x7f15a019e960 2026-03-10T06:27:40.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.021+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15a0198d20 con 0x7f15a019e960 2026-03-10T06:27:40.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.021+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15a0199270 con 0x7f15a019e960 2026-03-10T06:27:40.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.022+0000 7f1597fff700 1 -- 
192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15900049e0 con 0x7f15a019e960 2026-03-10T06:27:40.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.023+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1590018430 con 0x7f15a019e960 2026-03-10T06:27:40.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.023+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f15a010ad80 con 0x7f15a019e960 2026-03-10T06:27:40.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.024+0000 7f1597fff700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 0x7f158c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.024+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f1590014070 con 0x7f15a019e960 2026-03-10T06:27:40.024 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.024+0000 7f159ed9d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 0x7f158c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.027 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.027+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f15900627c0 con 0x7f15a019e960 2026-03-10T06:27:40.028 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.027+0000 7f159ed9d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 0x7f158c079d20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f1588009b20 tx=0x7f15880058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:40.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.197+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f15a0199a30 con 0x7f15a019e960 2026-03-10T06:27:40.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.198+0000 7f1597fff700 1 -- 192.168.123.104:0/1831027959 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f1590005740 con 0x7f15a019e960 2026-03-10T06:27:40.199 INFO:teuthology.orchestra.run.vm04.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:27:40.199 INFO:teuthology.orchestra.run.vm04.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T06:27:40.199 INFO:teuthology.orchestra.run.vm04.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.201+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 msgr2=0x7f158c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.201+0000 7f15a500d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 0x7f158c079d20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f1588009b20 tx=0x7f15880058e0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.201+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 msgr2=0x7f15a0198500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.201+0000 7f15a500d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f159000ebf0 tx=0x7f159000c2d0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 shutdown_connections 2026-03-10T06:27:40.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f158c077870 0x7f158c079d20 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 --2- 192.168.123.104:0/1831027959 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15a0102810 0x7f15a019e420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 --2- 192.168.123.104:0/1831027959 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15a019e960 0x7f15a0198500 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 >> 192.168.123.104:0/1831027959 conn(0x7f15a00fe330 msgr2=0x7f15a00ffe80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:40.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.202+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 shutdown_connections 2026-03-10T06:27:40.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.203+0000 7f15a500d700 1 -- 192.168.123.104:0/1831027959 wait complete. 
2026-03-10T06:27:40.274 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T06:27:40.442 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:40.529 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:40 vm04.local ceph-mon[115743]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:40.529 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:40 vm04.local ceph-mon[115743]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:40.529 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:40 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/1831027959' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.717+0000 7ff249c9c700 1 -- 192.168.123.104:0/1241155553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244106560 msgr2=0x7ff244106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.717+0000 7ff249c9c700 1 --2- 192.168.123.104:0/1241155553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244106560 0x7ff244106930 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7ff22c009b50 tx=0x7ff22c009e60 comp rx=0 tx=0).stop 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.718+0000 7ff249c9c700 1 -- 192.168.123.104:0/1241155553 shutdown_connections 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.718+0000 7ff249c9c700 1 --2- 192.168.123.104:0/1241155553 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244100540 0x7ff2441009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.718+0000 7ff249c9c700 1 --2- 192.168.123.104:0/1241155553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244106560 0x7ff244106930 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.718+0000 7ff249c9c700 1 -- 192.168.123.104:0/1241155553 >> 192.168.123.104:0/1241155553 conn(0x7ff2440fc000 msgr2=0x7ff2440fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:40.719 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.719+0000 7ff249c9c700 1 -- 192.168.123.104:0/1241155553 shutdown_connections 2026-03-10T06:27:40.719 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.719+0000 7ff249c9c700 1 -- 192.168.123.104:0/1241155553 wait complete. 2026-03-10T06:27:40.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.719+0000 7ff249c9c700 1 Processor -- start 2026-03-10T06:27:40.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.719+0000 7ff249c9c700 1 -- start start 2026-03-10T06:27:40.720 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff249c9c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 0x7ff2441963f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff249c9c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff249c9c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff244196f40 con 0x7ff244100540 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff249c9c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2441970b0 con 0x7ff244196930 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff242ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff242ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:50850/0 (socket says 192.168.123.104:50850) 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff242ffd700 1 -- 192.168.123.104:0/666028238 learned_addr learned my addr 192.168.123.104:0/666028238 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff242ffd700 1 -- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 msgr2=0x7ff2441963f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.720+0000 7ff2437fe700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 0x7ff2441963f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff242ffd700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 0x7ff2441963f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff242ffd700 1 -- 192.168.123.104:0/666028238 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff22c0097e0 con 0x7ff244196930 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff2437fe700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 0x7ff2441963f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:27:40.721 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff242ffd700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7ff23400d8d0 tx=0x7ff23400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff234009940 con 0x7ff244196930 2026-03-10T06:27:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff234010460 con 0x7ff244196930 2026-03-10T06:27:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff24419b3a0 con 0x7ff244196930 2026-03-10T06:27:40.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.721+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff24419b8f0 con 0x7ff244196930 2026-03-10T06:27:40.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.722+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff234009c50 con 0x7ff244196930 2026-03-10T06:27:40.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.722+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2441089f0 con 0x7ff244196930 2026-03-10T06:27:40.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.723+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff2340105d0 con 0x7ff244196930 2026-03-10T06:27:40.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.724+0000 7ff240ff9700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 0x7ff230079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:27:40.724 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.724+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff23409a030 con 0x7ff244196930 2026-03-10T06:27:40.725 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.724+0000 7ff2437fe700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 0x7ff230079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:27:40.725 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.724+0000 7ff2437fe700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 0x7ff230079d70 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7ff22c006010 tx=0x7ff22c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:27:40.728 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.728+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff2340627b0 con 0x7ff244196930 2026-03-10T06:27:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:40 vm06.local ceph-mon[98962]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:40 vm06.local ceph-mon[98962]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:27:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:40 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1831027959' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T06:27:40.902 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.902+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff24419bbd0 con 0x7ff244196930 2026-03-10T06:27:40.903 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.902+0000 7ff240ff9700 1 -- 192.168.123.104:0/666028238 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7ff234061f00 con 0x7ff244196930 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid 
(stable)": 2 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:27:40.905 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T06:27:40.906 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:27:40.906 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:27:40.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.906+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 msgr2=0x7ff230079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.906+0000 7ff249c9c700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 0x7ff230079d70 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7ff22c006010 tx=0x7ff22c0058e0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.907+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 msgr2=0x7ff24419ada0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:27:40.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.907+0000 7ff249c9c700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7ff23400d8d0 tx=0x7ff23400dc90 comp rx=0 tx=0).stop 2026-03-10T06:27:40.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.907+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 shutdown_connections 2026-03-10T06:27:40.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.907+0000 7ff249c9c700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff2300778c0 0x7ff230079d70 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.908+0000 7ff249c9c700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff244100540 0x7ff2441963f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.908+0000 7ff249c9c700 1 --2- 192.168.123.104:0/666028238 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff244196930 0x7ff24419ada0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:27:40.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.908+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 >> 192.168.123.104:0/666028238 conn(0x7ff2440fc000 msgr2=0x7ff244105810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:27:40.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.908+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 shutdown_connections 2026-03-10T06:27:40.909 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:27:40.908+0000 7ff249c9c700 1 -- 192.168.123.104:0/666028238 wait complete. 2026-03-10T06:27:40.977 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-10T06:27:41.140 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:41.345 INFO:teuthology.orchestra.run.vm04.stdout:wait for servicemap items w/ changing names to refresh 2026-03-10T06:27:41.383 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-10T06:27:41.549 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:27:41.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:41 vm04.local ceph-mon[115743]: pgmap v149: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.1 KiB/s wr, 8 op/s 2026-03-10T06:27:41.743 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:41 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/666028238' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:41 vm06.local ceph-mon[98962]: pgmap v149: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.1 KiB/s wr, 8 op/s 2026-03-10T06:27:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:41 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/666028238' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:27:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:43 vm06.local ceph-mon[98962]: pgmap v150: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 6 op/s 2026-03-10T06:27:43.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:43 vm04.local ceph-mon[115743]: pgmap v150: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 6 op/s 2026-03-10T06:27:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:45 vm06.local ceph-mon[98962]: pgmap v151: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 3 op/s 2026-03-10T06:27:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:45 vm04.local ceph-mon[115743]: pgmap v151: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 3 op/s 2026-03-10T06:27:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:47 vm06.local ceph-mon[98962]: pgmap v152: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 8.8 MiB/s rd, 3 op/s 2026-03-10T06:27:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:47 vm04.local ceph-mon[115743]: pgmap v152: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 8.8 MiB/s rd, 3 op/s 2026-03-10T06:27:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:49 vm06.local ceph-mon[98962]: pgmap v153: 65 pgs: 65 
active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s rd, 2 op/s 2026-03-10T06:27:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:49 vm04.local ceph-mon[115743]: pgmap v153: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s rd, 2 op/s 2026-03-10T06:27:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:27:51.784 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:51 vm04.local ceph-mon[115743]: pgmap v154: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 456 KiB/s rd, 1 op/s 2026-03-10T06:27:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:51 vm06.local ceph-mon[98962]: pgmap v154: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 456 KiB/s rd, 1 op/s 2026-03-10T06:27:53.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:53 vm06.local ceph-mon[98962]: pgmap v155: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 342 KiB/s rd, 1 op/s 2026-03-10T06:27:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:53 vm04.local ceph-mon[115743]: pgmap v155: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 342 KiB/s rd, 1 op/s 2026-03-10T06:27:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:55 vm06.local ceph-mon[98962]: pgmap v156: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:27:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:55 
vm04.local ceph-mon[115743]: pgmap v156: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:27:57.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:57 vm06.local ceph-mon[98962]: pgmap v157: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:27:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:57 vm04.local ceph-mon[115743]: pgmap v157: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:27:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:27:59 vm06.local ceph-mon[98962]: pgmap v158: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:27:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:27:59 vm04.local ceph-mon[115743]: pgmap v158: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:01.837 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:01 vm06.local ceph-mon[98962]: pgmap v159: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:01.886 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:01 vm04.local ceph-mon[115743]: pgmap v159: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:03.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:03 vm06.local ceph-mon[98962]: pgmap v160: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:03.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:03 vm04.local ceph-mon[115743]: pgmap v160: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:04.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:04 vm06.local 
ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:05.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:05 vm06.local ceph-mon[98962]: pgmap v161: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:05.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:05 vm04.local ceph-mon[115743]: pgmap v161: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:08.490 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:08 vm04.local ceph-mon[115743]: pgmap v162: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:08.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:08 vm06.local ceph-mon[98962]: pgmap v162: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:09.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:09 vm06.local ceph-mon[98962]: pgmap v163: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:09.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:09 vm04.local ceph-mon[115743]: pgmap v163: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:11 vm06.local ceph-mon[98962]: pgmap v164: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:11.677 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:11 vm04.local ceph-mon[115743]: pgmap v164: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:13 vm06.local ceph-mon[98962]: pgmap v165: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 1 op/s 2026-03-10T06:28:13.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:13 vm04.local ceph-mon[115743]: pgmap v165: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 1 op/s 2026-03-10T06:28:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:15 vm06.local ceph-mon[98962]: pgmap v166: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:15 vm04.local ceph-mon[115743]: pgmap v166: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-10T06:28:17.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:17 vm06.local ceph-mon[98962]: pgmap v167: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:17 vm04.local ceph-mon[115743]: pgmap v167: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:19 vm06.local ceph-mon[98962]: pgmap v168: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T06:28:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:19 vm04.local ceph-mon[115743]: pgmap v168: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:21.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:21 vm06.local ceph-mon[98962]: pgmap v169: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:21.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:21 vm04.local ceph-mon[115743]: pgmap v169: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:23.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:23 vm06.local ceph-mon[98962]: pgmap v170: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:23.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:23 vm04.local ceph-mon[115743]: pgmap v170: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:25 vm06.local ceph-mon[98962]: pgmap v171: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:25.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:25 vm04.local ceph-mon[115743]: pgmap v171: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:27.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:27 vm06.local ceph-mon[98962]: pgmap v172: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s 
rd, 2 op/s 2026-03-10T06:28:27.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:27 vm04.local ceph-mon[115743]: pgmap v172: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:29.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:29 vm06.local ceph-mon[98962]: pgmap v173: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:29.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:29 vm04.local ceph-mon[115743]: pgmap v173: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:31.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:31 vm06.local ceph-mon[98962]: pgmap v174: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:31.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:31 vm04.local ceph-mon[115743]: pgmap v174: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:33 vm06.local ceph-mon[98962]: pgmap v175: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:33.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:33 vm04.local ceph-mon[115743]: pgmap v175: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:34.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:34.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": 
"osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:35 vm06.local ceph-mon[98962]: pgmap v176: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:35.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:35 vm04.local ceph-mon[115743]: pgmap v176: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:37 vm06.local ceph-mon[98962]: pgmap v177: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:37 vm04.local ceph-mon[115743]: pgmap v177: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:28:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:38.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:38 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:38.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:28:38.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:38.927 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:38 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:39.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:39 vm04.local ceph-mon[115743]: pgmap v178: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:39.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:28:39.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:28:39.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:39 vm06.local ceph-mon[98962]: pgmap v178: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:28:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:28:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:28:41.815 DEBUG:teuthology.orchestra.run.vm04:> 
sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T06:28:41.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:41 vm04.local ceph-mon[115743]: pgmap v179: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:41.973 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:41 vm06.local ceph-mon[98962]: pgmap v179: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:42.253 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.252+0000 7f5beea64700 1 -- 192.168.123.104:0/2214922823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 msgr2=0x7f5be8108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:42.255 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.252+0000 7f5beea64700 1 --2- 192.168.123.104:0/2214922823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be8108b50 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f5bd8009b00 tx=0x7f5bd8009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 -- 192.168.123.104:0/2214922823 shutdown_connections 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 --2- 192.168.123.104:0/2214922823 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be8102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.256 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 --2- 192.168.123.104:0/2214922823 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be8108b50 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 -- 192.168.123.104:0/2214922823 >> 192.168.123.104:0/2214922823 conn(0x7f5be80fe280 msgr2=0x7f5be8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 -- 192.168.123.104:0/2214922823 shutdown_connections 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.254+0000 7f5beea64700 1 -- 192.168.123.104:0/2214922823 wait complete. 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 Processor -- start 2026-03-10T06:28:42.256 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 -- start start 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be80782e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5be819d530 con 0x7f5be8108780 2026-03-10T06:28:42.257 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.255+0000 7f5beea64700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5be819d6a0 con 0x7f5be8102780 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5be7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be80782e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:45536/0 (socket says 192.168.123.104:45536) 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 -- 192.168.123.104:0/44689781 learned_addr learned my addr 192.168.123.104:0/44689781 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:42.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 -- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 msgr2=0x7f5be80782e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 --2- 192.168.123.104:0/44689781 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be80782e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 -- 192.168.123.104:0/44689781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5bd80097e0 con 0x7f5be8102780 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5be7fff700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be80782e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.256+0000 7f5bec800700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f5bd8009fd0 tx=0x7f5bd8004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.257+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bd801d070 con 0x7f5be8102780 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.257+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5be8078880 con 0x7f5be8102780 2026-03-10T06:28:42.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.257+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5be8078d70 con 
0x7f5be8102780 2026-03-10T06:28:42.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.257+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5bd8004b80 con 0x7f5be8102780 2026-03-10T06:28:42.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.257+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bd800f700 con 0x7f5be8102780 2026-03-10T06:28:42.260 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.258+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5bd800bc50 con 0x7f5be8102780 2026-03-10T06:28:42.260 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.259+0000 7f5be5ffb700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 0x7f5bd0079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.260 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.259+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5bd809af80 con 0x7f5be8102780 2026-03-10T06:28:42.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.259+0000 7f5be7fff700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 0x7f5bd0079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.259+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5bd4005320 con 0x7f5be8102780 2026-03-10T06:28:42.261 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.260+0000 7f5be7fff700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 0x7f5bd0079d70 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f5be8195600 tx=0x7f5bdc00a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:42.266 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.264+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5bd80637b0 con 0x7f5be8102780 2026-03-10T06:28:42.395 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.393+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 --> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f5bd4000bf0 con 0x7f5bd00778c0 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.400+0000 7f5be5ffb700 1 -- 192.168.123.104:0/44689781 <== mgr.34104 v2:192.168.123.104:6800/1421430943 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f5bd4000bf0 con 0x7f5bd00778c0 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:alertmanager.vm04 vm04 *:9093,9094 running (5m) 67s ago 11m 24.4M - 0.25.0 c8568f914cd2 85edc8fe2fc1 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm04 vm04 running (71s) 67s ago 11m 10.0M - 19.2.3-678-ge911bdeb 654f31e6858e 
1f8c6c628bc5 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:ceph-exporter.vm06 vm06 running (69s) 68s ago 10m 9709k - 19.2.3-678-ge911bdeb 654f31e6858e 4b8b93e98e4c 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm04 vm04 running (4m) 67s ago 11m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e 330b1d951bd0 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:crash.vm06 vm06 running (4m) 68s ago 10m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e d5aafc4fb1bb 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:grafana.vm04 vm04 *:3000 running (5m) 67s ago 10m 94.6M - 10.4.0 c8b91775d855 28b34ae2f2b0 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hdxbzv vm04 running (105s) 67s ago 8m 99.7M - 19.2.3-678-ge911bdeb 654f31e6858e 481d5dbc696e 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm04.hsrsig vm04 running (95s) 67s ago 8m 265M - 19.2.3-678-ge911bdeb 654f31e6858e 053559c6b509 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.afscws vm06 running (78s) 68s ago 8m 17.6M - 19.2.3-678-ge911bdeb 654f31e6858e b155e769016e 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mds.cephfs.vm06.wzhqon vm06 running (87s) 68s ago 8m 19.1M - 19.2.3-678-ge911bdeb 654f31e6858e 366d0632406e 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm04.exdvdb vm04 *:8443,9283,8765 running (6m) 67s ago 11m 617M - 19.2.3-678-ge911bdeb 654f31e6858e 640a9d30421c 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mgr.vm06.wwotdr vm06 *:8443,9283,8765 running (6m) 68s ago 10m 496M - 19.2.3-678-ge911bdeb 654f31e6858e 0f98de364d6a 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm04 vm04 running (4m) 67s ago 11m 67.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cf1d92823378 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:mon.vm06 vm06 running (4m) 68s ago 10m 
58.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 0f90bc9a714a 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm04 vm04 *:9100 running (6m) 67s ago 11m 10.6M - 1.7.0 72c9c2088986 f88b18573eef 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:node-exporter.vm06 vm06 *:9100 running (5m) 68s ago 10m 9847k - 1.7.0 72c9c2088986 32cea90d1988 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.0 vm04 running (4m) 67s ago 10m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e df697b82ad51 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.1 vm04 running (3m) 67s ago 9m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6bc3525fe6f5 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.2 vm04 running (3m) 67s ago 9m 93.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 38220ba83a3f 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.3 vm06 running (2m) 68s ago 9m 170M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e91f44e1f660 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.4 vm06 running (2m) 68s ago 9m 107M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fea6c31251ba 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:osd.5 vm06 running (2m) 68s ago 9m 96.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 15b85a82c8ae 2026-03-10T06:28:42.402 INFO:teuthology.orchestra.run.vm04.stdout:prometheus.vm04 vm04 *:9095 running (5m) 67s ago 10m 59.7M - 2.51.0 1d3b7f56885b 9e491f823407 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.403+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 msgr2=0x7f5bd0079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.403+0000 7f5beea64700 1 --2- 192.168.123.104:0/44689781 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 0x7f5bd0079d70 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f5be8195600 tx=0x7f5bdc00a380 comp rx=0 tx=0).stop 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.403+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 msgr2=0x7f5be819cec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.403+0000 7f5beea64700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f5bd8009fd0 tx=0x7f5bd8004930 comp rx=0 tx=0).stop 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 shutdown_connections 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5bd00778c0 0x7f5bd0079d70 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5be8102780 0x7f5be819cec0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 --2- 192.168.123.104:0/44689781 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5be8108780 0x7f5be80782e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.405 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 >> 192.168.123.104:0/44689781 conn(0x7f5be80fe280 msgr2=0x7f5be80ffa60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 shutdown_connections 2026-03-10T06:28:42.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.404+0000 7f5beea64700 1 -- 192.168.123.104:0/44689781 wait complete. 2026-03-10T06:28:42.452 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T06:28:42.611 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:42.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.887+0000 7f1917e4f700 1 -- 192.168.123.104:0/3981994738 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 msgr2=0x7f1910100a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:42.890 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.887+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3981994738 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f1910100a00 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f190c009b00 tx=0x7f190c009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:42.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.889+0000 7f1917e4f700 1 -- 192.168.123.104:0/3981994738 shutdown_connections 2026-03-10T06:28:42.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.889+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3981994738 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f1910100a00 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.889+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3981994738 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f1910106980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.889+0000 7f1917e4f700 1 -- 192.168.123.104:0/3981994738 >> 192.168.123.104:0/3981994738 conn(0x7f19100fc090 msgr2=0x7f19100fe4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:42.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.889+0000 7f1917e4f700 1 -- 192.168.123.104:0/3981994738 shutdown_connections 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 -- 192.168.123.104:0/3981994738 wait complete. 
2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 Processor -- start 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 -- start start 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f19100727e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f191006dd20 con 0x7f1910100590 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.890+0000 7f1917e4f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f191006de90 con 0x7f19101065b0 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:45552/0 (socket says 192.168.123.104:45552) 2026-03-10T06:28:42.892 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 -- 192.168.123.104:0/3594368754 learned_addr learned my addr 192.168.123.104:0/3594368754 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 -- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 msgr2=0x7f19100727e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f1915beb700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f19100727e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f19100727e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f19153ea700 1 -- 192.168.123.104:0/3594368754 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f190c0097e0 con 0x7f19101065b0 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.891+0000 7f1915beb700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f19100727e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.892+0000 7f19153ea700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f190c000c00 tx=0x7f190c004a20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:42.893 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.892+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f190c01d070 con 0x7f19101065b0 2026-03-10T06:28:42.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.892+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f191006e110 con 0x7f19101065b0 2026-03-10T06:28:42.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.892+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f191006e5a0 con 0x7f19101065b0 2026-03-10T06:28:42.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.893+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f190c004500 con 0x7f19101065b0 2026-03-10T06:28:42.894 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.893+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f190c003e80 con 0x7f19101065b0 2026-03-10T06:28:42.895 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.893+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f191004ea50 con 0x7f19101065b0 2026-03-10T06:28:42.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.894+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f190c004020 con 0x7f19101065b0 2026-03-10T06:28:42.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.895+0000 7f1902ffd700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 0x7f18fc07e1d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:42.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.895+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f190c09b850 con 0x7f19101065b0 2026-03-10T06:28:42.896 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.895+0000 7f1915beb700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 0x7f18fc07e1d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:42.897 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.896+0000 7f1915beb700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 0x7f18fc07e1d0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f1904006fd0 tx=0x7f1904008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:42.898 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:42.897+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f190c064050 con 0x7f19101065b0 2026-03-10T06:28:43.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.062+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f19101a6d80 con 0x7f19101065b0 2026-03-10T06:28:43.065 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.063+0000 7f1902ffd700 1 -- 192.168.123.104:0/3594368754 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f190c0637a0 con 0x7f19101065b0 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "mon": { 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "mgr": { 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "osd": { 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "mds": { 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: }, 
2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "overall": { 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T06:28:43.066 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 msgr2=0x7f18fc07e1d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 0x7f18fc07e1d0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f1904006fd0 tx=0x7f1904008040 comp rx=0 tx=0).stop 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 msgr2=0x7f191006d7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f190c000c00 tx=0x7f190c004a20 comp rx=0 tx=0).stop 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 shutdown_connections 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 --2- 
192.168.123.104:0/3594368754 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f18fc07bd20 0x7f18fc07e1d0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1910100590 0x7f19100727e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 --2- 192.168.123.104:0/3594368754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19101065b0 0x7f191006d7e0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.068 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 >> 192.168.123.104:0/3594368754 conn(0x7f19100fc090 msgr2=0x7f19100fd8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 shutdown_connections 2026-03-10T06:28:43.069 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.067+0000 7f1917e4f700 1 -- 192.168.123.104:0/3594368754 wait complete. 
2026-03-10T06:28:43.138 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-10T06:28:43.304 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:43.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.561+0000 7faef3066700 1 -- 192.168.123.104:0/1182355946 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec0730f0 msgr2=0x7faeec0734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:43.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.561+0000 7faef3066700 1 --2- 192.168.123.104:0/1182355946 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec0730f0 0x7faeec0734c0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7faedc009b50 tx=0x7faedc009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:43.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 -- 192.168.123.104:0/1182355946 shutdown_connections 2026-03-10T06:28:43.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 --2- 192.168.123.104:0/1182355946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faeec073a00 0x7faeec110ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 --2- 192.168.123.104:0/1182355946 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec0730f0 0x7faeec0734c0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.564 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 -- 192.168.123.104:0/1182355946 >> 192.168.123.104:0/1182355946 conn(0x7faeec0fc000 msgr2=0x7faeec0fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 -- 192.168.123.104:0/1182355946 shutdown_connections 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 -- 192.168.123.104:0/1182355946 wait complete. 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.562+0000 7faef3066700 1 Processor -- start 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faef3066700 1 -- start start 2026-03-10T06:28:43.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faef3066700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faeec0730f0 0x7faeec1a25a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faef3066700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 0x7faeec1a2ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faef3066700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faeec1a3170 con 0x7faeec073a00 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faef3066700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faeec19c620 con 0x7faeec0730f0 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.563+0000 7faeebfff700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 0x7faeec1a2ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 0x7faeec1a2ae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:42022/0 (socket says 192.168.123.104:42022) 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 -- 192.168.123.104:0/2984938330 learned_addr learned my addr 192.168.123.104:0/2984938330 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 -- 192.168.123.104:0/2984938330 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faeec0730f0 msgr2=0x7faeec1a25a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faeec0730f0 0x7faeec1a25a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 -- 192.168.123.104:0/2984938330 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faedc0097e0 con 0x7faeec073a00 2026-03-10T06:28:43.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.564+0000 7faeebfff700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 
0x7faeec1a2ae0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7faee000ed70 tx=0x7faee000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:43.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.565+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faee000cd70 con 0x7faeec073a00 2026-03-10T06:28:43.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.565+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7faee0010910 con 0x7faeec073a00 2026-03-10T06:28:43.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.565+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faeec19c900 con 0x7faeec073a00 2026-03-10T06:28:43.567 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.565+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faee0018a10 con 0x7faeec073a00 2026-03-10T06:28:43.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.565+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faeec19ce50 con 0x7faeec073a00 2026-03-10T06:28:43.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.567+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faeec10e770 con 0x7faeec073a00 2026-03-10T06:28:43.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.567+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faee0018b70 con 0x7faeec073a00 2026-03-10T06:28:43.572 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.571+0000 7faee9ffb700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 0x7faed4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:43.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.571+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7faee0014070 con 0x7faeec073a00 2026-03-10T06:28:43.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.571+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faee009c990 con 0x7faeec073a00 2026-03-10T06:28:43.573 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.571+0000 7faef0e02700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 0x7faed4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:43.574 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.572+0000 7faef0e02700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 0x7faed4079d70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7faedc000c00 tx=0x7faedc005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:43.744 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:43 vm04.local ceph-mon[115743]: pgmap v180: 65 pgs: 
65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:43.745 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:43 vm04.local ceph-mon[115743]: from='client.44267 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:28:43.745 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:43 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/3594368754' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:43.745 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.742+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7faeec19db30 con 0x7faeec073a00 2026-03-10T06:28:43.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.745+0000 7faee9ffb700 1 -- 192.168.123.104:0/2984938330 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7faee0062db0 con 0x7faeec073a00 2026-03-10T06:28:43.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.747+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 msgr2=0x7faed4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:43.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.747+0000 7faef3066700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 0x7faed4079d70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7faedc000c00 tx=0x7faedc005fb0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.747+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7faeec073a00 msgr2=0x7faeec1a2ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:43.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.747+0000 7faef3066700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 0x7faeec1a2ae0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7faee000ed70 tx=0x7faee000c5b0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.749 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 shutdown_connections 2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7faed40778c0 0x7faed4079d70 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faeec0730f0 0x7faeec1a25a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 --2- 192.168.123.104:0/2984938330 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7faeec073a00 0x7faeec1a2ae0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 >> 192.168.123.104:0/2984938330 conn(0x7faeec0fc000 msgr2=0x7faeec102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.748+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 shutdown_connections 
2026-03-10T06:28:43.750 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:43.749+0000 7faef3066700 1 -- 192.168.123.104:0/2984938330 wait complete. 2026-03-10T06:28:43.760 INFO:teuthology.orchestra.run.vm04.stdout:true 2026-03-10T06:28:43.802 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-10T06:28:43.953 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:43 vm06.local ceph-mon[98962]: pgmap v180: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:43 vm06.local ceph-mon[98962]: from='client.44267 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T06:28:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:43 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/3594368754' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:44.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.275+0000 7ff22cc67700 1 -- 192.168.123.104:0/1537804033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 msgr2=0x7ff228071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:44.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.275+0000 7ff22cc67700 1 --2- 192.168.123.104:0/1537804033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228071fd0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff21c009b50 tx=0x7ff21c009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:44.277 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.276+0000 7ff22cc67700 1 -- 192.168.123.104:0/1537804033 shutdown_connections 2026-03-10T06:28:44.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.276+0000 7ff22cc67700 1 --2- 192.168.123.104:0/1537804033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228071fd0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.276+0000 7ff22cc67700 1 --2- 192.168.123.104:0/1537804033 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff22810ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.278 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.276+0000 7ff22cc67700 1 -- 192.168.123.104:0/1537804033 >> 192.168.123.104:0/1537804033 conn(0x7ff22806c6c0 msgr2=0x7ff22806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.277+0000 7ff22cc67700 1 -- 192.168.123.104:0/1537804033 shutdown_connections 2026-03-10T06:28:44.279 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.277+0000 7ff22cc67700 1 -- 192.168.123.104:0/1537804033 wait complete. 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.277+0000 7ff22cc67700 1 Processor -- start 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.277+0000 7ff22cc67700 1 -- start start 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.277+0000 7ff22cc67700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228119570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22cc67700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff228114570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22cc67700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff228114ab0 con 0x7ff228071b60 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22cc67700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff228114c20 con 0x7ff22810eab0 2026-03-10T06:28:44.279 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228119570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228119570 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35698/0 (socket says 192.168.123.104:35698) 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff22659c700 1 -- 192.168.123.104:0/631692149 learned_addr learned my addr 192.168.123.104:0/631692149 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.278+0000 7ff225d9b700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff228114570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.279+0000 7ff225d9b700 1 -- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 msgr2=0x7ff228119570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.279+0000 7ff225d9b700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228119570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.279+0000 7ff225d9b700 1 -- 192.168.123.104:0/631692149 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff21c0097e0 con 0x7ff22810eab0 2026-03-10T06:28:44.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.279+0000 7ff225d9b700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff228114570 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto 
rx=0x7ff21c009b50 tx=0x7ff21c0048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:44.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.281+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff21c01d070 con 0x7ff22810eab0 2026-03-10T06:28:44.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.281+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff21c022470 con 0x7ff22810eab0 2026-03-10T06:28:44.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.282+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff21c00f670 con 0x7ff22810eab0 2026-03-10T06:28:44.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.282+0000 7ff22cc67700 1 -- 192.168.123.104:0/631692149 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff228114e50 con 0x7ff22810eab0 2026-03-10T06:28:44.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.282+0000 7ff22cc67700 1 -- 192.168.123.104:0/631692149 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2281152c0 con 0x7ff22810eab0 2026-03-10T06:28:44.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.283+0000 7ff22cc67700 1 -- 192.168.123.104:0/631692149 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff22804f2a0 con 0x7ff22810eab0 2026-03-10T06:28:44.285 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.284+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7ff21c00ba40 con 0x7ff22810eab0 2026-03-10T06:28:44.286 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.284+0000 7ff2177fe700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 0x7ff210079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:44.286 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.284+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff21c09bb60 con 0x7ff22810eab0 2026-03-10T06:28:44.286 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.285+0000 7ff22659c700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 0x7ff210079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:44.286 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.285+0000 7ff22659c700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 0x7ff210079bc0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff218009f10 tx=0x7ff218009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:44.288 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.287+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff21c064360 con 0x7ff22810eab0 2026-03-10T06:28:44.462 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.460+0000 7ff22cc67700 1 -- 192.168.123.104:0/631692149 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff22804ea50 con 0x7ff22810eab0 2026-03-10T06:28:44.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.462+0000 7ff2177fe700 1 -- 192.168.123.104:0/631692149 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7ff21c00bcf0 con 0x7ff22810eab0 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.466+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 msgr2=0x7ff210079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.466+0000 7ff2157fa700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 0x7ff210079bc0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff218009f10 tx=0x7ff218009450 comp rx=0 tx=0).stop 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.466+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 msgr2=0x7ff228114570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff228114570 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7ff21c009b50 tx=0x7ff21c0048c0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 shutdown_connections 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 --2- 192.168.123.104:0/631692149 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7ff210077710 0x7ff210079bc0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff228071b60 0x7ff228119570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 --2- 192.168.123.104:0/631692149 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff22810eab0 0x7ff228114570 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:44.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.467+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 >> 192.168.123.104:0/631692149 conn(0x7ff22806c6c0 msgr2=0x7ff228070370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:44.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.468+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 shutdown_connections 2026-03-10T06:28:44.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:44.468+0000 7ff2157fa700 1 -- 192.168.123.104:0/631692149 wait complete. 2026-03-10T06:28:44.483 INFO:teuthology.orchestra.run.vm04.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T06:28:44.549 DEBUG:teuthology.parallel:result is None 2026-03-10T06:28:44.549 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T06:28:44.552 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm04.local 2026-03-10T06:28:44.552 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- bash -c 'ceph fs dump' 2026-03-10T06:28:44.723 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:44.787 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:44 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2984938330' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:44.787 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:44 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/631692149' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.006+0000 7f92d4304700 1 -- 192.168.123.104:0/3275238966 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc068490 msgr2=0x7f92cc068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.006+0000 7f92d4304700 1 --2- 192.168.123.104:0/3275238966 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc068490 0x7f92cc068900 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f92c8009b50 tx=0x7f92c8009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 -- 192.168.123.104:0/3275238966 shutdown_connections 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 --2- 192.168.123.104:0/3275238966 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc068490 0x7f92cc068900 unknown 
:-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 --2- 192.168.123.104:0/3275238966 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc1013a0 0x7f92cc101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 -- 192.168.123.104:0/3275238966 >> 192.168.123.104:0/3275238966 conn(0x7f92cc0754a0 msgr2=0x7f92cc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 -- 192.168.123.104:0/3275238966 shutdown_connections 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.007+0000 7f92d4304700 1 -- 192.168.123.104:0/3275238966 wait complete. 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 Processor -- start 2026-03-10T06:28:45.009 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 -- start start 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc1013a0 0x7f92cc198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f92cc198f10 con 0x7f92cc1013a0 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d4304700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92cc19cca0 con 0x7f92cc068490 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.008+0000 7f92d20a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52588/0 (socket says 192.168.123.104:52588) 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 -- 192.168.123.104:0/3514411855 learned_addr learned my addr 192.168.123.104:0/3514411855 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d189f700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc1013a0 0x7f92cc198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.010 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 -- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc1013a0 msgr2=0x7f92cc198830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.010 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc1013a0 0x7f92cc198830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 -- 192.168.123.104:0/3514411855 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92c0009710 con 0x7f92cc068490 2026-03-10T06:28:45.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d20a0700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f92c000ec80 tx=0x7f92c000c5b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:45.011 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c000cd50 con 0x7f92cc068490 2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d189f700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92cc1013a0 0x7f92cc198830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92c80097e0 con 0x7f92cc068490 2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.009+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92cc19d340 con 0x7f92cc068490 2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.010+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f92c000ceb0 con 0x7f92cc068490 2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.010+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c0005380 con 0x7f92cc068490 2026-03-10T06:28:45.012 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.011+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f92c0005630 con 0x7f92cc068490 2026-03-10T06:28:45.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.011+0000 7f92bf7fe700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 0x7f92b80825e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.011+0000 7f92d189f700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 0x7f92b80825e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.011+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f92c0014070 con 0x7f92cc068490 2026-03-10T06:28:45.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.012+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f92cc04ea50 con 0x7f92cc068490 2026-03-10T06:28:45.013 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.012+0000 7f92d189f700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 0x7f92b80825e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f92c800b5c0 tx=0x7f92c8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:45.016 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.015+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f92c00621c0 con 0x7f92cc068490 2026-03-10T06:28:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:44 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2984938330' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:44 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/631692149' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T06:28:45.170 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.168+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f92cc19cf30 con 0x7f92cc068490 2026-03-10T06:28:45.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.169+0000 7f92bf7fe700 1 -- 192.168.123.104:0/3514411855 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 34 v34) v1 ==== 76+0+2002 (secure 0 0 0) 0x7f92c001c070 con 0x7f92cc068490 2026-03-10T06:28:45.173 INFO:teuthology.orchestra.run.vm04.stdout:e34 2026-03-10T06:28:45.173 INFO:teuthology.orchestra.run.vm04.stdout:btime 2026-03-10T06:27:38:067931+0000 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:legacy client fscid: 1 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:Filesystem 'cephfs' (1) 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:fs_name cephfs 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:epoch 34 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:created 2026-03-10T06:19:48.407965+0000 2026-03-10T06:28:45.174 
INFO:teuthology.orchestra.run.vm04.stdout:modified 2026-03-10T06:27:37.147748+0000 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:tableserver 0 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:root 0 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:session_timeout 60 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:session_autoclose 300 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:max_file_size 1099511627776 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:max_xattr_size 65536 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:required_client_features {} 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:last_failure 0 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:last_failure_osd_epoch 81 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:max_mds 1 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:in 0 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:up {0=34266} 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:failed 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:damaged 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:stopped 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:data_pools [3] 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:metadata_pool 2 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:inline_data enabled 2026-03-10T06:28:45.174 
INFO:teuthology.orchestra.run.vm04.stdout:balancer 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:bal_rank_mask -1 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:standby_count_wanted 1 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:qdb_cluster leader: 34266 members: 34266 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hdxbzv{0:34266} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.104:6826/2103633514,v1:192.168.123.104:6827/2103633514] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm04.hsrsig{0:34286} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6828/142010479,v1:192.168.123.104:6829/142010479] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:Standby daemons: 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.wzhqon{-1:44245} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/2972586913,v1:192.168.123.106:6825/2972586913] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T06:28:45.174 INFO:teuthology.orchestra.run.vm04.stdout:[mds.cephfs.vm06.afscws{-1:44249} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6826/3728010036,v1:192.168.123.106:6827/3728010036] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T06:28:45.175 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.173+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 msgr2=0x7f92b80825e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.175 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 0x7f92b80825e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f92c800b5c0 tx=0x7f92c8005fb0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.175 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 msgr2=0x7f92cc1982f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.175 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f92c000ec80 tx=0x7f92c000c5b0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 shutdown_connections 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f92b8080130 0x7f92b80825e0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92cc068490 0x7f92cc1982f0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 --2- 192.168.123.104:0/3514411855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f92cc1013a0 0x7f92cc198830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 >> 192.168.123.104:0/3514411855 conn(0x7f92cc0754a0 msgr2=0x7f92cc0fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:45.176 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 shutdown_connections 2026-03-10T06:28:45.177 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.175+0000 7f92d4304700 1 -- 192.168.123.104:0/3514411855 wait complete. 2026-03-10T06:28:45.178 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 34 2026-03-10T06:28:45.239 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 2026-03-10T06:28:45.242 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 2026-03-10T06:28:45.404 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.669+0000 7f72f22c5700 1 -- 192.168.123.104:0/1965446652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 msgr2=0x7f72ec102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.669+0000 7f72f22c5700 1 --2- 192.168.123.104:0/1965446652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec102bf0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f72dc009b00 tx=0x7f72dc009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 -- 
192.168.123.104:0/1965446652 shutdown_connections 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 --2- 192.168.123.104:0/1965446652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec102bf0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 --2- 192.168.123.104:0/1965446652 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 0x7f72ec108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 -- 192.168.123.104:0/1965446652 >> 192.168.123.104:0/1965446652 conn(0x7f72ec0fe280 msgr2=0x7f72ec100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 -- 192.168.123.104:0/1965446652 shutdown_connections 2026-03-10T06:28:45.671 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 -- 192.168.123.104:0/1965446652 wait complete. 
2026-03-10T06:28:45.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.670+0000 7f72f22c5700 1 Processor -- start 2026-03-10T06:28:45.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72f22c5700 1 -- start start 2026-03-10T06:28:45.672 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72f22c5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72f22c5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 0x7f72ec1989a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72f22c5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72ec199080 con 0x7f72ec102780 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72f22c5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72ec19ce10 con 0x7f72ec108780 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72ebfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72ebfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:35750/0 (socket says 192.168.123.104:35750) 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72ebfff700 1 -- 192.168.123.104:0/494777448 learned_addr learned my addr 192.168.123.104:0/494777448 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.671+0000 7f72eb7fe700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 0x7f72ec1989a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72ebfff700 1 -- 192.168.123.104:0/494777448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 msgr2=0x7f72ec1989a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72ebfff700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 0x7f72ec1989a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72ebfff700 1 -- 192.168.123.104:0/494777448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72dc0097e0 con 0x7f72ec102780 2026-03-10T06:28:45.673 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72ebfff700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f72e000b700 tx=0x7f72e000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:28:45.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72e0010820 con 0x7f72ec102780 2026-03-10T06:28:45.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f72e0010e60 con 0x7f72ec102780 2026-03-10T06:28:45.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72e0017570 con 0x7f72ec102780 2026-03-10T06:28:45.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72ec19d0f0 con 0x7f72ec102780 2026-03-10T06:28:45.674 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.672+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72ec19d6c0 con 0x7f72ec102780 2026-03-10T06:28:45.675 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.674+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f72e00176d0 con 0x7f72ec102780 2026-03-10T06:28:45.675 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.674+0000 7f72e97fa700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 0x7f72d4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:45.676 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.674+0000 7f72eb7fe700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 0x7f72d4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:45.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.675+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f72e0099570 con 0x7f72ec102780 2026-03-10T06:28:45.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.675+0000 7f72eb7fe700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 0x7f72d4079d20 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f72dc00b5c0 tx=0x7f72dc005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:45.676 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.675+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f72ec04ea50 con 0x7f72ec102780 2026-03-10T06:28:45.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.678+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f72e0061c70 con 0x7f72ec102780 2026-03-10T06:28:45.691 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:45 vm04.local ceph-mon[115743]: pgmap v181: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:45.825 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.823+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f72ec066e40 con 0x7f72ec102780 2026-03-10T06:28:45.826 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.824+0000 7f72e97fa700 1 -- 192.168.123.104:0/494777448 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 34 v34) v1 ==== 94+0+5270 (secure 0 0 0) 0x7f72e0061c70 con 0x7f72ec102780 2026-03-10T06:28:45.826 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:45.826 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":34,"btime":"2026-03-10T06:27:38:067931+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file 
layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44249,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3728010036","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3728010036},{"type":"v1","addr":"192.168.123.106:6827","nonce":3728010036}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":34,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:37.147748+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34286":{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34266,"qdb_cluster":[34266]},"id":1}]} 2026-03-10T06:28:45.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 msgr2=0x7f72d4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 0x7f72d4079d20 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f72dc00b5c0 tx=0x7f72dc005fd0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 msgr2=0x7f72ec198460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:45.828 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f72e000b700 tx=0x7f72e000bac0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 shutdown_connections 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f72d4077870 0x7f72d4079d20 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f72ec102780 0x7f72ec198460 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 --2- 192.168.123.104:0/494777448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72ec108780 0x7f72ec1989a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 >> 192.168.123.104:0/494777448 conn(0x7f72ec0fe280 msgr2=0x7f72ec0ffb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.827+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 shutdown_connections 2026-03-10T06:28:45.829 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:45.828+0000 7f72f22c5700 1 -- 192.168.123.104:0/494777448 wait complete. 2026-03-10T06:28:45.830 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 34 2026-03-10T06:28:45.873 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 50} 2026-03-10T06:28:45.873 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 10 2026-03-10T06:28:46.022 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:46.050 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:45 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/3514411855' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:28:46.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:45 vm06.local ceph-mon[98962]: pgmap v181: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:46.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:45 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3514411855' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.289+0000 7f9dcde7d700 1 -- 192.168.123.104:0/916723614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc8073a00 msgr2=0x7f9dc8111020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.289+0000 7f9dcde7d700 1 --2- 192.168.123.104:0/916723614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc8073a00 0x7f9dc8111020 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f9db0009b00 tx=0x7f9db0009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 -- 192.168.123.104:0/916723614 shutdown_connections 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 --2- 192.168.123.104:0/916723614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc8073a00 0x7f9dc8111020 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 --2- 192.168.123.104:0/916723614 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc80730f0 0x7f9dc80734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.291 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 -- 192.168.123.104:0/916723614 >> 192.168.123.104:0/916723614 conn(0x7f9dc80fc050 msgr2=0x7f9dc80fe460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:46.291 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 -- 192.168.123.104:0/916723614 shutdown_connections 2026-03-10T06:28:46.292 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 -- 192.168.123.104:0/916723614 wait complete. 2026-03-10T06:28:46.292 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.290+0000 7f9dcde7d700 1 Processor -- start 2026-03-10T06:28:46.292 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dcde7d700 1 -- start start 2026-03-10T06:28:46.292 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dcde7d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc80730f0 0x7f9dc819c700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dcde7d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 0x7f9dc819cc40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dcde7d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc819d2d0 con 0x7f9dc8073a00 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dcde7d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc81a0000 con 0x7f9dc80730f0 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 0x7f9dc819cc40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 0x7f9dc819cc40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35760/0 (socket says 192.168.123.104:35760) 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 -- 192.168.123.104:0/4074627075 learned_addr learned my addr 192.168.123.104:0/4074627075 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 -- 192.168.123.104:0/4074627075 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc80730f0 msgr2=0x7f9dc819c700 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc80730f0 0x7f9dc819c700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 -- 192.168.123.104:0/4074627075 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9db8009710 con 0x7f9dc8073a00 2026-03-10T06:28:46.293 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.291+0000 7f9dc6ffd700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f9dc8073a00 0x7f9dc819cc40 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f9db0005b40 tx=0x7f9db000bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:46.294 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.292+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db001d070 con 0x7f9dc8073a00 2026-03-10T06:28:46.294 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.292+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9db000f460 con 0x7f9dc8073a00 2026-03-10T06:28:46.294 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.292+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db0021620 con 0x7f9dc8073a00 2026-03-10T06:28:46.294 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.292+0000 7f9dcde7d700 1 -- 192.168.123.104:0/4074627075 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9db00097e0 con 0x7f9dc8073a00 2026-03-10T06:28:46.294 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.292+0000 7f9dcde7d700 1 -- 192.168.123.104:0/4074627075 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9dc81a0620 con 0x7f9dc8073a00 2026-03-10T06:28:46.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.293+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9dac0052f0 con 0x7f9dc8073a00 2026-03-10T06:28:46.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.294+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9db000f5d0 con 0x7f9dc8073a00 2026-03-10T06:28:46.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.295+0000 7f9dc4ff9700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 0x7f9db4079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.297 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.295+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f9db009aac0 con 0x7f9dc8073a00 2026-03-10T06:28:46.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.297+0000 7f9dc77fe700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 0x7f9db4079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:46.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.298+0000 7f9dc77fe700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 0x7f9db4079b70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f9db8005d90 tx=0x7f9db8005ce0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:46.299 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.298+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9db009ce60 con 0x7f9dc8073a00 2026-03-10T06:28:46.442 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.440+0000 7f9dbe7fc700 1 -- 
192.168.123.104:0/4074627075 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 10, "format": "json"} v 0) v1 -- 0x7f9dac0059c0 con 0x7f9dc8073a00 2026-03-10T06:28:46.443 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.441+0000 7f9dc4ff9700 1 -- 192.168.123.104:0/4074627075 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 10, "format": "json"}]=0 dumped fsmap epoch 10 v34) v1 ==== 107+0+4920 (secure 0 0 0) 0x7f9db00631c0 con 0x7f9dc8073a00 2026-03-10T06:28:46.443 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:46.443 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":10,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":6}],"filesystems":[{"mdsmap":{"epoch":9,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:19:55.449951+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14508},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14508":{"gid":14508,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":3,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6827/2274683007","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2274683007},{"type":"v1","addr":"192.168.123.104:6827","nonce":2274683007}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24299":{"gid":24299,"name":"cephfs.vm06.wzhqon","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.106:6825/3071631026","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":3071631026},{"type":"v1","addr":"192.168.123.106:6825","nonce":3071631026}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:46.445 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 msgr2=0x7f9db4079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 0x7f9db4079b70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f9db8005d90 tx=0x7f9db8005ce0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 msgr2=0x7f9dc819cc40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 0x7f9dc819cc40 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f9db0005b40 tx=0x7f9db000bfd0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 shutdown_connections 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9db40776c0 0x7f9db4079b70 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc80730f0 0x7f9dc819c700 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 --2- 192.168.123.104:0/4074627075 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9dc8073a00 0x7f9dc819cc40 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 >> 192.168.123.104:0/4074627075 conn(0x7f9dc80fc050 msgr2=0x7f9dc8102b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 shutdown_connections 2026-03-10T06:28:46.446 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.444+0000 7f9dbe7fc700 1 -- 192.168.123.104:0/4074627075 wait complete. 2026-03-10T06:28:46.447 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 10 2026-03-10T06:28:46.495 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 11 2026-03-10T06:28:46.647 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:46.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.905+0000 7f1f3bb24700 1 -- 192.168.123.104:0/1938238928 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34100540 msgr2=0x7f1f341009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:46.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.905+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/1938238928 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34100540 0x7f1f341009b0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f1f30009b50 tx=0x7f1f30009e60 comp rx=0 tx=0).stop 
2026-03-10T06:28:46.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 -- 192.168.123.104:0/1938238928 shutdown_connections 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/1938238928 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34100540 0x7f1f341009b0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/1938238928 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1f34106560 0x7f1f34106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 -- 192.168.123.104:0/1938238928 >> 192.168.123.104:0/1938238928 conn(0x7f1f340fbfc0 msgr2=0x7f1f340fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 -- 192.168.123.104:0/1938238928 shutdown_connections 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 -- 192.168.123.104:0/1938238928 wait complete. 
2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 Processor -- start 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.907+0000 7f1f3bb24700 1 -- start start 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f3bb24700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1f34100540 0x7f1f34198cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f3bb24700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f3bb24700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f34194230 con 0x7f1f34106560 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f3bb24700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f341943a0 con 0x7f1f34100540 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:35776/0 (socket says 192.168.123.104:35776) 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 -- 192.168.123.104:0/2466951649 learned_addr learned my addr 192.168.123.104:0/2466951649 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 -- 192.168.123.104:0/2466951649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1f34100540 msgr2=0x7f1f34198cf0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:28:46.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1f34100540 0x7f1f34198cf0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:46.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 -- 192.168.123.104:0/2466951649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1f28009710 con 0x7f1f34106560 2026-03-10T06:28:46.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.908+0000 7f1f390bf700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f1f30005950 tx=0x7f1f30004ef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:46.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.909+0000 7f1f26ffd700 1 -- 192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f3001d070 con 0x7f1f34106560 2026-03-10T06:28:46.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.909+0000 7f1f26ffd700 1 -- 
192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1f3000bbb0 con 0x7f1f34106560 2026-03-10T06:28:46.911 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.909+0000 7f1f26ffd700 1 -- 192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f3000f700 con 0x7f1f34106560 2026-03-10T06:28:46.911 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.909+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1f300097e0 con 0x7f1f34106560 2026-03-10T06:28:46.911 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.909+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1f34194900 con 0x7f1f34106560 2026-03-10T06:28:46.912 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.910+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1f3404ea50 con 0x7f1f34106560 2026-03-10T06:28:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.913+0000 7f1f26ffd700 1 -- 192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1f3000bd20 con 0x7f1f34106560 2026-03-10T06:28:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.913+0000 7f1f26ffd700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 0x7f1f20079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.913+0000 7f1f26ffd700 1 -- 
192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f1f3009bc40 con 0x7f1f34106560 2026-03-10T06:28:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.914+0000 7f1f398c0700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 0x7f1f20079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:46.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.914+0000 7f1f398c0700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 0x7f1f20079c10 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f1f28009ee0 tx=0x7f1f28009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:46.916 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:46.915+0000 7f1f26ffd700 1 -- 192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1f30064500 con 0x7f1f34106560 2026-03-10T06:28:47.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:46 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/494777448' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:28:47.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:46 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/4074627075' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-10T06:28:47.057 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.055+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7f1f34066e40 con 0x7f1f34106560 2026-03-10T06:28:47.060 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.059+0000 7f1f26ffd700 1 -- 192.168.123.104:0/2466951649 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v34) v1 ==== 107+0+4920 (secure 0 0 0) 0x7f1f30063c50 con 0x7f1f34106560 2026-03-10T06:28:47.060 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:47.061 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":9,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:19:55.449951+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14508},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14508":{"gid":14508,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":3,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6827/2274683007","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2274683007},{"type":"v1","addr":"192.168.123.104:6827","nonce":2274683007}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24299":{"gid":24299,"name":"cephfs.vm06.wzhqon","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.106:6825/3071631026","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":3071631026},{"type":"v1","addr":"192.168.123.106:6825","nonce":3071631026}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:47.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 
192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 msgr2=0x7f1f20079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:47.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 0x7f1f20079c10 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f1f28009ee0 tx=0x7f1f28009450 comp rx=0 tx=0).stop 2026-03-10T06:28:47.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 msgr2=0x7f1f34193cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:47.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f1f30005950 tx=0x7f1f30004ef0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 shutdown_connections 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1f20077760 0x7f1f20079c10 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1f34100540 0x7f1f34198cf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 --2- 192.168.123.104:0/2466951649 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1f34106560 0x7f1f34193cf0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 >> 192.168.123.104:0/2466951649 conn(0x7f1f340fbfc0 msgr2=0x7f1f340fd8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 shutdown_connections 2026-03-10T06:28:47.064 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.062+0000 7f1f3bb24700 1 -- 192.168.123.104:0/2466951649 wait complete. 2026-03-10T06:28:47.065 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 11 2026-03-10T06:28:47.108 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 12 2026-03-10T06:28:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:46 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/494777448' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T06:28:47.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:46 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/4074627075' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-10T06:28:47.269 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.528+0000 7f7d41221700 1 -- 192.168.123.104:0/410608133 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 msgr2=0x7f7d3c106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.528+0000 7f7d41221700 1 --2- 192.168.123.104:0/410608133 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c106930 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f7d2c009b50 tx=0x7f7d2c009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 -- 192.168.123.104:0/410608133 shutdown_connections 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 --2- 192.168.123.104:0/410608133 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d3c100540 0x7f7d3c1009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 --2- 192.168.123.104:0/410608133 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c106930 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 -- 192.168.123.104:0/410608133 >> 192.168.123.104:0/410608133 conn(0x7f7d3c0fbfc0 msgr2=0x7f7d3c0fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:47.530 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 -- 192.168.123.104:0/410608133 shutdown_connections 2026-03-10T06:28:47.530 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 -- 192.168.123.104:0/410608133 wait complete. 2026-03-10T06:28:47.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 Processor -- start 2026-03-10T06:28:47.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.529+0000 7f7d41221700 1 -- start start 2026-03-10T06:28:47.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d41221700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d3c100540 0x7f7d3c1983a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:47.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d41221700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:47.531 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d41221700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d3c198fc0 con 0x7f7d3c106560 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d41221700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d3c19cd50 con 0x7f7d3c100540 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:47.532 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35800/0 (socket says 192.168.123.104:35800) 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 -- 192.168.123.104:0/209882439 learned_addr learned my addr 192.168.123.104:0/209882439 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 -- 192.168.123.104:0/209882439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d3c100540 msgr2=0x7f7d3c1983a0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d3c100540 0x7f7d3c1983a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 -- 192.168.123.104:0/209882439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d30009710 con 0x7f7d3c106560 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.530+0000 7f7d3a59c700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f7d30009f80 tx=0x7f7d3000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:47.532 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.531+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d30017de0 con 0x7f7d3c106560 2026-03-10T06:28:47.532 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.531+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d2c0097e0 con 0x7f7d3c106560 2026-03-10T06:28:47.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.531+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d3c19d330 con 0x7f7d3c106560 2026-03-10T06:28:47.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.531+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7d30004500 con 0x7f7d3c106560 2026-03-10T06:28:47.533 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.531+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d30005020 con 0x7f7d3c106560 2026-03-10T06:28:47.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.532+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7d3000cbc0 con 0x7f7d3c106560 2026-03-10T06:28:47.534 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.533+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7d28005320 con 0x7f7d3c106560 2026-03-10T06:28:47.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.535+0000 7f7d23fff700 1 --2- 
192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 0x7f7d24079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:47.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.535+0000 7f7d3ad9d700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 0x7f7d24079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:47.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.535+0000 7f7d3ad9d700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 0x7f7d24079e40 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7d2c00b5c0 tx=0x7f7d2c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:47.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.535+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f7d300a2580 con 0x7f7d3c106560 2026-03-10T06:28:47.537 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.536+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7d3006ad80 con 0x7f7d3c106560 2026-03-10T06:28:47.679 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.678+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7f7d280059f0 con 0x7f7d3c106560 2026-03-10T06:28:47.681 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.679+0000 7f7d23fff700 1 -- 192.168.123.104:0/209882439 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v34) v1 ==== 107+0+4141 (secure 0 0 0) 0x7f7d3006a4d0 con 0x7f7d3c106560 2026-03-10T06:28:47.681 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:47.681 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":12,"btime":"2026-03-10T06:26:46:702932+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:46.702931+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14508},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14508":{"gid":14508,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":3,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6827/2274683007","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2274683007},{"type":"v1","addr":"192.168.123.104:6827","nonce":2274683007}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14508,"qdb_cluster":[14508]},"id":1}]} 2026-03-10T06:28:47.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 msgr2=0x7f7d24079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:47.683 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 0x7f7d24079e40 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7d2c00b5c0 tx=0x7f7d2c005fb0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 msgr2=0x7f7d3c1988e0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f7d30009f80 tx=0x7f7d3000c5b0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 shutdown_connections 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7d24077990 0x7f7d24079e40 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d3c100540 0x7f7d3c1983a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 --2- 192.168.123.104:0/209882439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d3c106560 0x7f7d3c1988e0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 >> 192.168.123.104:0/209882439 conn(0x7f7d3c0fbfc0 msgr2=0x7f7d3c0fd820 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:47.684 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 shutdown_connections 2026-03-10T06:28:47.684 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:47.682+0000 7f7d41221700 1 -- 192.168.123.104:0/209882439 wait complete. 2026-03-10T06:28:47.685 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 12 2026-03-10T06:28:47.737 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 12 2026-03-10T06:28:47.737 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 13 2026-03-10T06:28:47.896 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:47.938 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:47 vm04.local ceph-mon[115743]: pgmap v182: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:47.938 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:47 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2466951649' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T06:28:47.938 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:47 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/209882439' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T06:28:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:47 vm06.local ceph-mon[98962]: pgmap v182: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:47 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2466951649' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T06:28:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:47 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/209882439' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 -- 192.168.123.104:0/3760953034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e8068490 msgr2=0x7f50e8068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 --2- 192.168.123.104:0/3760953034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e8068490 0x7f50e8068900 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f50e4009b00 tx=0x7f50e4009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 -- 192.168.123.104:0/3760953034 shutdown_connections 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 --2- 192.168.123.104:0/3760953034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e8068490 0x7f50e8068900 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 --2- 192.168.123.104:0/3760953034 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e81066c0 0x7f50e8106a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 -- 192.168.123.104:0/3760953034 >> 192.168.123.104:0/3760953034 conn(0x7f50e80754a0 msgr2=0x7f50e80758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.160+0000 7f50ef50f700 1 -- 192.168.123.104:0/3760953034 shutdown_connections 
2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.161+0000 7f50ef50f700 1 -- 192.168.123.104:0/3760953034 wait complete. 2026-03-10T06:28:48.162 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.161+0000 7f50ef50f700 1 Processor -- start 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.161+0000 7f50ef50f700 1 -- start start 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.161+0000 7f50ef50f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 0x7f50e8196180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.161+0000 7f50ef50f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 0x7f50e81966c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ef50f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50e8196da0 con 0x7f50e81066c0 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ef50f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50e819ab30 con 0x7f50e8068490 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ed2ab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 0x7f50e8196180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ed2ab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 
0x7f50e8196180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52692/0 (socket says 192.168.123.104:52692) 2026-03-10T06:28:48.163 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ed2ab700 1 -- 192.168.123.104:0/1560620400 learned_addr learned my addr 192.168.123.104:0/1560620400 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ecaaa700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 0x7f50e81966c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ecaaa700 1 -- 192.168.123.104:0/1560620400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 msgr2=0x7f50e8196180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ecaaa700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 0x7f50e8196180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ecaaa700 1 -- 192.168.123.104:0/1560620400 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50e40097e0 con 0x7f50e81066c0 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.162+0000 7f50ecaaa700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 0x7f50e81966c0 secure :-1 s=READY 
pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f50e4009b00 tx=0x7f50e4004900 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:48.164 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.163+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50e401d070 con 0x7f50e81066c0 2026-03-10T06:28:48.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.163+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f50e4022470 con 0x7f50e81066c0 2026-03-10T06:28:48.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.163+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f50e819adb0 con 0x7f50e81066c0 2026-03-10T06:28:48.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.163+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f50e819b2a0 con 0x7f50e81066c0 2026-03-10T06:28:48.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.164+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50e400bbf0 con 0x7f50e81066c0 2026-03-10T06:28:48.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.165+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f50e400bd50 con 0x7f50e81066c0 2026-03-10T06:28:48.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.166+0000 7f50de7fc700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 
0x7f50d4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.166+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f50e804ea50 con 0x7f50e81066c0 2026-03-10T06:28:48.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.166+0000 7f50ed2ab700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 0x7f50d4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:48.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.166+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f50e409b300 con 0x7f50e81066c0 2026-03-10T06:28:48.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.166+0000 7f50ed2ab700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 0x7f50d4079d70 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f50d8005950 tx=0x7f50d800a400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:48.170 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.169+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f50e406a020 con 0x7f50e81066c0 2026-03-10T06:28:48.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.313+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7f50e81975a0 con 0x7f50e81066c0 2026-03-10T06:28:48.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.316+0000 7f50de7fc700 1 -- 192.168.123.104:0/1560620400 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v34) v1 ==== 107+0+4123 (secure 0 0 0) 0x7f50e4027090 con 0x7f50e81066c0 2026-03-10T06:28:48.318 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:48.318 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":13,"btime":"2026-03-10T06:26:47:650593+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:47.650592+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.320+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 msgr2=0x7f50d4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.320+0000 7f50ef50f700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 0x7f50d4079d70 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f50d8005950 tx=0x7f50d800a400 comp rx=0 tx=0).stop 2026-03-10T06:28:48.322 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.320+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 msgr2=0x7f50e81966c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.320+0000 7f50ef50f700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 0x7f50e81966c0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f50e4009b00 tx=0x7f50e4004900 comp rx=0 tx=0).stop 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 shutdown_connections 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f50d40778c0 0x7f50d4079d70 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f50e8068490 0x7f50e8196180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 --2- 192.168.123.104:0/1560620400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f50e81066c0 0x7f50e81966c0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 >> 192.168.123.104:0/1560620400 conn(0x7f50e80754a0 msgr2=0x7f50e80feb00 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:28:48.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 shutdown_connections 2026-03-10T06:28:48.323 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.321+0000 7f50ef50f700 1 -- 192.168.123.104:0/1560620400 wait complete. 2026-03-10T06:28:48.323 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 13 2026-03-10T06:28:48.391 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 13 2026-03-10T06:28:48.391 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 14 2026-03-10T06:28:48.546 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:48.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.807+0000 7fbad4b74700 1 -- 192.168.123.104:0/1709126835 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0106560 msgr2=0x7fbad0106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.807+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1709126835 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0106560 0x7fbad0106930 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fbab8009b00 tx=0x7fbab8009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 -- 192.168.123.104:0/1709126835 shutdown_connections 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1709126835 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbad0100540 0x7fbad01009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1709126835 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0106560 0x7fbad0106930 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 -- 192.168.123.104:0/1709126835 >> 192.168.123.104:0/1709126835 conn(0x7fbad00fbfc0 msgr2=0x7fbad00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 -- 192.168.123.104:0/1709126835 shutdown_connections 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.808+0000 7fbad4b74700 1 -- 192.168.123.104:0/1709126835 wait complete. 2026-03-10T06:28:48.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 Processor -- start 2026-03-10T06:28:48.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 -- start start 2026-03-10T06:28:48.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbad0106560 0x7fbad0194430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbad0194b10 con 0x7fbad0100540 2026-03-10T06:28:48.811 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.809+0000 7fbad4b74700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbad01988a0 con 0x7fbad0106560 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35844/0 (socket says 192.168.123.104:35844) 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 -- 192.168.123.104:0/1118181533 learned_addr learned my addr 192.168.123.104:0/1118181533 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 -- 192.168.123.104:0/1118181533 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbad0106560 msgr2=0x7fbad0194430 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbad0106560 0x7fbad0194430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 -- 192.168.123.104:0/1118181533 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbab80097e0 con 0x7fbad0100540 2026-03-10T06:28:48.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.810+0000 7fbace59c700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fbab8005b40 tx=0x7fbab800bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:48.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.811+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbab801d070 con 0x7fbad0100540 2026-03-10T06:28:48.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.811+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbab800f460 con 0x7fbad0100540 2026-03-10T06:28:48.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.811+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbab80216b0 con 0x7fbad0100540 2026-03-10T06:28:48.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.811+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbad0198b20 con 0x7fbad0100540 2026-03-10T06:28:48.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.811+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbad0199010 con 0x7fbad0100540 2026-03-10T06:28:48.814 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.812+0000 7fbad4b74700 1 -- 
192.168.123.104:0/1118181533 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbad004ea50 con 0x7fbad0100540 2026-03-10T06:28:48.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.817+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbab800f5d0 con 0x7fbad0100540 2026-03-10T06:28:48.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.818+0000 7fbac77fe700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbabc0778d0 0x7fbabc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:48.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.818+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fbab809b190 con 0x7fbad0100540 2026-03-10T06:28:48.819 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.818+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbab809b660 con 0x7fbad0100540 2026-03-10T06:28:48.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.818+0000 7fbacdd9b700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbabc0778d0 0x7fbabc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:48.820 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.819+0000 7fbacdd9b700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] 
conn(0x7fbabc0778d0 0x7fbabc079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbac0005950 tx=0x7fbac00058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:48.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:48 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1560620400' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T06:28:48.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.959+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7fbad0066e40 con 0x7fbad0100540 2026-03-10T06:28:48.962 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.960+0000 7fbac77fe700 1 -- 192.168.123.104:0/1118181533 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v34) v1 ==== 107+0+4134 (secure 0 0 0) 0x7fbab8063a50 con 0x7fbad0100540 2026-03-10T06:28:48.962 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:48.962 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":14,"btime":"2026-03-10T06:26:47:668998+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:47.668991+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14518},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14518":{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":14,"state":"up:replay","state_seq":2,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.963+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbabc0778d0 msgr2=0x7fbabc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.963+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbabc0778d0 0x7fbabc079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbac0005950 tx=0x7fbac00058e0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.963+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 msgr2=0x7fbad0193ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.963+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fbab8005b40 tx=0x7fbab800bab0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 shutdown_connections 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fbabc0778d0 0x7fbabc079d80 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbad0100540 0x7fbad0193ef0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 --2- 192.168.123.104:0/1118181533 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbad0106560 0x7fbad0194430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 >> 192.168.123.104:0/1118181533 conn(0x7fbad00fbfc0 msgr2=0x7fbad00fd8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 shutdown_connections 2026-03-10T06:28:48.966 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:48.964+0000 7fbad4b74700 1 -- 192.168.123.104:0/1118181533 wait complete. 2026-03-10T06:28:48.966 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 14 2026-03-10T06:28:49.010 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 14 2026-03-10T06:28:49.010 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 15 2026-03-10T06:28:49.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:48 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1560620400' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T06:28:49.157 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:49.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 -- 192.168.123.104:0/248318457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0100540 msgr2=0x7f58a01009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:49.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 --2- 192.168.123.104:0/248318457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0100540 0x7f58a01009b0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f5890009b00 tx=0x7f5890009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:49.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 -- 192.168.123.104:0/248318457 shutdown_connections 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 --2- 192.168.123.104:0/248318457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0100540 0x7f58a01009b0 secure :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f5890009b00 tx=0x7f5890009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 --2- 192.168.123.104:0/248318457 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0106560 0x7f58a0106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.404+0000 7f58a682c700 1 -- 192.168.123.104:0/248318457 >> 192.168.123.104:0/248318457 conn(0x7f58a00fbfc0 msgr2=0x7f58a00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 -- 192.168.123.104:0/248318457 shutdown_connections 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 -- 192.168.123.104:0/248318457 wait complete. 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 Processor -- start 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 -- start start 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 0x7f58a0198ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58a0199a20 con 0x7f58a0106560 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.405+0000 7f58a682c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58a019f1a0 con 0x7f58a0199410 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 0x7f58a0198ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:49.407 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52732/0 (socket says 192.168.123.104:52732) 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 -- 192.168.123.104:0/279138513 learned_addr learned my addr 192.168.123.104:0/279138513 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:49.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 -- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 msgr2=0x7f58a0198ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 0x7f58a0198ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 -- 192.168.123.104:0/279138513 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58900097e0 con 0x7f58a0199410 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589ffff700 1 --2- 
192.168.123.104:0/279138513 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 0x7f58a0198ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.406+0000 7f589f7fe700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f5890005370 tx=0x7f5890004900 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.407+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f589001d070 con 0x7f58a0199410 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.407+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5890022470 con 0x7f58a0199410 2026-03-10T06:28:49.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.407+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f589000f650 con 0x7f58a0199410 2026-03-10T06:28:49.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.407+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58a019f3a0 con 0x7f58a0199410 2026-03-10T06:28:49.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.407+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58a019f7f0 con 0x7f58a0199410 2026-03-10T06:28:49.410 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.408+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f58900225e0 con 0x7f58a0199410 2026-03-10T06:28:49.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.409+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f58a004ea50 con 0x7f58a0199410 2026-03-10T06:28:49.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.409+0000 7f589d7fa700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 0x7f588c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:49.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.409+0000 7f589ffff700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 0x7f588c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:49.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.410+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f589009b2d0 con 0x7f58a0199410 2026-03-10T06:28:49.411 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.410+0000 7f589ffff700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 0x7f588c079d20 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f5888009fd0 tx=0x7f5888009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:49.415 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.413+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f58900639d0 con 0x7f58a0199410 2026-03-10T06:28:49.558 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.556+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f58a0195570 con 0x7f58a0199410 2026-03-10T06:28:49.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.559+0000 7f589d7fa700 1 -- 192.168.123.104:0/279138513 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v34) v1 ==== 107+0+4139 (secure 0 0 0) 0x7f5890063120 con 0x7f58a0199410 2026-03-10T06:28:49.560 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:49.560 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":15,"btime":"2026-03-10T06:26:54:085878+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:53.120552+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14518},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14518":{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":14,"state":"up:reconnect","state_seq":107,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:49.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.561+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 msgr2=0x7f588c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:49.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.561+0000 7f58a682c700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 0x7f588c079d20 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f5888009fd0 tx=0x7f5888009380 comp rx=0 tx=0).stop 2026-03-10T06:28:49.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 msgr2=0x7f58a019ec60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:49.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f5890005370 tx=0x7f5890004900 comp rx=0 tx=0).stop 2026-03-10T06:28:49.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 shutdown_connections 2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f588c077870 0x7f588c079d20 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58a0106560 0x7f58a0198ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 --2- 192.168.123.104:0/279138513 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58a0199410 0x7f58a019ec60 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 >> 192.168.123.104:0/279138513 conn(0x7f58a00fbfc0 msgr2=0x7f58a00fd810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 shutdown_connections 2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:49.562+0000 7f58a682c700 1 -- 192.168.123.104:0/279138513 wait complete. 
2026-03-10T06:28:49.564 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 15 2026-03-10T06:28:49.638 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 15 2026-03-10T06:28:49.638 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 16 2026-03-10T06:28:49.788 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:49.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:49 vm04.local ceph-mon[115743]: pgmap v183: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:49.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:49 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1118181533' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T06:28:49.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:49.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:49 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/279138513' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T06:28:50.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 -- 192.168.123.104:0/127793456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec1013a0 msgr2=0x7feaec101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.052 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 --2- 192.168.123.104:0/127793456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec1013a0 0x7feaec101770 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7feadc009b50 tx=0x7feadc009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 -- 192.168.123.104:0/127793456 shutdown_connections 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 --2- 192.168.123.104:0/127793456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec068490 0x7feaec068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 --2- 192.168.123.104:0/127793456 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec1013a0 0x7feaec101770 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 -- 192.168.123.104:0/127793456 >> 192.168.123.104:0/127793456 conn(0x7feaec0754a0 msgr2=0x7feaec0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.051+0000 7feaf2fa8700 1 -- 192.168.123.104:0/127793456 shutdown_connections 2026-03-10T06:28:50.053 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 -- 192.168.123.104:0/127793456 wait complete. 2026-03-10T06:28:50.053 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 Processor -- start 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 -- start start 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 0x7feaec1021e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaec106370 con 0x7feaec068490 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaf2fa8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feaec102c60 con 0x7feaec1013a0 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.052+0000 7feaebfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52758/0 (socket says 192.168.123.104:52758) 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 -- 192.168.123.104:0/1540326017 learned_addr learned my addr 192.168.123.104:0/1540326017 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 -- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 msgr2=0x7feaec1021e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaf0d44700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 0x7feaec1021e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 0x7feaec1021e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 -- 192.168.123.104:0/1540326017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feadc0097e0 con 0x7feaec1013a0 2026-03-10T06:28:50.054 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaf0d44700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 0x7feaec1021e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feaebfff700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7feae000eb10 tx=0x7feae000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.053+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feae000cca0 con 0x7feaec1013a0 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.054+0000 7feaf2fa8700 1 -- 192.168.123.104:0/1540326017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feaec102ec0 con 0x7feaec1013a0 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.054+0000 7feaf2fa8700 1 -- 192.168.123.104:0/1540326017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feaec0719d0 con 0x7feaec1013a0 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.054+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feae000ce00 con 0x7feaec1013a0 2026-03-10T06:28:50.055 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.054+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feae00189c0 con 0x7feaec1013a0 2026-03-10T06:28:50.057 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.055+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 
40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feae0018b20 con 0x7feaec1013a0 2026-03-10T06:28:50.057 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.056+0000 7feae9ffb700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 0x7fead4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.057 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.056+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fead8005320 con 0x7feaec1013a0 2026-03-10T06:28:50.057 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.056+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7feae0014070 con 0x7feaec1013a0 2026-03-10T06:28:50.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.056+0000 7feaf0d44700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 0x7fead4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.058 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.057+0000 7feaf0d44700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 0x7fead4079d70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7feadc00b5c0 tx=0x7feadc0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:50.060 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.059+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feae00628e0 con 0x7feaec1013a0 2026-03-10T06:28:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:49 vm06.local ceph-mon[98962]: pgmap v183: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:49 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1118181533' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T06:28:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:28:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:49 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/279138513' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T06:28:50.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.202+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7fead8005190 con 0x7feaec1013a0 2026-03-10T06:28:50.204 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.203+0000 7feae9ffb700 1 -- 192.168.123.104:0/1540326017 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v34) v1 ==== 107+0+4136 (secure 0 0 0) 0x7feae0062030 con 0x7feaec1013a0 2026-03-10T06:28:50.205 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:50.205 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":16,"btime":"2026-03-10T06:26:55:090469+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:54.095986+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14518},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14518":{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":14,"state":"up:rejoin","state_seq":108,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:50.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.206+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 msgr2=0x7fead4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.206+0000 7fead37fe700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 0x7fead4079d70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7feadc00b5c0 tx=0x7feadc0058e0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.206+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 msgr2=0x7feaec102720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.206+0000 7fead37fe700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7feae000eb10 tx=0x7feae000eed0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 shutdown_connections 2026-03-10T06:28:50.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fead40778c0 0x7fead4079d70 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feaec068490 0x7feaec1021e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 --2- 192.168.123.104:0/1540326017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feaec1013a0 0x7feaec102720 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 >> 192.168.123.104:0/1540326017 conn(0x7feaec0754a0 msgr2=0x7feaec0fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 shutdown_connections 2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.207+0000 7fead37fe700 1 -- 192.168.123.104:0/1540326017 wait complete. 
2026-03-10T06:28:50.210 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 16 2026-03-10T06:28:50.278 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 16 2026-03-10T06:28:50.278 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 17 2026-03-10T06:28:50.435 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:50.687 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.685+0000 7fae7d028700 1 -- 192.168.123.104:0/245875448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae78073a00 msgr2=0x7fae78110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.685+0000 7fae7d028700 1 --2- 192.168.123.104:0/245875448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae78073a00 0x7fae78110ff0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fae68009b00 tx=0x7fae68009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 -- 192.168.123.104:0/245875448 shutdown_connections 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 --2- 192.168.123.104:0/245875448 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae78073a00 0x7fae78110ff0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 --2- 192.168.123.104:0/245875448 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae780730f0 0x7fae780734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.688 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 -- 192.168.123.104:0/245875448 >> 192.168.123.104:0/245875448 conn(0x7fae780fc000 msgr2=0x7fae780fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 -- 192.168.123.104:0/245875448 shutdown_connections 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.686+0000 7fae7d028700 1 -- 192.168.123.104:0/245875448 wait complete. 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 Processor -- start 2026-03-10T06:28:50.688 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 -- start start 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 0x7fae781a2ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae781a3140 con 0x7fae780730f0 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.687+0000 7fae7d028700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae7819c5f0 con 0x7fae78073a00 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae7659c700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 0x7fae781a2ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae76d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae76d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35892/0 (socket says 192.168.123.104:35892) 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae76d9d700 1 -- 192.168.123.104:0/1132612145 learned_addr learned my addr 192.168.123.104:0/1132612145 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae7659c700 1 -- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 msgr2=0x7fae781a2570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae7659c700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.689 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae7659c700 1 -- 192.168.123.104:0/1132612145 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae680097e0 con 0x7fae78073a00 2026-03-10T06:28:50.690 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae76d9d700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T06:28:50.690 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.688+0000 7fae7659c700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 0x7fae781a2ab0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fae68000c00 tx=0x7fae680048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:50.690 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.689+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae6801d070 con 0x7fae78073a00 2026-03-10T06:28:50.690 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.689+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae7819c8d0 con 0x7fae78073a00 2026-03-10T06:28:50.691 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.689+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae7819ce20 con 0x7fae78073a00 2026-03-10T06:28:50.691 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.689+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fae68022470 con 0x7fae78073a00 2026-03-10T06:28:50.691 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.689+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae6800f670 con 0x7fae78073a00 2026-03-10T06:28:50.692 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.690+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fae6800f7d0 con 0x7fae78073a00 2026-03-10T06:28:50.692 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.690+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae7810e770 con 0x7fae78073a00 2026-03-10T06:28:50.692 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.691+0000 7fae6ffff700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 0x7fae64079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:50.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.691+0000 7fae76d9d700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 0x7fae64079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:50.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.691+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fae6809af30 con 0x7fae78073a00 2026-03-10T06:28:50.693 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.691+0000 7fae76d9d700 1 --2- 192.168.123.104:0/1132612145 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 0x7fae64079d20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fae60005950 tx=0x7fae6000b500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:50.694 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.693+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fae68064810 con 0x7fae78073a00 2026-03-10T06:28:50.721 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:50 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1540326017' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T06:28:50.843 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.841+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7fae7804ea50 con 0x7fae78073a00 2026-03-10T06:28:50.844 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.842+0000 7fae6ffff700 1 -- 192.168.123.104:0/1132612145 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v34) v1 ==== 107+0+4145 (secure 0 0 0) 0x7fae68063f60 con 0x7fae78073a00 2026-03-10T06:28:50.844 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:50.844 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":17,"btime":"2026-03-10T06:26:56:140776+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:56.140774+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14518},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14518":{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":14,"state":"up:active","state_seq":109,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14518,"qdb_cluster":[14518]},"id":1}]} 2026-03-10T06:28:50.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.844+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 msgr2=0x7fae64079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.844+0000 7fae7d028700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 0x7fae64079d20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fae60005950 tx=0x7fae6000b500 comp rx=0 tx=0).stop 2026-03-10T06:28:50.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 msgr2=0x7fae781a2ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:50.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 0x7fae781a2ab0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fae68000c00 tx=0x7fae680048c0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.846 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 shutdown_connections 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fae64077870 0x7fae64079d20 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae780730f0 0x7fae781a2570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 --2- 192.168.123.104:0/1132612145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae78073a00 0x7fae781a2ab0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.845+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 >> 192.168.123.104:0/1132612145 conn(0x7fae780fc000 msgr2=0x7fae78102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.846+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 shutdown_connections 2026-03-10T06:28:50.847 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:50.846+0000 7fae7d028700 1 -- 192.168.123.104:0/1132612145 wait complete. 2026-03-10T06:28:50.848 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 17 2026-03-10T06:28:50.920 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 17 2026-03-10T06:28:50.920 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 18 2026-03-10T06:28:51.073 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:51.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:50 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1540326017' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.323+0000 7f3f92dd2700 1 -- 192.168.123.104:0/1208510491 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1084e0 msgr2=0x7f3f8c1088b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.323+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/1208510491 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c1088b0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f3f7c009b00 tx=0x7f3f7c009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 -- 192.168.123.104:0/1208510491 shutdown_connections 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/1208510491 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c102950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/1208510491 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c1088b0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 -- 192.168.123.104:0/1208510491 >> 192.168.123.104:0/1208510491 conn(0x7f3f8c0fe000 msgr2=0x7f3f8c100410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 -- 192.168.123.104:0/1208510491 shutdown_connections 
2026-03-10T06:28:51.325 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.324+0000 7f3f92dd2700 1 -- 192.168.123.104:0/1208510491 wait complete. 2026-03-10T06:28:51.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f92dd2700 1 Processor -- start 2026-03-10T06:28:51.326 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f92dd2700 1 -- start start 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f92dd2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c198130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f91dd0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c198130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f92dd2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c198670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f92dd2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f8c198d50 con 0x7f3f8c1024e0 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f91dd0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c198130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35916/0 (socket says 192.168.123.104:35916) 
2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.325+0000 7f3f91dd0700 1 -- 192.168.123.104:0/3567650743 learned_addr learned my addr 192.168.123.104:0/3567650743 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f915cf700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c198670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.327 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f8c19cae0 con 0x7f3f8c1084e0 2026-03-10T06:28:51.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f91dd0700 1 -- 192.168.123.104:0/3567650743 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1084e0 msgr2=0x7f3f8c198670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f91dd0700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c198670 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f91dd0700 1 -- 192.168.123.104:0/3567650743 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f7c0097e0 con 0x7f3f8c1024e0 2026-03-10T06:28:51.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.326+0000 7f3f91dd0700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f3f8c1024e0 0x7f3f8c198130 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f3f7c006010 tx=0x7f3f7c004c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:51.328 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.327+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f7c01d070 con 0x7f3f8c1024e0 2026-03-10T06:28:51.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.327+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3f7c022470 con 0x7f3f8c1024e0 2026-03-10T06:28:51.329 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.327+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f7c00f6b0 con 0x7f3f8c1024e0 2026-03-10T06:28:51.330 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.327+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3f8c19cd60 con 0x7f3f8c1024e0 2026-03-10T06:28:51.330 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.327+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3f8c19d250 con 0x7f3f8c1024e0 2026-03-10T06:28:51.330 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.329+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3f8c10a9e0 con 0x7f3f8c1024e0 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.329+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3f7c022a80 con 0x7f3f8c1024e0 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.329+0000 7f3f82ffd700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 0x7f3f78079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.329+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f3f7c09b2d0 con 0x7f3f8c1024e0 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.330+0000 7f3f915cf700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 0x7f3f78079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.332+0000 7f3f915cf700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 0x7f3f78079d70 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f3f8c199750 tx=0x7f3f8800a300 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:51.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.333+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3f7c063a50 con 0x7f3f8c1024e0 2026-03-10T06:28:51.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.473+0000 7f3f92dd2700 1 -- 
192.168.123.104:0/3567650743 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7f3f8c199490 con 0x7f3f8c1024e0 2026-03-10T06:28:51.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.474+0000 7f3f82ffd700 1 -- 192.168.123.104:0/3567650743 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v34) v1 ==== 107+0+4996 (secure 0 0 0) 0x7f3f7c0631a0 con 0x7f3f8c1024e0 2026-03-10T06:28:51.476 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:51.476 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":18,"btime":"2026-03-10T06:26:57:942249+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:26:56.140774+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":77,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14518},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14518":{"gid":14518,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":14,"state":"up:active","state_seq":109,"addr":"192.168.123.104:6829/2419696492","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":2419696492},{"type":"v1","addr":"192.168.123.104:6829","nonce":2419696492}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14518,"qdb_cluster":[14518]},"id":1}]} 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 msgr2=0x7f3f78079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 0x7f3f78079d70 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f3f8c199750 tx=0x7f3f8800a300 comp rx=0 tx=0).stop 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 msgr2=0x7f3f8c198130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c198130 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f3f7c006010 tx=0x7f3f7c004c80 comp rx=0 tx=0).stop 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 shutdown_connections 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f3f780778c0 0x7f3f78079d70 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3f8c1024e0 0x7f3f8c198130 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 --2- 192.168.123.104:0/3567650743 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f8c1084e0 0x7f3f8c198670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 >> 192.168.123.104:0/3567650743 conn(0x7f3f8c0fe000 msgr2=0x7f3f8c0fea10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 shutdown_connections 2026-03-10T06:28:51.479 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.477+0000 7f3f92dd2700 1 -- 192.168.123.104:0/3567650743 wait complete. 
2026-03-10T06:28:51.480 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 18 2026-03-10T06:28:51.539 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 18 2026-03-10T06:28:51.539 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 19 2026-03-10T06:28:51.692 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:51.728 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:51 vm04.local ceph-mon[115743]: pgmap v184: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:51.728 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:51 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1132612145' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T06:28:51.728 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:51 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/3567650743' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.946+0000 7f9de328d700 1 -- 192.168.123.104:0/1125234481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc1066c0 msgr2=0x7f9ddc106a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.946+0000 7f9de328d700 1 --2- 192.168.123.104:0/1125234481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc106a90 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f9dcc009b00 tx=0x7f9dcc009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.947+0000 7f9de328d700 1 -- 192.168.123.104:0/1125234481 shutdown_connections 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.947+0000 7f9de328d700 1 --2- 192.168.123.104:0/1125234481 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc068490 0x7f9ddc068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.947+0000 7f9de328d700 1 --2- 192.168.123.104:0/1125234481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc106a90 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.947+0000 7f9de328d700 1 -- 192.168.123.104:0/1125234481 >> 192.168.123.104:0/1125234481 conn(0x7f9ddc0754a0 msgr2=0x7f9ddc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:51.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.948+0000 7f9de328d700 1 -- 192.168.123.104:0/1125234481 shutdown_connections 2026-03-10T06:28:51.949 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.948+0000 7f9de328d700 1 -- 192.168.123.104:0/1125234481 wait complete. 2026-03-10T06:28:51.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.948+0000 7f9de328d700 1 Processor -- start 2026-03-10T06:28:51.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de328d700 1 -- start start 2026-03-10T06:28:51.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de328d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 0x7f9ddc196180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de328d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de328d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ddc196da0 con 0x7f9ddc1066c0 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de328d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ddc19ab30 con 0x7f9ddc068490 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de1029700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 0x7f9ddc196180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de0828700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de1029700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 0x7f9ddc196180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52812/0 (socket says 192.168.123.104:52812) 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de1029700 1 -- 192.168.123.104:0/2682867322 learned_addr learned my addr 192.168.123.104:0/2682867322 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.949+0000 7f9de0828700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35932/0 (socket says 192.168.123.104:35932) 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de0828700 1 -- 192.168.123.104:0/2682867322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 msgr2=0x7f9ddc196180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de0828700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 0x7f9ddc196180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:51.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de0828700 1 -- 192.168.123.104:0/2682867322 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9dd8009710 con 0x7f9ddc1066c0 2026-03-10T06:28:51.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de0828700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9dd800eda0 tx=0x7f9dd800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:51.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dd800cd70 con 0x7f9ddc1066c0 2026-03-10T06:28:51.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9dd8004500 con 0x7f9ddc1066c0 2026-03-10T06:28:51.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dd8003ea0 con 0x7f9ddc1066c0 2026-03-10T06:28:51.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9dcc0097e0 con 0x7f9ddc1066c0 2026-03-10T06:28:51.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.950+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9ddc19b1d0 con 0x7f9ddc1066c0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.953+0000 7f9de328d700 1 -- 
192.168.123.104:0/2682867322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9ddc04ea50 con 0x7f9ddc1066c0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.955+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9dd8004000 con 0x7f9ddc1066c0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.955+0000 7f9dd27fc700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 0x7f9dc8079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.955+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f9dd8014070 con 0x7f9ddc1066c0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.957+0000 7f9de1029700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 0x7f9dc8079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.958+0000 7f9de1029700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 0x7f9dc8079e40 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f9dcc006010 tx=0x7f9dcc00b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:51.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:51.958+0000 7f9dd27fc700 1 
-- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9dd80626f0 con 0x7f9ddc1066c0 2026-03-10T06:28:52.100 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.098+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7f9ddc066e40 con 0x7f9ddc1066c0 2026-03-10T06:28:52.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.100+0000 7f9dd27fc700 1 -- 192.168.123.104:0/2682867322 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v34) v1 ==== 107+0+4191 (secure 0 0 0) 0x7f9dd8067e60 con 0x7f9ddc1066c0 2026-03-10T06:28:52.103 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:52.103 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":19,"btime":"2026-03-10T06:27:01:254608+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14526,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:01.254607+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:52.105 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.104+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 msgr2=0x7f9dc8079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:52.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.104+0000 7f9de328d700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 0x7f9dc8079e40 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f9dcc006010 tx=0x7f9dcc00b560 comp rx=0 tx=0).stop 2026-03-10T06:28:52.106 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.104+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 msgr2=0x7f9ddc1966c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:52.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.105+0000 7f9de328d700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9dd800eda0 tx=0x7f9dd800c5b0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.106 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.105+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 shutdown_connections 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.105+0000 7f9de328d700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9dc8077990 0x7f9dc8079e40 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.105+0000 7f9de328d700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ddc068490 0x7f9ddc196180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.106+0000 7f9de328d700 1 --2- 192.168.123.104:0/2682867322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9ddc1066c0 0x7f9ddc1966c0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.106+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 >> 192.168.123.104:0/2682867322 conn(0x7f9ddc0754a0 msgr2=0x7f9ddc0febf0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.106+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 shutdown_connections 2026-03-10T06:28:52.107 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.106+0000 7f9de328d700 1 -- 192.168.123.104:0/2682867322 wait complete. 2026-03-10T06:28:52.108 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 19 2026-03-10T06:28:52.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:51 vm06.local ceph-mon[98962]: pgmap v184: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:28:52.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:51 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1132612145' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T06:28:52.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:51 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/3567650743' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T06:28:52.179 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 19 2026-03-10T06:28:52.179 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 20 2026-03-10T06:28:52.332 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:52.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.598+0000 7f5d31fa6700 1 -- 192.168.123.104:0/650308118 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 msgr2=0x7f5d2c101ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:52.600 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.598+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/650308118 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c101ab0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f5d14009b00 tx=0x7f5d14009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 -- 192.168.123.104:0/650308118 shutdown_connections 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/650308118 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d2c101ff0 0x7f5d2c10a4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/650308118 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c101ab0 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 -- 192.168.123.104:0/650308118 >> 192.168.123.104:0/650308118 conn(0x7f5d2c0faf00 msgr2=0x7f5d2c0fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 -- 192.168.123.104:0/650308118 shutdown_connections 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.599+0000 7f5d31fa6700 1 -- 192.168.123.104:0/650308118 wait complete. 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 Processor -- start 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 -- start start 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:52.601 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d2c101ff0 0x7f5d2c198980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:52.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d2c199060 con 0x7f5d2c1016e0 2026-03-10T06:28:52.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d31fa6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d2c19cdf0 con 0x7f5d2c101ff0 2026-03-10T06:28:52.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d2b7fe700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c198440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:52.602 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d2b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c198440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:35956/0 (socket says 192.168.123.104:35956) 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.600+0000 7f5d2b7fe700 1 -- 192.168.123.104:0/3111620434 learned_addr learned my addr 192.168.123.104:0/3111620434 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d2b7fe700 1 -- 192.168.123.104:0/3111620434 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d2c101ff0 msgr2=0x7f5d2c198980 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d2b7fe700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d2c101ff0 0x7f5d2c198980 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d2b7fe700 1 -- 192.168.123.104:0/3111620434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d140097e0 con 0x7f5d2c1016e0 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d2b7fe700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 
0x7f5d2c198440 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f5d14005950 tx=0x7f5d14004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d1401d070 con 0x7f5d2c1016e0 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5d1400bc50 con 0x7f5d2c1016e0 2026-03-10T06:28:52.603 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.601+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d1400f860 con 0x7f5d2c1016e0 2026-03-10T06:28:52.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.604+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d2c19d070 con 0x7f5d2c1016e0 2026-03-10T06:28:52.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.604+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d2c19d530 con 0x7f5d2c1016e0 2026-03-10T06:28:52.606 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.604+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d2c10a000 con 0x7f5d2c1016e0 2026-03-10T06:28:52.609 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.607+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d1400f9c0 con 0x7f5d2c1016e0 2026-03-10T06:28:52.609 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.607+0000 7f5d28ff9700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 0x7f5d18079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:52.609 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.608+0000 7f5d2affd700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 0x7f5d18079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:52.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.608+0000 7f5d2affd700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 0x7f5d18079d70 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5d2c199a60 tx=0x7f5d1c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:52.610 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.609+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5d1400bdc0 con 0x7f5d2c1016e0 2026-03-10T06:28:52.613 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.612+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d14060d50 con 0x7f5d2c1016e0 2026-03-10T06:28:52.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.758+0000 7f5d31fa6700 1 -- 
192.168.123.104:0/3111620434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f5d2c0689d0 con 0x7f5d2c1016e0 2026-03-10T06:28:52.760 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.759+0000 7f5d28ff9700 1 -- 192.168.123.104:0/3111620434 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v34) v1 ==== 107+0+4202 (secure 0 0 0) 0x7f5d14060d50 con 0x7f5d2c1016e0 2026-03-10T06:28:52.760 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:52.760 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":20,"btime":"2026-03-10T06:27:01:264505+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor 
log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:01.264478+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:replay","state_seq":2,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:52.763 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 msgr2=0x7f5d18079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 0x7f5d18079d70 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f5d2c199a60 tx=0x7f5d1c006cb0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 msgr2=0x7f5d2c198440 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c198440 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f5d14005950 tx=0x7f5d14004990 comp rx=0 tx=0).stop 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 shutdown_connections 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5d180778c0 0x7f5d18079d70 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d2c1016e0 0x7f5d2c198440 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 --2- 192.168.123.104:0/3111620434 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d2c101ff0 0x7f5d2c198980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.762+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 >> 192.168.123.104:0/3111620434 conn(0x7f5d2c0faf00 msgr2=0x7f5d2c0ffb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:52.764 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.763+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 shutdown_connections 2026-03-10T06:28:52.764 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:52.763+0000 7f5d31fa6700 1 -- 192.168.123.104:0/3111620434 wait complete. 2026-03-10T06:28:52.765 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 20 2026-03-10T06:28:52.832 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 20 2026-03-10T06:28:52.832 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 21 2026-03-10T06:28:52.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:52 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2682867322' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T06:28:52.984 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:53.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:52 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/2682867322' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T06:28:53.242 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.240+0000 7fe98514f700 1 -- 192.168.123.104:0/3228465001 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 msgr2=0x7fe980068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.240+0000 7fe98514f700 1 --2- 192.168.123.104:0/3228465001 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980068ac0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fe968009b00 tx=0x7fe968009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 -- 192.168.123.104:0/3228465001 shutdown_connections 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 --2- 192.168.123.104:0/3228465001 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe9801051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 --2- 192.168.123.104:0/3228465001 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980068ac0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 -- 192.168.123.104:0/3228465001 >> 192.168.123.104:0/3228465001 conn(0x7fe9800754a0 msgr2=0x7fe9800758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 -- 192.168.123.104:0/3228465001 shutdown_connections 
2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.241+0000 7fe98514f700 1 -- 192.168.123.104:0/3228465001 wait complete. 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 Processor -- start 2026-03-10T06:28:53.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 -- start start 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980196220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe980196760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe980196e40 con 0x7fe9800686f0 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.242+0000 7fe98514f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe98019abd0 con 0x7fe980069000 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe980196760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 
0x7fe980196760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:52858/0 (socket says 192.168.123.104:52858) 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 -- 192.168.123.104:0/2167191763 learned_addr learned my addr 192.168.123.104:0/2167191763 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 -- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 msgr2=0x7fe980196220 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97ed9d700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980196220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:53.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980196220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 -- 192.168.123.104:0/2167191763 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9680097e0 con 0x7fe980069000 2026-03-10T06:28:53.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97ed9d700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980196220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:28:53.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe97e59c700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe980196760 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fe97000eb10 tx=0x7fe97000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:53.245 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.243+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe97000cca0 con 0x7fe980069000 2026-03-10T06:28:53.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.244+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe98019aeb0 con 0x7fe980069000 2026-03-10T06:28:53.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.244+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe97000ce00 con 0x7fe980069000 2026-03-10T06:28:53.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.244+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe970018910 con 0x7fe980069000 2026-03-10T06:28:53.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.244+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe98019b400 con 0x7fe980069000 2026-03-10T06:28:53.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.245+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe980108b30 con 0x7fe980069000 2026-03-10T06:28:53.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.247+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe970018a70 con 0x7fe980069000 2026-03-10T06:28:53.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.247+0000 7fe977fff700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 0x7fe96c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:53.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.247+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fe970014070 con 0x7fe980069000 2026-03-10T06:28:53.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.247+0000 7fe97ed9d700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 0x7fe96c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:53.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.248+0000 7fe97ed9d700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 0x7fe96c079d20 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe96800b5c0 tx=0x7fe968005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:53.250 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.249+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 
<== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe970063280 con 0x7fe980069000 2026-03-10T06:28:53.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.389+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7fe98019b040 con 0x7fe980069000 2026-03-10T06:28:53.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.390+0000 7fe977fff700 1 -- 192.168.123.104:0/2167191763 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v34) v1 ==== 107+0+4207 (secure 0 0 0) 0x7fe98019b040 con 0x7fe980069000 2026-03-10T06:28:53.393 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:53.393 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":21,"btime":"2026-03-10T06:27:06:484565+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default 
file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:06.434345+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is 
stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:reconnect","state_seq":110,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:53.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 msgr2=0x7fe96c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 0x7fe96c079d20 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe96800b5c0 tx=0x7fe968005fd0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 -- 
192.168.123.104:0/2167191763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 msgr2=0x7fe980196760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe980196760 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fe97000eb10 tx=0x7fe97000eed0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 shutdown_connections 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fe96c077870 0x7fe96c079d20 secure :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fe96800b5c0 tx=0x7fe968005fd0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.397 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe9800686f0 0x7fe980196220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 --2- 192.168.123.104:0/2167191763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe980069000 0x7fe980196760 secure :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fe97000eb10 tx=0x7fe97000eed0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 >> 192.168.123.104:0/2167191763 conn(0x7fe9800754a0 msgr2=0x7fe9800ff700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:53.398 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.395+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 shutdown_connections 2026-03-10T06:28:53.398 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.396+0000 7fe98514f700 1 -- 192.168.123.104:0/2167191763 wait complete. 2026-03-10T06:28:53.398 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 21 2026-03-10T06:28:53.466 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 21 2026-03-10T06:28:53.466 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 22 2026-03-10T06:28:53.622 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.871+0000 7fa39827c700 1 -- 192.168.123.104:0/213235295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390108790 msgr2=0x7fa390108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.871+0000 7fa395016700 1 -- 192.168.123.104:0/213235295 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3800056d0 con 0x7fa390108790 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.871+0000 7fa39827c700 1 --2- 192.168.123.104:0/213235295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390108790 0x7fa390108b60 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fa380009b00 tx=0x7fa380009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 7fa39827c700 1 -- 192.168.123.104:0/213235295 shutdown_connections 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 
7fa39827c700 1 --2- 192.168.123.104:0/213235295 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390102790 0x7fa390102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 7fa39827c700 1 --2- 192.168.123.104:0/213235295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390108790 0x7fa390108b60 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 7fa39827c700 1 -- 192.168.123.104:0/213235295 >> 192.168.123.104:0/213235295 conn(0x7fa3900fe2b0 msgr2=0x7fa3901006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:53.873 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 7fa39827c700 1 -- 192.168.123.104:0/213235295 shutdown_connections 2026-03-10T06:28:53.874 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.872+0000 7fa39827c700 1 -- 192.168.123.104:0/213235295 wait complete. 
2026-03-10T06:28:53.874 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 Processor -- start
2026-03-10T06:28:53.874 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 -- start start
2026-03-10T06:28:53.874 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 0x7fa390198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa390198f10 con 0x7fa390102790
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa39827c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa39019cca0 con 0x7fa390108790
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa396018700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa396018700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58014/0 (socket says 192.168.123.104:58014)
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa396018700 1 -- 192.168.123.104:0/1080510064 learned_addr learned my addr 192.168.123.104:0/1080510064 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.873+0000 7fa395817700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 0x7fa390198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa396018700 1 -- 192.168.123.104:0/1080510064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 msgr2=0x7fa390198830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa396018700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 0x7fa390198830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T06:28:53.875 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa396018700 1 -- 192.168.123.104:0/1080510064 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3800097e0 con 0x7fa390102790
2026-03-10T06:28:53.876 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa395817700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 0x7fa390198830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T06:28:53.876 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa396018700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fa380005850 tx=0x7fa3800049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa38001d070 con 0x7fa390102790 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa39019cf20 con 0x7fa390102790 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa39019d410 con 0x7fa390102790 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa38000bc50 con 0x7fa390102790 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.874+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa380017610 con 0x7fa390102790 2026-03-10T06:28:53.877 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.875+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa380017770 con 
0x7fa390102790 2026-03-10T06:28:53.878 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.876+0000 7fa3877fe700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 0x7fa37c0825d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:53.880 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.876+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fa38009c060 con 0x7fa390102790 2026-03-10T06:28:53.880 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.876+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa374005320 con 0x7fa390102790 2026-03-10T06:28:53.880 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.879+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa380064810 con 0x7fa390102790 2026-03-10T06:28:53.881 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.879+0000 7fa395817700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 0x7fa37c0825d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:53.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:53.882+0000 7fa395817700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 0x7fa37c0825d0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fa38c005fd0 tx=0x7fa38c005dc0 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:53 vm04.local ceph-mon[115743]: pgmap v185: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:53 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/3111620434' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T06:28:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:53 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2167191763' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T06:28:54.022 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.020+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7fa374005190 con 0x7fa390102790 2026-03-10T06:28:54.023 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.021+0000 7fa3877fe700 1 -- 192.168.123.104:0/1080510064 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v34) v1 ==== 107+0+5052 (secure 0 0 0) 0x7fa380063f60 con 0x7fa390102790 2026-03-10T06:28:54.023 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:54.023 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":22,"btime":"2026-03-10T06:27:07:859962+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file 
layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:06.873092+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:rejoin","state_seq":111,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:54.025 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.024+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 msgr2=0x7fa37c0825d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.025 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.024+0000 7fa39827c700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 0x7fa37c0825d0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fa38c005fd0 tx=0x7fa38c005dc0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.024+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 msgr2=0x7fa3901982f0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.024+0000 7fa39827c700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fa380005850 tx=0x7fa3800049e0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.024+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 shutdown_connections 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fa37c080120 0x7fa37c0825d0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa390102790 0x7fa3901982f0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 --2- 192.168.123.104:0/1080510064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa390108790 0x7fa390198830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.026 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 >> 192.168.123.104:0/1080510064 conn(0x7fa3900fe2b0 msgr2=0x7fa3900ffa00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:54.027 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 shutdown_connections 2026-03-10T06:28:54.027 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.025+0000 7fa39827c700 1 -- 192.168.123.104:0/1080510064 wait complete. 2026-03-10T06:28:54.028 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 22 2026-03-10T06:28:54.094 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 22 2026-03-10T06:28:54.095 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 23 2026-03-10T06:28:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:53 vm06.local ceph-mon[98962]: pgmap v185: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:53 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/3111620434' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T06:28:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:53 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/2167191763' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T06:28:54.246 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 -- 192.168.123.104:0/1292362193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25380686f0 msgr2=0x7f2538068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 --2- 192.168.123.104:0/1292362193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25380686f0 0x7f2538068ac0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f2520009b00 tx=0x7f2520009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 -- 192.168.123.104:0/1292362193 shutdown_connections 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 --2- 192.168.123.104:0/1292362193 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2538069000 0x7f25381051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 --2- 192.168.123.104:0/1292362193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25380686f0 0x7f2538068ac0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.513+0000 7f253e619700 1 -- 192.168.123.104:0/1292362193 >> 192.168.123.104:0/1292362193 conn(0x7f25380754a0 msgr2=0x7f25380758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:54.515 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.514+0000 7f253e619700 1 -- 192.168.123.104:0/1292362193 shutdown_connections 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.514+0000 7f253e619700 1 -- 192.168.123.104:0/1292362193 wait complete. 2026-03-10T06:28:54.515 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.514+0000 7f253e619700 1 Processor -- start 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.514+0000 7f253e619700 1 -- start start 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f253e619700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f253e619700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 0x7f25381988e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f253e619700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2538198fc0 con 0x7f2538069000 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f253e619700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f253819cd50 con 0x7f25380686f0 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:54.516 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39414/0 (socket says 192.168.123.104:39414) 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 -- 192.168.123.104:0/4177301226 learned_addr learned my addr 192.168.123.104:0/4177301226 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:54.516 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f25377fe700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 0x7f25381988e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 -- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 msgr2=0x7f25381988e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 0x7f25381988e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f2537fff700 1 -- 192.168.123.104:0/4177301226 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25200097e0 con 0x7f25380686f0 2026-03-10T06:28:54.517 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.515+0000 7f25377fe700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 0x7f25381988e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.516+0000 7f2537fff700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f25200048c0 tx=0x7f25200048f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.516+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f252001d070 con 0x7f25380686f0 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.516+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f253819cfd0 con 0x7f25380686f0 2026-03-10T06:28:54.517 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.516+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f253819d4c0 con 0x7f25380686f0 2026-03-10T06:28:54.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.517+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2520004b80 con 0x7f25380686f0 2026-03-10T06:28:54.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.517+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f252000f670 con 0x7f25380686f0 2026-03-10T06:28:54.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.518+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f252000f7d0 con 0x7f25380686f0 2026-03-10T06:28:54.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.518+0000 7f25357fa700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 0x7f2524079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:54.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.518+0000 7f25377fe700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 0x7f2524079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:54.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.518+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f252000bc50 con 0x7f25380686f0 2026-03-10T06:28:54.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.518+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f253804ea50 con 0x7f25380686f0 2026-03-10T06:28:54.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.519+0000 7f25377fe700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 0x7f2524079d70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2528005950 tx=0x7f25280058e0 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:54.523 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.522+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2520061230 con 0x7f25380686f0 2026-03-10T06:28:54.665 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.663+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7f2538199700 con 0x7f25380686f0 2026-03-10T06:28:54.666 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.664+0000 7f25357fa700 1 -- 192.168.123.104:0/4177301226 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v34) v1 ==== 107+0+5061 (secure 0 0 0) 0x7f2520061230 con 0x7f25380686f0 2026-03-10T06:28:54.666 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:54.666 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":23,"btime":"2026-03-10T06:27:08:862428+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22},{"gid":44217,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1147710747","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1147710747},{"type":"v1","addr":"192.168.123.106:6825","nonce":1147710747}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:08.862427+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:active","state_seq":112,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14526,"qdb_cluster":[14526]},"id":1}]} 2026-03-10T06:28:54.668 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.667+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 msgr2=0x7f2524079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.667+0000 7f253e619700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 0x7f2524079d70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2528005950 tx=0x7f25280058e0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.667+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 msgr2=0x7f25381983a0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.667+0000 7f253e619700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f25200048c0 tx=0x7f25200048f0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 shutdown_connections 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f25240778c0 0x7f2524079d70 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25380686f0 0x7f25381983a0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 --2- 192.168.123.104:0/4177301226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2538069000 0x7f25381988e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 >> 192.168.123.104:0/4177301226 conn(0x7f25380754a0 msgr2=0x7f25380fe9d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:54.669 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 shutdown_connections 2026-03-10T06:28:54.670 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:54.668+0000 7f253e619700 1 -- 192.168.123.104:0/4177301226 wait complete. 2026-03-10T06:28:54.670 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 23 2026-03-10T06:28:54.717 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 23 2026-03-10T06:28:54.717 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 24 2026-03-10T06:28:54.800 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:54 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1080510064' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T06:28:54.800 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:54 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/4177301226' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T06:28:54.887 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:55.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:54 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1080510064' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T06:28:55.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:54 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/4177301226' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.147+0000 7f5a16e8b700 1 -- 192.168.123.104:0/117811767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101024e0 msgr2=0x7f5a10102950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.147+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/117811767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101024e0 0x7f5a10102950 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f5a0c009b00 tx=0x7f5a0c009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 -- 192.168.123.104:0/117811767 shutdown_connections 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/117811767 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101024e0 0x7f5a10102950 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/117811767 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101084e0 0x7f5a101088b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 -- 192.168.123.104:0/117811767 >> 192.168.123.104:0/117811767 conn(0x7f5a100fe000 msgr2=0x7f5a10100410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:55.149 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 -- 192.168.123.104:0/117811767 shutdown_connections 2026-03-10T06:28:55.149 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 -- 192.168.123.104:0/117811767 wait complete. 2026-03-10T06:28:55.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.148+0000 7f5a16e8b700 1 Processor -- start 2026-03-10T06:28:55.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a16e8b700 1 -- start start 2026-03-10T06:28:55.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a16e8b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a16e8b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 0x7f5a10198670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.150 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a16e8b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a10198d50 con 0x7f5a101084e0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a16e8b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a1019cae0 con 0x7f5a101024e0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a15e89700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a15e89700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39426/0 (socket says 192.168.123.104:39426) 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a15e89700 1 -- 192.168.123.104:0/1350271688 learned_addr learned my addr 192.168.123.104:0/1350271688 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.149+0000 7f5a15688700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 0x7f5a10198670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a15e89700 1 -- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 msgr2=0x7f5a10198670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a15e89700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 0x7f5a10198670 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a15e89700 1 -- 192.168.123.104:0/1350271688 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a0c0097e0 con 0x7f5a101024e0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a15e89700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto 
rx=0x7f5a0400d8d0 tx=0x7f5a0400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a15688700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 0x7f5a10198670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:28:55.151 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a04009940 con 0x7f5a101024e0 2026-03-10T06:28:55.152 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a1019cdc0 con 0x7f5a101024e0 2026-03-10T06:28:55.152 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.150+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a1019d310 con 0x7f5a101024e0 2026-03-10T06:28:55.152 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.151+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5a04010460 con 0x7f5a101024e0 2026-03-10T06:28:55.152 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.151+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a0400f5d0 con 0x7f5a101024e0 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.152+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 4 ==== 
mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5a0400f7e0 con 0x7f5a101024e0 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.152+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59f4005320 con 0x7f5a101024e0 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.152+0000 7f5a02ffd700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 0x7f59fc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.152+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5a04099f20 con 0x7f5a101024e0 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.152+0000 7f5a15688700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 0x7f59fc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.154 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.153+0000 7f5a15688700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 0x7f59fc079d70 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f5a0c00b5c0 tx=0x7f5a0c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:55.156 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.155+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 6 
==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5a04061fd0 con 0x7f5a101024e0 2026-03-10T06:28:55.304 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.302+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f59f4005190 con 0x7f5a101024e0 2026-03-10T06:28:55.306 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.304+0000 7f5a02ffd700 1 -- 192.168.123.104:0/1350271688 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v34) v1 ==== 107+0+4278 (secure 0 0 0) 0x7f5a04061df0 con 0x7f5a101024e0 2026-03-10T06:28:55.306 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:55.306 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":24,"btime":"2026-03-10T06:27:12:247203+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir 
inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:08.862427+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:active","state_seq":112,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14526,"qdb_cluster":[14526]},"id":1}]} 2026-03-10T06:28:55.308 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 msgr2=0x7f59fc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 0x7f59fc079d70 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f5a0c00b5c0 tx=0x7f5a0c005fb0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 
7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 msgr2=0x7f5a10198130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f5a0400d8d0 tx=0x7f5a0400dc90 comp rx=0 tx=0).stop 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 shutdown_connections 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f59fc0778c0 0x7f59fc079d70 secure :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f5a0c00b5c0 tx=0x7f5a0c005fb0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5a101024e0 0x7f5a10198130 secure :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f5a0400d8d0 tx=0x7f5a0400dc90 comp rx=0 tx=0).stop 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 --2- 192.168.123.104:0/1350271688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a101084e0 0x7f5a10198670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.307+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 >> 192.168.123.104:0/1350271688 conn(0x7f5a100fe000 msgr2=0x7f5a100fea20 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.308+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 shutdown_connections 2026-03-10T06:28:55.309 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.308+0000 7f5a16e8b700 1 -- 192.168.123.104:0/1350271688 wait complete. 2026-03-10T06:28:55.310 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 24 2026-03-10T06:28:55.359 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 24 2026-03-10T06:28:55.359 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 25 2026-03-10T06:28:55.508 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:55.769 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.764+0000 7fc4eada6700 1 -- 192.168.123.104:0/1530722124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 msgr2=0x7fc4e41111c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.769 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.764+0000 7fc4eada6700 1 --2- 192.168.123.104:0/1530722124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e41111c0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fc4d8009b00 tx=0x7fc4d8009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:55.770 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.769+0000 7fc4eada6700 1 -- 192.168.123.104:0/1530722124 shutdown_connections 2026-03-10T06:28:55.770 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.769+0000 7fc4eada6700 1 --2- 192.168.123.104:0/1530722124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e41111c0 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.770 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.769+0000 7fc4eada6700 1 --2- 192.168.123.104:0/1530722124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e4073420 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.770 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.769+0000 7fc4eada6700 1 -- 192.168.123.104:0/1530722124 >> 192.168.123.104:0/1530722124 conn(0x7fc4e4078580 msgr2=0x7fc4e4078980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:55.773 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.772+0000 7fc4eada6700 1 -- 192.168.123.104:0/1530722124 shutdown_connections 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.772+0000 7fc4eada6700 1 -- 192.168.123.104:0/1530722124 wait complete. 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.772+0000 7fc4eada6700 1 Processor -- start 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.772+0000 7fc4eada6700 1 -- start start 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4eada6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4eada6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e406d8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4eada6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4e406dec0 con 0x7fc4e4073960 2026-03-10T06:28:55.774 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4eada6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4e406e030 con 0x7fc4e4073050 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4e8b42700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4e8b42700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39444/0 (socket says 192.168.123.104:39444) 2026-03-10T06:28:55.774 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4e8b42700 1 -- 192.168.123.104:0/2541833502 learned_addr learned my addr 192.168.123.104:0/2541833502 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:55.775 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.773+0000 7fc4e3fff700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e406d8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.775 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.774+0000 7fc4e8b42700 1 -- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 msgr2=0x7fc4e406d8f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.775 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.774+0000 7fc4e8b42700 1 --2- 
192.168.123.104:0/2541833502 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e406d8f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.775 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.774+0000 7fc4e8b42700 1 -- 192.168.123.104:0/2541833502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4d80097e0 con 0x7fc4e4073050 2026-03-10T06:28:55.775 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.774+0000 7fc4e3fff700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e406d8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:28:55.776 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.775+0000 7fc4e8b42700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc4d400eb10 tx=0x7fc4d400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:55.777 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.776+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4d400cca0 con 0x7fc4e4073050 2026-03-10T06:28:55.777 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.776+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc4d400ce00 con 0x7fc4e4073050 2026-03-10T06:28:55.778 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.776+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7fc4d4018910 con 0x7fc4e4073050 2026-03-10T06:28:55.778 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.776+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4e406e290 con 0x7fc4e4073050 2026-03-10T06:28:55.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.776+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4e4104990 con 0x7fc4e4073050 2026-03-10T06:28:55.779 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.777+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc4e404ea50 con 0x7fc4e4073050 2026-03-10T06:28:55.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.780+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc4d4018a70 con 0x7fc4e4073050 2026-03-10T06:28:55.781 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.780+0000 7fc4e1ffb700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc4cc077760 0x7fc4cc079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:55.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.780+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fc4d4014070 con 0x7fc4e4073050 2026-03-10T06:28:55.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.780+0000 7fc4e3fff700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] 
conn(0x7fc4cc077760 0x7fc4cc079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:55.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.781+0000 7fc4e3fff700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc4cc077760 0x7fc4cc079c10 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fc4d8006010 tx=0x7fc4d8005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:55.782 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.781+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc4d4062ea0 con 0x7fc4e4073050 2026-03-10T06:28:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:55 vm04.local ceph-mon[115743]: pgmap v186: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:55 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/1350271688' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T06:28:55.933 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.931+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7fc4e406ee20 con 0x7fc4e4073050 2026-03-10T06:28:55.934 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.932+0000 7fc4e1ffb700 1 -- 192.168.123.104:0/2541833502 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v34) v1 ==== 107+0+5129 (secure 0 0 0) 0x7fc4d40625f0 con 0x7fc4e4073050 2026-03-10T06:28:55.934 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:55.934 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":25,"btime":"2026-03-10T06:27:15:263335+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:08.862427+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14526},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14526":{"gid":14526,"name":"cephfs.vm06.afscws","rank":0,"incarnation":20,"state":"up:active","state_seq":112,"addr":"192.168.123.106:6827/3120742985","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3120742985},{"type":"v1","addr":"192.168.123.106:6827","nonce":3120742985}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14526,"qdb_cluster":[14526]},"id":1}]} 2026-03-10T06:28:55.936 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc4cc077760 msgr2=0x7fc4cc079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc4cc077760 0x7fc4cc079c10 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fc4d8006010 tx=0x7fc4d8005c00 comp rx=0 tx=0).stop 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 msgr2=0x7fc4e40728f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc4d400eb10 tx=0x7fc4d400eed0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 shutdown_connections 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fc4cc077760 0x7fc4cc079c10 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4e4073050 0x7fc4e40728f0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 --2- 192.168.123.104:0/2541833502 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc4e4073960 0x7fc4e406d8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.935+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 >> 192.168.123.104:0/2541833502 conn(0x7fc4e4078580 msgr2=0x7fc4e4102e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.936+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 shutdown_connections 2026-03-10T06:28:55.937 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:55.936+0000 7fc4eada6700 1 -- 192.168.123.104:0/2541833502 wait complete. 2026-03-10T06:28:55.938 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 25 2026-03-10T06:28:55.981 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 25 2026-03-10T06:28:55.981 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 26 2026-03-10T06:28:56.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:55 vm06.local ceph-mon[98962]: pgmap v186: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:56.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:55 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1350271688' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T06:28:56.133 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.402+0000 7f4f114ee700 1 -- 192.168.123.104:0/322497576 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c108810 msgr2=0x7f4f0c108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.402+0000 7f4f114ee700 1 --2- 192.168.123.104:0/322497576 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c108810 0x7f4f0c108be0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f4ef4009b00 tx=0x7f4ef4009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 -- 192.168.123.104:0/322497576 shutdown_connections 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 --2- 192.168.123.104:0/322497576 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c102810 0x7f4f0c102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 --2- 192.168.123.104:0/322497576 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c108810 0x7f4f0c108be0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 -- 192.168.123.104:0/322497576 >> 192.168.123.104:0/322497576 conn(0x7f4f0c0fe330 msgr2=0x7f4f0c100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:56.405 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 -- 192.168.123.104:0/322497576 shutdown_connections 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.403+0000 7f4f114ee700 1 -- 192.168.123.104:0/322497576 wait complete. 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 Processor -- start 2026-03-10T06:28:56.405 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 -- start start 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 0x7f4f0c075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f0c0793f0 con 0x7f4f0c102810 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f114ee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f0c075ce0 con 0x7f4f0c108810 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f0a7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:56.406 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f0a7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39474/0 (socket says 192.168.123.104:39474) 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.404+0000 7f4f0a7fc700 1 -- 192.168.123.104:0/2763736900 learned_addr learned my addr 192.168.123.104:0/2763736900 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0affd700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 0x7f4f0c075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:56.406 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0a7fc700 1 -- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 msgr2=0x7f4f0c075260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:56.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0a7fc700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 0x7f4f0c075260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0a7fc700 1 -- 192.168.123.104:0/2763736900 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ef40097e0 con 0x7f4f0c108810 2026-03-10T06:28:56.407 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0affd700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 0x7f4f0c075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T06:28:56.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f0a7fc700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f4efc00eb10 tx=0x7f4efc00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:56.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4efc00cca0 con 0x7f4f0c108810 2026-03-10T06:28:56.407 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.405+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4f0c075fc0 con 0x7f4f0c108810 2026-03-10T06:28:56.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.406+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4f0c1a6c20 con 0x7f4f0c108810 2026-03-10T06:28:56.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.406+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4efc00ce00 con 0x7f4f0c108810 2026-03-10T06:28:56.408 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.407+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4efc0105e0 con 0x7f4f0c108810 2026-03-10T06:28:56.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.407+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4f0c04ea50 con 0x7f4f0c108810 2026-03-10T06:28:56.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.408+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4efc010ca0 con 0x7f4f0c108810 2026-03-10T06:28:56.409 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.408+0000 7f4f03fff700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 0x7f4ef8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:56.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.408+0000 7f4f0affd700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 0x7f4ef8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:56.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.408+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f4efc014070 con 0x7f4f0c108810 2026-03-10T06:28:56.410 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.409+0000 7f4f0affd700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 0x7f4ef8079d70 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f4ef4006010 tx=0x7f4ef400b580 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:56.412 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.410+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4efc062d20 con 0x7f4f0c108810 2026-03-10T06:28:56.550 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.549+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f4f0c076830 con 0x7f4f0c108810 2026-03-10T06:28:56.553 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.550+0000 7f4f03fff700 1 -- 192.168.123.104:0/2763736900 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v34) v1 ==== 107+0+4324 (secure 0 0 0) 0x7f4efc062470 con 0x7f4f0c108810 2026-03-10T06:28:56.553 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:56.553 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":26,"btime":"2026-03-10T06:27:18:714365+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:18.714363+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:56.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 msgr2=0x7f4ef8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:56.555 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 0x7f4ef8079d70 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f4ef4006010 tx=0x7f4ef400b580 comp rx=0 tx=0).stop 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 msgr2=0x7f4f0c0757a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f4efc00eb10 tx=0x7f4efc00eed0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 shutdown_connections 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.554+0000 7f4f114ee700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f4ef80778c0 0x7f4ef8079d70 unknown :-1 
s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.555+0000 7f4f114ee700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f0c102810 0x7f4f0c075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.555+0000 7f4f114ee700 1 --2- 192.168.123.104:0/2763736900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4f0c108810 0x7f4f0c0757a0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.555+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 >> 192.168.123.104:0/2763736900 conn(0x7f4f0c0fe330 msgr2=0x7f4f0c0ffa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.555+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 shutdown_connections 2026-03-10T06:28:56.556 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:56.555+0000 7f4f114ee700 1 -- 192.168.123.104:0/2763736900 wait complete. 2026-03-10T06:28:56.557 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 26 2026-03-10T06:28:56.598 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 26 2026-03-10T06:28:56.599 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 27 2026-03-10T06:28:56.762 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:56.802 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:56 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/2541833502' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T06:28:56.802 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:56 vm04.local ceph-mon[115743]: pgmap v187: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:56.802 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:56 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2763736900' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T06:28:57.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.039+0000 7f988fb3f700 1 -- 192.168.123.104:0/846322332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9888068490 msgr2=0x7f9888068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.041 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.039+0000 7f988fb3f700 1 --2- 192.168.123.104:0/846322332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9888068490 0x7f9888068900 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f9884009b50 tx=0x7f9884009e60 comp rx=0 tx=0).stop 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 -- 192.168.123.104:0/846322332 shutdown_connections 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 --2- 192.168.123.104:0/846322332 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9888068490 0x7f9888068900 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 --2- 192.168.123.104:0/846322332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f98881013a0 0x7f9888101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 -- 192.168.123.104:0/846322332 >> 192.168.123.104:0/846322332 conn(0x7f98880754a0 msgr2=0x7f98880758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 -- 192.168.123.104:0/846322332 shutdown_connections 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.040+0000 7f988fb3f700 1 -- 192.168.123.104:0/846322332 wait complete. 2026-03-10T06:28:57.042 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.041+0000 7f988fb3f700 1 Processor -- start 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.041+0000 7f988fb3f700 1 -- start start 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.041+0000 7f988fb3f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 0x7f98881982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988fb3f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988fb3f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9888198f10 con 0x7f98881013a0 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988fb3f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f988819cca0 con 0x7f9888068490 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58104/0 (socket says 192.168.123.104:58104) 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 -- 192.168.123.104:0/182312591 learned_addr learned my addr 192.168.123.104:0/182312591 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d8db700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 0x7f98881982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:57.043 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 -- 192.168.123.104:0/182312591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 msgr2=0x7f98881982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 0x7f98881982f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 
-- 192.168.123.104:0/182312591 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98840097e0 con 0x7f98881013a0 2026-03-10T06:28:57.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f988d0da700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f9884005f50 tx=0x7f9884004ef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:57.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.042+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f988401d070 con 0x7f98881013a0 2026-03-10T06:28:57.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.043+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9884022470 con 0x7f98881013a0 2026-03-10T06:28:57.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.043+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f988400f7d0 con 0x7f98881013a0 2026-03-10T06:28:57.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.043+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f988819cf20 con 0x7f98881013a0 2026-03-10T06:28:57.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.043+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f988819d330 con 0x7f98881013a0 2026-03-10T06:28:57.046 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.043+0000 7f988d8db700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 0x7f98881982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:28:57.047 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.044+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f988804ea50 con 0x7f98881013a0 2026-03-10T06:28:57.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.047+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9884022ac0 con 0x7f98881013a0 2026-03-10T06:28:57.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.048+0000 7f987effd700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 0x7f9874079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.048+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f988409bc00 con 0x7f98881013a0 2026-03-10T06:28:57.049 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.048+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98840cba90 con 0x7f98881013a0 2026-03-10T06:28:57.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.048+0000 7f988d8db700 1 --2- 192.168.123.104:0/182312591 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 0x7f9874079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:57.051 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.049+0000 7f988d8db700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 0x7f9874079dc0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9878009e20 tx=0x7f9878009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:56 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2541833502' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T06:28:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:56 vm06.local ceph-mon[98962]: pgmap v187: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:28:57.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:56 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/2763736900' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T06:28:57.191 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.189+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f9888066e40 con 0x7f98881013a0 2026-03-10T06:28:57.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.190+0000 7f987effd700 1 -- 192.168.123.104:0/182312591 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v34) v1 ==== 107+0+4403 (secure 0 0 0) 0x7f98840644c0 con 0x7f98881013a0 2026-03-10T06:28:57.192 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:57.192 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":27,"btime":"2026-03-10T06:27:18:719205+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:18.719203+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:57.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.193+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 msgr2=0x7f9874079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.193+0000 7f988fb3f700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 0x7f9874079dc0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9878009e20 
tx=0x7f9878009450 comp rx=0 tx=0).stop 2026-03-10T06:28:57.194 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.193+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 msgr2=0x7f9888198830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.193+0000 7f988fb3f700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f9884005f50 tx=0x7f9884004ef0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.193+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 shutdown_connections 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f9874077910 0x7f9874079dc0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9888068490 0x7f98881982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 --2- 192.168.123.104:0/182312591 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98881013a0 0x7f9888198830 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 >> 192.168.123.104:0/182312591 
conn(0x7f98880754a0 msgr2=0x7f98880fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 shutdown_connections 2026-03-10T06:28:57.195 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.194+0000 7f988fb3f700 1 -- 192.168.123.104:0/182312591 wait complete. 2026-03-10T06:28:57.196 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 27 2026-03-10T06:28:57.243 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 27 2026-03-10T06:28:57.243 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 28 2026-03-10T06:28:57.393 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.653+0000 7fda3c2fb700 1 -- 192.168.123.104:0/8297559 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 msgr2=0x7fda34102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.653+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/8297559 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34102c80 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fda24009b00 tx=0x7fda24009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 -- 192.168.123.104:0/8297559 shutdown_connections 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/8297559 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34102c80 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/8297559 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 -- 192.168.123.104:0/8297559 >> 192.168.123.104:0/8297559 conn(0x7fda340fe330 msgr2=0x7fda34100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 -- 192.168.123.104:0/8297559 shutdown_connections 2026-03-10T06:28:57.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 -- 192.168.123.104:0/8297559 wait complete. 2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.655+0000 7fda3c2fb700 1 Processor -- start 2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3c2fb700 1 -- start start 2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3c2fb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34198400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3c2fb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3c2fb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda34199020 con 0x7fda34102810 
2026-03-10T06:28:57.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3c2fb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda3419cdb0 con 0x7fda34108810 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda39896700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda39896700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39514/0 (socket says 192.168.123.104:39514) 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda39896700 1 -- 192.168.123.104:0/2069316723 learned_addr learned my addr 192.168.123.104:0/2069316723 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.656+0000 7fda3a097700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34198400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda39896700 1 -- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 msgr2=0x7fda34198400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 
7fda39896700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34198400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda39896700 1 -- 192.168.123.104:0/2069316723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda240097e0 con 0x7fda34108810 2026-03-10T06:28:57.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda39896700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fda2400c010 tx=0x7fda2400ba00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:57.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda2401d070 con 0x7fda34108810 2026-03-10T06:28:57.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda3419d030 con 0x7fda34108810 2026-03-10T06:28:57.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.657+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda3419d520 con 0x7fda34108810 2026-03-10T06:28:57.660 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.658+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fda3404ea50 con 0x7fda34108810 2026-03-10T06:28:57.660 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.658+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fda2400f460 con 0x7fda34108810 2026-03-10T06:28:57.660 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.658+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda24021620 con 0x7fda34108810 2026-03-10T06:28:57.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.659+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fda2400f5d0 con 0x7fda34108810 2026-03-10T06:28:57.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.660+0000 7fda2b7fe700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:57.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.660+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fda2409b3c0 con 0x7fda34108810 2026-03-10T06:28:57.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.660+0000 7fda3a097700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:57.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.661+0000 7fda3a097700 1 --2- 192.168.123.104:0/2069316723 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 0x7fda20079d70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fda34103950 tx=0x7fda30009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:57.663 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.662+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fda24063bc0 con 0x7fda34108810 2026-03-10T06:28:57.760 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:57 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/182312591' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T06:28:57.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.806+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7fda34066e40 con 0x7fda34108810 2026-03-10T06:28:57.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.807+0000 7fda2b7fe700 1 -- 192.168.123.104:0/2069316723 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v34) v1 ==== 107+0+4406 (secure 0 0 0) 0x7fda24063310 con 0x7fda34108810 2026-03-10T06:28:57.809 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:57.809 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":28,"btime":"2026-03-10T06:27:23:235296+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:23.185160+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:reconnect","state_seq":8,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:57.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.810+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 msgr2=0x7fda20079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.810+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 0x7fda20079d70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fda34103950 tx=0x7fda30009450 comp rx=0 tx=0).stop 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.810+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 msgr2=0x7fda34198940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.810+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fda2400c010 tx=0x7fda2400ba00 comp rx=0 tx=0).stop 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 shutdown_connections 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda34102810 0x7fda34198400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.812 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 --2- 192.168.123.104:0/2069316723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda34108810 0x7fda34198940 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:57.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 >> 192.168.123.104:0/2069316723 conn(0x7fda340fe330 msgr2=0x7fda340ff9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:57.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.811+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 shutdown_connections 2026-03-10T06:28:57.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:57.812+0000 7fda3c2fb700 1 -- 192.168.123.104:0/2069316723 wait complete. 2026-03-10T06:28:57.814 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 28 2026-03-10T06:28:57.880 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 28 2026-03-10T06:28:57.880 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 29 2026-03-10T06:28:58.037 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:57 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/182312591' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.310+0000 7f2dd18d4700 1 -- 192.168.123.104:0/2350287542 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc108780 msgr2=0x7f2dcc108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.310+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/2350287542 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc108780 0x7f2dcc108b50 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f2db4009b00 tx=0x7f2db4009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 -- 192.168.123.104:0/2350287542 shutdown_connections 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/2350287542 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc102780 0x7f2dcc102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/2350287542 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc108780 0x7f2dcc108b50 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 -- 192.168.123.104:0/2350287542 >> 192.168.123.104:0/2350287542 conn(0x7f2dcc0fe280 msgr2=0x7f2dcc100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 -- 192.168.123.104:0/2350287542 shutdown_connections 
2026-03-10T06:28:58.312 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 -- 192.168.123.104:0/2350287542 wait complete. 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 Processor -- start 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.311+0000 7f2dd18d4700 1 -- start start 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dd18d4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 0x7f2dcc198350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dd18d4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 0x7f2dcc198890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dd18d4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2dcc198f70 con 0x7f2dcc102780 2026-03-10T06:28:58.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dd18d4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2dcc19cd00 con 0x7f2dcc108780 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 0x7f2dcc198350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 
0x7f2dcc198350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58116/0 (socket says 192.168.123.104:58116) 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 -- 192.168.123.104:0/240999519 learned_addr learned my addr 192.168.123.104:0/240999519 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 -- 192.168.123.104:0/240999519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 msgr2=0x7f2dcc198890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dca7fc700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 0x7f2dcc198890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 0x7f2dcc198890 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dcaffd700 1 -- 192.168.123.104:0/240999519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2db40097e0 con 0x7f2dcc102780 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.312+0000 7f2dca7fc700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 0x7f2dcc198890 unknown :-1 s=CLOSED pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dcaffd700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 0x7f2dcc198350 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f2db4005f50 tx=0x7f2db4004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2db401d070 con 0x7f2dcc102780 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2db400bc50 con 0x7f2dcc102780 2026-03-10T06:28:58.314 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2dcc19cf80 con 0x7f2dcc102780 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2db400f700 con 0x7f2dcc102780 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.313+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2dcc19d470 con 0x7f2dcc102780 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.315+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 4 
==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2db400f8a0 con 0x7f2dcc102780 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.315+0000 7f2dd08d2700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 0x7f2db8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.315+0000 7f2dca7fc700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 0x7f2db8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.315+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f2db409c250 con 0x7f2dcc102780 2026-03-10T06:28:58.317 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.316+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2dcc04ea50 con 0x7f2dcc102780 2026-03-10T06:28:58.320 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.317+0000 7f2dca7fc700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 0x7f2db8079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f2dbc005fd0 tx=0x7f2dbc005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:58.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.320+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 6 
==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2db4064a80 con 0x7f2dcc102780 2026-03-10T06:28:58.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.465+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7f2dcc066e40 con 0x7f2dcc102780 2026-03-10T06:28:58.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.465+0000 7f2dd08d2700 1 -- 192.168.123.104:0/240999519 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v34) v1 ==== 107+0+4403 (secure 0 0 0) 0x7f2db4027070 con 0x7f2dcc102780 2026-03-10T06:28:58.467 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:58.467 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":29,"btime":"2026-03-10T06:27:24:238653+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode 
in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:23.244808+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:rejoin","state_seq":9,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.468+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 msgr2=0x7f2db8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.468+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 0x7f2db8079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f2dbc005fd0 
tx=0x7f2dbc005dc0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.468+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 msgr2=0x7f2dcc198350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 0x7f2dcc198350 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f2db4005f50 tx=0x7f2db4004970 comp rx=0 tx=0).stop 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 shutdown_connections 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2db80778c0 0x7f2db8079d70 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2dcc102780 0x7f2dcc198350 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 --2- 192.168.123.104:0/240999519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2dcc108780 0x7f2dcc198890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 >> 192.168.123.104:0/240999519 
conn(0x7f2dcc0fe280 msgr2=0x7f2dcc0ff9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:58.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 shutdown_connections 2026-03-10T06:28:58.471 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.469+0000 7f2dd18d4700 1 -- 192.168.123.104:0/240999519 wait complete. 2026-03-10T06:28:58.471 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 29 2026-03-10T06:28:58.517 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 29 2026-03-10T06:28:58.517 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 30 2026-03-10T06:28:58.684 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:58.949 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.947+0000 7f7834d7a700 1 -- 192.168.123.104:0/1302458103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830108780 msgr2=0x7f7830108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.947+0000 7f7834d7a700 1 --2- 192.168.123.104:0/1302458103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830108780 0x7f7830108b50 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f7820009b00 tx=0x7f7820009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 -- 192.168.123.104:0/1302458103 shutdown_connections 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 --2- 192.168.123.104:0/1302458103 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830102780 0x7f7830102bf0 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 --2- 192.168.123.104:0/1302458103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830108780 0x7f7830108b50 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 -- 192.168.123.104:0/1302458103 >> 192.168.123.104:0/1302458103 conn(0x7f78300fe280 msgr2=0x7f7830100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 -- 192.168.123.104:0/1302458103 shutdown_connections 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.948+0000 7f7834d7a700 1 -- 192.168.123.104:0/1302458103 wait complete. 2026-03-10T06:28:58.950 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 Processor -- start 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 -- start start 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f783019d6e0 con 
0x7f7830102780 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7834d7a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7830078eb0 con 0x7f7830078500 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f782e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7827fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7827fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39554/0 (socket says 192.168.123.104:39554) 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f7827fff700 1 -- 192.168.123.104:0/110887747 learned_addr learned my addr 192.168.123.104:0/110887747 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.949+0000 7f782e59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:58136/0 (socket says 192.168.123.104:58136) 
2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f782e59c700 1 -- 192.168.123.104:0/110887747 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 msgr2=0x7f7830078970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f782e59c700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f782e59c700 1 -- 192.168.123.104:0/110887747 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78200097e0 con 0x7f7830102780 2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f7827fff700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T06:28:58.951 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f782e59c700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f7830103c10 tx=0x7f7820004910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:58.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f782001d070 con 0x7f7830102780 2026-03-10T06:28:58.952 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f7834d7a700 1 -- 192.168.123.104:0/110887747 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7830079050 con 0x7f7830102780 2026-03-10T06:28:58.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.950+0000 7f7834d7a700 1 -- 192.168.123.104:0/110887747 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7830198b50 con 0x7f7830102780 2026-03-10T06:28:58.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.952+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7820022470 con 0x7f7830102780 2026-03-10T06:28:58.953 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.952+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7814005320 con 0x7f7830102780 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.953+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 
0 0 0) 0x7f782000f690 con 0x7f7830102780 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.953+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f782000f830 con 0x7f7830102780 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.953+0000 7f78277fe700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 0x7f7810079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.953+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f782009c090 con 0x7f7830102780 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.956+0000 7f7827fff700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 0x7f7810079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.956+0000 7f7827fff700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 0x7f7810079be0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f781800f4d0 tx=0x7f7818005f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:58.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:58.956+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f7820064940 con 0x7f7830102780 2026-03-10T06:28:59.095 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:58 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2069316723' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T06:28:59.095 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:58 vm04.local ceph-mon[115743]: pgmap v188: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:59.095 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:58 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/240999519' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T06:28:59.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.093+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7f78140059f0 con 0x7f7830102780 2026-03-10T06:28:59.098 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.096+0000 7f78277fe700 1 -- 192.168.123.104:0/110887747 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v34) v1 ==== 107+0+5264 (secure 0 0 0) 0x7f7820064090 con 0x7f7830102780 2026-03-10T06:28:59.098 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:59.098 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":30,"btime":"2026-03-10T06:27:25:245636+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44249,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3728010036","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3728010036},{"type":"v1","addr":"192.168.123.106:6827","nonce":3728010036}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:25.245632+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34266,"qdb_cluster":[34266]},"id":1}]} 2026-03-10T06:28:59.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.099+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 msgr2=0x7f7810079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.099+0000 7f78257fa700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 0x7f7810079be0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f781800f4d0 tx=0x7f7818005f90 comp rx=0 tx=0).stop 2026-03-10T06:28:59.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.099+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 msgr2=0x7f783019d090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.099+0000 7f78257fa700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f7830103c10 tx=0x7f7820004910 comp rx=0 tx=0).stop 2026-03-10T06:28:59.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.100+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 shutdown_connections 2026-03-10T06:28:59.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.100+0000 7f78257fa700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f7810077730 0x7f7810079be0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.100+0000 7f78257fa700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7830102780 0x7f783019d090 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.100+0000 7f78257fa700 1 --2- 192.168.123.104:0/110887747 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7830078500 0x7f7830078970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.101+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 >> 192.168.123.104:0/110887747 conn(0x7f78300fe280 msgr2=0x7f78300ffd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:59.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.106+0000 7f78257fa700 1 -- 
192.168.123.104:0/110887747 shutdown_connections 2026-03-10T06:28:59.110 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.108+0000 7f78257fa700 1 -- 192.168.123.104:0/110887747 wait complete. 2026-03-10T06:28:59.110 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 30 2026-03-10T06:28:59.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:58 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2069316723' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T06:28:59.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:58 vm06.local ceph-mon[98962]: pgmap v188: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:28:59.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:58 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/240999519' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T06:28:59.162 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 30 2026-03-10T06:28:59.162 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 31 2026-03-10T06:28:59.307 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:59.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.558+0000 7f47e4a87700 1 -- 192.168.123.104:0/1020291240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 msgr2=0x7f47e0068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.558+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1020291240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0068900 secure :-1 s=READY 
pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f47d0009b00 tx=0x7f47d0009e10 comp rx=0 tx=0).stop 2026-03-10T06:28:59.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 -- 192.168.123.104:0/1020291240 shutdown_connections 2026-03-10T06:28:59.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1020291240 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0068900 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.560 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1020291240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 -- 192.168.123.104:0/1020291240 >> 192.168.123.104:0/1020291240 conn(0x7f47e00754a0 msgr2=0x7f47e00758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:59.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 -- 192.168.123.104:0/1020291240 shutdown_connections 2026-03-10T06:28:59.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.559+0000 7f47e4a87700 1 -- 192.168.123.104:0/1020291240 wait complete. 
2026-03-10T06:28:59.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 Processor -- start 2026-03-10T06:28:59.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 -- start start 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0198330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f47e0198f50 con 0x7f47e0068490 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.560+0000 7f47e4a87700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f47e019cce0 con 0x7f47e01013a0 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.104:39568/0 (socket says 192.168.123.104:39568) 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 -- 192.168.123.104:0/1238919299 learned_addr learned my addr 192.168.123.104:0/1238919299 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47de59c700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0198330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 -- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 msgr2=0x7f47e0198330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0198330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 -- 192.168.123.104:0/1238919299 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f47d00097e0 con 0x7f47e01013a0 2026-03-10T06:28:59.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47de59c700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T06:28:59.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.561+0000 7f47ddd9b700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f47d000b5c0 tx=0x7f47d0004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:59.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.562+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47d001d070 con 0x7f47e01013a0 2026-03-10T06:28:59.563 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.562+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f47d0004b90 con 0x7f47e01013a0 2026-03-10T06:28:59.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.562+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f47e019cf60 con 0x7f47e01013a0 2026-03-10T06:28:59.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.562+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47d000f7c0 con 0x7f47e01013a0 2026-03-10T06:28:59.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.562+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f47e019d450 con 0x7f47e01013a0 2026-03-10T06:28:59.564 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.563+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f47e004ea50 con 0x7f47e01013a0 2026-03-10T06:28:59.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.564+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f47d0022ae0 con 0x7f47e01013a0 2026-03-10T06:28:59.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.564+0000 7f47d77fe700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 0x7f47cc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:28:59.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.564+0000 7f47de59c700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 0x7f47cc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:28:59.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.564+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f47d009bd30 con 0x7f47e01013a0 2026-03-10T06:28:59.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.565+0000 7f47de59c700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 0x7f47cc079d70 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f47c8005950 tx=0x7f47c80058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:28:59.568 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.566+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f47d0064530 con 0x7f47e01013a0 2026-03-10T06:28:59.722 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.721+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f47e0066e40 con 0x7f47e01013a0 2026-03-10T06:28:59.723 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.721+0000 7f47d77fe700 1 -- 192.168.123.104:0/1238919299 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v34) v1 ==== 107+0+5264 (secure 0 0 0) 0x7f47d0063c80 con 0x7f47e01013a0 2026-03-10T06:28:59.723 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:28:59.723 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":31,"btime":"2026-03-10T06:27:27:925521+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44249,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3728010036","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3728010036},{"type":"v1","addr":"192.168.123.106:6827","nonce":3728010036}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:26.927320+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34266,"qdb_cluster":[34266]},"id":1}]} 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.724+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 msgr2=0x7f47cc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.724+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 0x7f47cc079d70 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f47c8005950 tx=0x7f47c80058e0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.724+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 msgr2=0x7f47e0198870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.724+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f47d000b5c0 tx=0x7f47d0004970 comp rx=0 tx=0).stop 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 shutdown_connections 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f47cc0778c0 0x7f47cc079d70 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f47e0068490 0x7f47e0198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 --2- 192.168.123.104:0/1238919299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f47e01013a0 0x7f47e0198870 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 >> 192.168.123.104:0/1238919299 conn(0x7f47e00754a0 msgr2=0x7f47e00fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 shutdown_connections 2026-03-10T06:28:59.726 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:28:59.725+0000 7f47e4a87700 1 -- 192.168.123.104:0/1238919299 wait complete. 2026-03-10T06:28:59.727 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 31 2026-03-10T06:28:59.790 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 31 2026-03-10T06:28:59.791 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 32 2026-03-10T06:28:59.939 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:28:59.978 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:59 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/110887747' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T06:28:59.978 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:28:59 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1238919299' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T06:29:00.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:59 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/110887747' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T06:29:00.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:28:59 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1238919299' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T06:29:00.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.196+0000 7f5c33882700 1 -- 192.168.123.104:0/827716347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c108780 msgr2=0x7f5c2c108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.196+0000 7f5c33882700 1 --2- 192.168.123.104:0/827716347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c108780 0x7f5c2c108b50 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f5c1c009b00 tx=0x7f5c1c009e10 comp rx=0 tx=0).stop 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.197+0000 7f5c33882700 1 -- 192.168.123.104:0/827716347 shutdown_connections 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.197+0000 7f5c33882700 1 --2- 192.168.123.104:0/827716347 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c102780 0x7f5c2c102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.199 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.197+0000 7f5c33882700 1 --2- 192.168.123.104:0/827716347 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c108780 0x7f5c2c108b50 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.197+0000 7f5c33882700 1 -- 192.168.123.104:0/827716347 >> 192.168.123.104:0/827716347 conn(0x7f5c2c0fe280 msgr2=0x7f5c2c100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.197+0000 7f5c33882700 1 -- 192.168.123.104:0/827716347 shutdown_connections 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 -- 192.168.123.104:0/827716347 wait complete. 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 Processor -- start 2026-03-10T06:29:00.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 -- start start 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 0x7f5c2c1983d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c2c198ff0 con 0x7f5c2c102780 2026-03-10T06:29:00.200 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.198+0000 7f5c33882700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c2c19cd30 con 0x7f5c2c108780 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39582/0 (socket says 192.168.123.104:39582) 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 -- 192.168.123.104:0/1044853811 learned_addr learned my addr 192.168.123.104:0/1044853811 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c3161e700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 0x7f5c2c1983d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:00.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 -- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 msgr2=0x7f5c2c1983d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 --2- 
192.168.123.104:0/1044853811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 0x7f5c2c1983d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 -- 192.168.123.104:0/1044853811 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5c1c0097e0 con 0x7f5c2c108780 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c3161e700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 0x7f5c2c1983d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c30e1d700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5c2800d8d0 tx=0x7f5c2800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c28009940 con 0x7f5c2c108780 2026-03-10T06:29:00.201 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.199+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5c2c19d010 con 0x7f5c2c108780 2026-03-10T06:29:00.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.200+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 
(secure 0 0 0) 0x7f5c28010460 con 0x7f5c2c108780 2026-03-10T06:29:00.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.200+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c2800f5d0 con 0x7f5c2c108780 2026-03-10T06:29:00.202 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.200+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5c2c19d560 con 0x7f5c2c108780 2026-03-10T06:29:00.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.201+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5c2c10aca0 con 0x7f5c2c108780 2026-03-10T06:29:00.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.201+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5c280105d0 con 0x7f5c2c108780 2026-03-10T06:29:00.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.201+0000 7f5c227fc700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077910 0x7f5c18079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.202+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5c28098f90 con 0x7f5c2c108780 2026-03-10T06:29:00.204 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.202+0000 7f5c3161e700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] 
conn(0x7f5c18077910 0x7f5c18079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:00.204 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.203+0000 7f5c3161e700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077910 0x7f5c18079dc0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5c1c009fd0 tx=0x7f5c1c005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:00.206 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.204+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5c28061040 con 0x7f5c2c108780 2026-03-10T06:29:00.357 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:29:00.357 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":32,"btime":"2026-03-10T06:27:28:928952+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44249,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3728010036","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3728010036},{"type":"v1","addr":"192.168.123.106:6827","nonce":3728010036}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":32,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:27.931710+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34266,"qdb_cluster":[34266]},"id":1}]} 2026-03-10T06:29:00.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.352+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7f5c2c199780 con 0x7f5c2c108780 2026-03-10T06:29:00.358 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.353+0000 7f5c227fc700 1 -- 192.168.123.104:0/1044853811 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v34) v1 ==== 107+0+5263 (secure 0 0 0) 0x7f5c28060e60 con 0x7f5c2c108780 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.358+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 >> 
[v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077910 msgr2=0x7f5c18079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.358+0000 7f5c33882700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077910 0x7f5c18079dc0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5c1c009fd0 tx=0x7f5c1c005c00 comp rx=0 tx=0).stop 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 msgr2=0x7f5c2c198910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5c2800d8d0 tx=0x7f5c2800dc90 comp rx=0 tx=0).stop 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 shutdown_connections 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f5c18077910 0x7f5c18079dc0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c2c102780 0x7f5c2c1983d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T06:29:00.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 --2- 192.168.123.104:0/1044853811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c2c108780 0x7f5c2c198910 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 >> 192.168.123.104:0/1044853811 conn(0x7f5c2c0fe280 msgr2=0x7f5c2c0ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:00.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 shutdown_connections 2026-03-10T06:29:00.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.359+0000 7f5c33882700 1 -- 192.168.123.104:0/1044853811 wait complete. 2026-03-10T06:29:00.362 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 32 2026-03-10T06:29:00.403 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph fs dump --format=json 33 2026-03-10T06:29:00.548 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.805+0000 7f2a536c9700 1 -- 192.168.123.104:0/2593528862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 msgr2=0x7f2a4c10a270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.805+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2593528862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c10a270 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f2a40009b00 tx=0x7f2a40009e10 comp rx=0 tx=0).stop 
2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 -- 192.168.123.104:0/2593528862 shutdown_connections 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2593528862 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c10a270 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2593528862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a4c101460 0x7f2a4c101830 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 -- 192.168.123.104:0/2593528862 >> 192.168.123.104:0/2593528862 conn(0x7f2a4c0faca0 msgr2=0x7f2a4c0fd0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 -- 192.168.123.104:0/2593528862 shutdown_connections 2026-03-10T06:29:00.807 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 -- 192.168.123.104:0/2593528862 wait complete. 
2026-03-10T06:29:00.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.806+0000 7f2a536c9700 1 Processor -- start 2026-03-10T06:29:00.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a536c9700 1 -- start start 2026-03-10T06:29:00.808 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a536c9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a4c101460 0x7f2a4c195ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a536c9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a536c9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a4c196b00 con 0x7f2a4c101d70 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a536c9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a4c19a890 con 0x7f2a4c101460 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:58186/0 (socket says 192.168.123.104:58186) 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 -- 192.168.123.104:0/2419810142 learned_addr learned my addr 192.168.123.104:0/2419810142 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 -- 192.168.123.104:0/2419810142 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a4c101460 msgr2=0x7f2a4c195ee0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a4c101460 0x7f2a4c195ee0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.807+0000 7f2a51ec6700 1 -- 192.168.123.104:0/2419810142 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a400097e0 con 0x7f2a4c101d70 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a51ec6700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f2a40006010 tx=0x7f2a4000bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:00.809 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a3f7fe700 1 -- 192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a4001d070 con 0x7f2a4c101d70 2026-03-10T06:29:00.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a3f7fe700 1 -- 
192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2a4000f460 con 0x7f2a4c101d70 2026-03-10T06:29:00.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a3f7fe700 1 -- 192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a40021620 con 0x7f2a4c101d70 2026-03-10T06:29:00.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a4c19ab10 con 0x7f2a4c101d70 2026-03-10T06:29:00.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.808+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a4c19b000 con 0x7f2a4c101d70 2026-03-10T06:29:00.810 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.809+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a4c1078e0 con 0x7f2a4c101d70 2026-03-10T06:29:00.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.813+0000 7f2a3f7fe700 1 -- 192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2a4000f5d0 con 0x7f2a4c101d70 2026-03-10T06:29:00.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.813+0000 7f2a3f7fe700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 0x7f2a38079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:00.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.813+0000 7f2a3f7fe700 1 -- 
192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f2a4009afa0 con 0x7f2a4c101d70 2026-03-10T06:29:00.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.813+0000 7f2a3f7fe700 1 -- 192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2a4009b430 con 0x7f2a4c101d70 2026-03-10T06:29:00.815 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.814+0000 7f2a526c7700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 0x7f2a38079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:00.816 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.814+0000 7f2a526c7700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 0x7f2a38079d20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f2a44009d30 tx=0x7f2a44009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:00.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:00 vm04.local ceph-mon[115743]: pgmap v189: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:00.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:00 vm04.local ceph-mon[115743]: from='client.? 
192.168.123.104:0/1044853811' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T06:29:00.956 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.955+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 33, "format": "json"} v 0) v1 -- 0x7f2a4c067a10 con 0x7f2a4c101d70 2026-03-10T06:29:00.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.956+0000 7f2a3f7fe700 1 -- 192.168.123.104:0/2419810142 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 33, "format": "json"}]=0 dumped fsmap epoch 33 v34) v1 ==== 107+0+5270 (secure 0 0 0) 0x7f2a400637a0 con 0x7f2a4c101d70 2026-03-10T06:29:00.957 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:29:00.957 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":33,"btime":"2026-03-10T06:27:28:936023+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44245,"name":"cephfs.vm06.wzhqon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/2972586913","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":2972586913},{"type":"v1","addr":"192.168.123.106:6825","nonce":2972586913}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44249,"name":"cephfs.vm06.afscws","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/3728010036","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":3728010036},{"type":"v1","addr":"192.168.123.106:6827","nonce":3728010036}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":33,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T06:19:48.407965+0000","modified":"2026-03-10T06:27:28.936018+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":81,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34266},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34266":{"gid":34266,"name":"cephfs.vm04.hdxbzv","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.104:6827/2103633514","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":2103633514},{"type":"v1","addr":"192.168.123.104:6827","nonce":2103633514}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34286":{"gid":34286,"name":"cephfs.vm04.hsrsig","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":1,"addr":"192.168.123.104:6829/142010479","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6828","nonce":142010479},{"type":"v1","addr":"192.168.123.104:6829","nonce":142010479}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34266,"qdb_cluster":[34266]},"id":1}]} 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.958+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 msgr2=0x7f2a38079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.958+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 0x7f2a38079d20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f2a44009d30 tx=0x7f2a44009450 comp rx=0 tx=0).stop 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 msgr2=0x7f2a4c196420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f2a40006010 tx=0x7f2a4000bab0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 shutdown_connections 2026-03-10T06:29:00.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f2a38077870 0x7f2a38079d20 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a4c101460 0x7f2a4c195ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 --2- 192.168.123.104:0/2419810142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2a4c101d70 0x7f2a4c196420 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:00.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.959+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 >> 192.168.123.104:0/2419810142 conn(0x7f2a4c0faca0 msgr2=0x7f2a4c104a20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:00.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.960+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 shutdown_connections 2026-03-10T06:29:00.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:00.960+0000 7f2a536c9700 1 -- 192.168.123.104:0/2419810142 wait complete. 2026-03-10T06:29:00.962 INFO:teuthology.orchestra.run.vm04.stderr:dumped fsmap epoch 33 2026-03-10T06:29:01.006 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-10T06:29:01.008 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-10T06:29:01.008 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:29:01.008 DEBUG:teuthology.orchestra.run.vm04:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T06:29:01.024 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:29:01.024 DEBUG:teuthology.orchestra.run.vm04:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T06:29:01.080 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd blocklist ls 2026-03-10T06:29:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:00 vm06.local ceph-mon[98962]: pgmap v189: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:01.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:00 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/1044853811' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T06:29:01.268 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.516+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/3745211611 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 msgr2=0x7f0f44068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.516+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/3745211611 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f44068ac0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f0f34009b00 tx=0x7f0f34009e10 comp rx=0 tx=0).stop 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/3745211611 shutdown_connections 2026-03-10T06:29:01.518 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/3745211611 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 0x7f0f441051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/3745211611 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f44068ac0 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/3745211611 >> 192.168.123.104:0/3745211611 conn(0x7f0f440754a0 msgr2=0x7f0f440758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/3745211611 shutdown_connections 2026-03-10T06:29:01.518 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/3745211611 wait complete. 
2026-03-10T06:29:01.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.517+0000 7f0f4b1c2700 1 Processor -- start 2026-03-10T06:29:01.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f4b1c2700 1 -- start start 2026-03-10T06:29:01.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f4b1c2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:01.519 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f4b1c2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 0x7f0f44100510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f4b1c2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f44100a50 con 0x7f0f440686f0 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f4b1c2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f44100b90 con 0x7f0f44069000 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.104:58198/0 (socket says 192.168.123.104:58198) 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 -- 192.168.123.104:0/1994044439 learned_addr learned my addr 192.168.123.104:0/1994044439 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f43fff700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 0x7f0f44100510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 -- 192.168.123.104:0/1994044439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 msgr2=0x7f0f44100510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 0x7f0f44100510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.518+0000 7f0f48f5e700 1 -- 192.168.123.104:0/1994044439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f340097e0 con 0x7f0f440686f0 2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f48f5e700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f0f34000c00 tx=0x7f0f34004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T06:29:01.520 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f3401d070 con 0x7f0f440686f0 2026-03-10T06:29:01.521 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0f3400bc50 con 0x7f0f440686f0 2026-03-10T06:29:01.521 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f3400f740 con 0x7f0f440686f0 2026-03-10T06:29:01.522 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0f44100e10 con 0x7f0f440686f0 2026-03-10T06:29:01.522 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.519+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0f441a26e0 con 0x7f0f440686f0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.521+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0f44108b30 con 0x7f0f440686f0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.521+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0f34022470 con 0x7f0f440686f0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.522+0000 
7f0f41ffb700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 0x7f0f2c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.522+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f0f3409b1a0 con 0x7f0f440686f0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.522+0000 7f0f43fff700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 0x7f0f2c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.522+0000 7f0f43fff700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 0x7f0f2c079d70 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f0f38005950 tx=0x7f0f380058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:01.526 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.525+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0f34063920 con 0x7f0f440686f0 2026-03-10T06:29:01.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.654+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f0f4404ea50 con 0x7f0f440686f0 2026-03-10T06:29:01.659 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.658+0000 7f0f41ffb700 1 -- 192.168.123.104:0/1994044439 <== mon.0 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v81) v1 ==== 81+0+2150 (secure 0 0 0) 0x7f0f34063070 con 0x7f0f440686f0 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6826/3120742985 2026-03-11T06:27:18.714213+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6829/2419696492 2026-03-11T06:27:01.254434+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6826/2274683007 2026-03-11T06:26:47.650245+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6827/2274683007 2026-03-11T06:26:47.650245+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1794352543 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6827/3120742985 2026-03-11T06:27:18.714213+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3268211178 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2105211783 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4014113593 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1930923909 2026-03-11T06:17:13.124403+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6828/1426890327 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/4185520472 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6829/1426890327 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 
INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6801/2 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6828/2419696492 2026-03-11T06:27:01.254434+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4057554910 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3171159121 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/4188376122 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3310183690 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6800/2 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1120149152 2026-03-11T06:17:13.124403+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3849404075 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2035036316 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6801/1695210057 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/140224168 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1133148001 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/63423144 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3059594429 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2832999113 2026-03-11T06:17:13.124403+0000 
2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3321367007 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4184397752 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2564492007 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3016996679 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6800/1695210057 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.660 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1784947156 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 msgr2=0x7f0f2c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 0x7f0f2c079d70 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f0f38005950 tx=0x7f0f380058e0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 msgr2=0x7f0f440fffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 secure :-1 s=READY 
pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f0f34000c00 tx=0x7f0f34004970 comp rx=0 tx=0).stop 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 shutdown_connections 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f0f2c0778c0 0x7f0f2c079d70 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f440686f0 0x7f0f440fffd0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 --2- 192.168.123.104:0/1994044439 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0f44069000 0x7f0f44100510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.660+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 >> 192.168.123.104:0/1994044439 conn(0x7f0f440754a0 msgr2=0x7f0f440ff6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.661+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 shutdown_connections 2026-03-10T06:29:01.662 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:01.661+0000 7f0f4b1c2700 1 -- 192.168.123.104:0/1994044439 wait complete. 
2026-03-10T06:29:01.663 INFO:teuthology.orchestra.run.vm04.stderr:listed 35 entries 2026-03-10T06:29:01.734 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T06:29:01.734 DEBUG:teuthology.orchestra.run.vm04:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T06:29:01.753 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph osd blocklist ls 2026-03-10T06:29:01.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:01 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2419810142' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T06:29:01.928 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:01 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/1994044439' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T06:29:01.959 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config 2026-03-10T06:29:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:01 vm06.local ceph-mon[98962]: from='client.? 192.168.123.104:0/2419810142' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T06:29:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:01 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/1994044439' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.239+0000 7f187f59e700 1 -- 192.168.123.104:0/152352771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f18801047f0 msgr2=0x7f1880068490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.239+0000 7f187f59e700 1 --2- 192.168.123.104:0/152352771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f18801047f0 0x7f1880068490 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f187800b3a0 tx=0x7f187800b6b0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.239+0000 7f187f59e700 1 -- 192.168.123.104:0/152352771 shutdown_connections 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 --2- 192.168.123.104:0/152352771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f18801047f0 0x7f1880068490 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 --2- 192.168.123.104:0/152352771 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f18801042b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.241 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- 192.168.123.104:0/152352771 >> 192.168.123.104:0/152352771 conn(0x7f18800751e0 msgr2=0x7f18800755e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- 192.168.123.104:0/152352771 shutdown_connections 2026-03-10T06:29:02.243 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- 192.168.123.104:0/152352771 wait complete. 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 Processor -- start 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- start start 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1880109f70 0x7f1880071840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f188010a580 con 0x7f1880109f70 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.240+0000 7f187f59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1880071d80 con 0x7f1880103ee0 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.241+0000 7f187e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.241+0000 7f187e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.104:39634/0 (socket says 192.168.123.104:39634) 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.241+0000 7f187e59c700 1 -- 192.168.123.104:0/2903280105 learned_addr learned my addr 192.168.123.104:0/2903280105 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.242+0000 7f187e59c700 1 -- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1880109f70 msgr2=0x7f1880071840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.242+0000 7f187e59c700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1880109f70 0x7f1880071840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.242+0000 7f187e59c700 1 -- 192.168.123.104:0/2903280105 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f187800b050 con 0x7f1880103ee0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.242+0000 7f187e59c700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f187400d8d0 tx=0x7f187400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.243+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1874009940 con 
0x7f1880103ee0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.243+0000 7f187f59e700 1 -- 192.168.123.104:0/2903280105 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1880072060 con 0x7f1880103ee0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.243+0000 7f187f59e700 1 -- 192.168.123.104:0/2903280105 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f18800725b0 con 0x7f1880103ee0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.243+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1874010460 con 0x7f1880103ee0 2026-03-10T06:29:02.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.243+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f187400f5d0 con 0x7f1880103ee0 2026-03-10T06:29:02.246 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.245+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 40) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1874009aa0 con 0x7f1880103ee0 2026-03-10T06:29:02.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.245+0000 7f186f7fe700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 0x7f1868079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T06:29:02.247 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.246+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f18740999c0 con 0x7f1880103ee0 2026-03-10T06:29:02.247 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.246+0000 7f187f59e700 1 -- 192.168.123.104:0/2903280105 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1860005320 con 0x7f1880103ee0 2026-03-10T06:29:02.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.247+0000 7f187dd9b700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 0x7f1868079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T06:29:02.249 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.248+0000 7f187dd9b700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 0x7f1868079dc0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f187800bb30 tx=0x7f187800bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T06:29:02.251 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.249+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1874061a70 con 0x7f1880103ee0 2026-03-10T06:29:02.381 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.379+0000 7f187f59e700 1 -- 192.168.123.104:0/2903280105 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f1860005f70 con 0x7f1880103ee0 2026-03-10T06:29:02.381 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.380+0000 7f186f7fe700 1 -- 192.168.123.104:0/2903280105 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v81) v1 ==== 81+0+2150 (secure 0 0 0) 
0x7f1874020020 con 0x7f1880103ee0 2026-03-10T06:29:02.382 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6826/3120742985 2026-03-11T06:27:18.714213+0000 2026-03-10T06:29:02.382 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6829/2419696492 2026-03-11T06:27:01.254434+0000 2026-03-10T06:29:02.382 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6826/2274683007 2026-03-11T06:26:47.650245+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/4185520472 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6829/1426890327 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6801/2 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6828/2419696492 2026-03-11T06:27:01.254434+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4057554910 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3171159121 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6800/1695210057 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2035036316 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6801/1695210057 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1120149152 2026-03-11T06:17:13.124403+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3849404075 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3016996679 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 
INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/4188376122 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4014113593 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2105211783 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3310183690 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6828/1426890327 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1930923909 2026-03-11T06:17:13.124403+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6800/2 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:6827/2274683007 2026-03-11T06:26:47.650245+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1794352543 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:6827/3120742985 2026-03-11T06:27:18.714213+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3268211178 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/140224168 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1133148001 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/63423144 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.106:0/3059594429 2026-03-11T06:22:06.287857+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2832999113 
2026-03-11T06:17:13.124403+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/1784947156 2026-03-11T06:24:03.938326+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/3321367007 2026-03-11T06:21:48.618768+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/4184397752 2026-03-11T06:18:04.067248+0000 2026-03-10T06:29:02.383 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.104:0/2564492007 2026-03-11T06:16:59.244005+0000 2026-03-10T06:29:02.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 msgr2=0x7f1868079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:02.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 0x7f1868079dc0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f187800bb30 tx=0x7f187800bf90 comp rx=0 tx=0).stop 2026-03-10T06:29:02.386 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 msgr2=0x7f1880109a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f187400d8d0 tx=0x7f187400dc90 comp rx=0 tx=0).stop 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 
shutdown_connections 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.385+0000 7f186d7fa700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:6800/1421430943,v1:192.168.123.104:6801/1421430943] conn(0x7f1868077910 0x7f1868079dc0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.386+0000 7f186d7fa700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1880103ee0 0x7f1880109a30 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.386+0000 7f186d7fa700 1 --2- 192.168.123.104:0/2903280105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1880109f70 0x7f1880071840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.386+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 >> 192.168.123.104:0/2903280105 conn(0x7f18800751e0 msgr2=0x7f18800fd610 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.386+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 shutdown_connections 2026-03-10T06:29:02.387 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T06:29:02.386+0000 7f186d7fa700 1 -- 192.168.123.104:0/2903280105 wait complete. 2026-03-10T06:29:02.388 INFO:teuthology.orchestra.run.vm04.stderr:listed 35 entries 2026-03-10T06:29:02.459 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm04.local... 
2026-03-10T06:29:02.459 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T06:29:02.459 DEBUG:teuthology.orchestra.run.vm04:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-10T06:29:02.490 INFO:teuthology.orchestra.run:waiting for 300 2026-03-10T06:29:02.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:02 vm04.local ceph-mon[115743]: pgmap v190: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:02.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:02 vm04.local ceph-mon[115743]: from='client.? 192.168.123.104:0/2903280105' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T06:29:03.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:02 vm06.local ceph-mon[98962]: pgmap v190: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:03.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:02 vm06.local ceph-mon[98962]: from='client.? 
192.168.123.104:0/2903280105' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T06:29:04.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:04.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:05.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:05 vm06.local ceph-mon[98962]: pgmap v191: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:05.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:05 vm04.local ceph-mon[115743]: pgmap v191: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:07.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:07 vm06.local ceph-mon[98962]: pgmap v192: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:07.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:07 vm04.local ceph-mon[115743]: pgmap v192: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:09.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:09 vm06.local ceph-mon[98962]: pgmap v193: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:09 vm04.local ceph-mon[115743]: pgmap v193: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:11.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:11 
vm04.local ceph-mon[115743]: pgmap v194: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:11 vm06.local ceph-mon[98962]: pgmap v194: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:13.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:13 vm04.local ceph-mon[115743]: pgmap v195: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:13 vm06.local ceph-mon[98962]: pgmap v195: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:15.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:15 vm06.local ceph-mon[98962]: pgmap v196: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:15 vm04.local ceph-mon[115743]: pgmap v196: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:17 vm06.local ceph-mon[98962]: pgmap v197: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:17.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:17 vm04.local ceph-mon[115743]: pgmap v197: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:19 vm06.local ceph-mon[98962]: pgmap v198: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:19 vm06.local 
ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:19 vm04.local ceph-mon[115743]: pgmap v198: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:21.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:21 vm06.local ceph-mon[98962]: pgmap v199: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:21 vm04.local ceph-mon[115743]: pgmap v199: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:23 vm06.local ceph-mon[98962]: pgmap v200: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:23 vm04.local ceph-mon[115743]: pgmap v200: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:25 vm06.local ceph-mon[98962]: pgmap v201: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:25 vm04.local ceph-mon[115743]: pgmap v201: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:27.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:27 vm06.local ceph-mon[98962]: pgmap v202: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:27 vm04.local ceph-mon[115743]: pgmap v202: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:29 vm06.local ceph-mon[98962]: pgmap v203: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:29.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:29 vm04.local ceph-mon[115743]: pgmap v203: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:31 vm06.local ceph-mon[98962]: pgmap v204: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:31.638 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:31 vm04.local ceph-mon[115743]: pgmap v204: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:33 vm06.local ceph-mon[98962]: pgmap v205: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:33.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:33 vm04.local ceph-mon[115743]: pgmap v205: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:34.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T06:29:34.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:35 vm06.local ceph-mon[98962]: pgmap v206: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:35 vm04.local ceph-mon[115743]: pgmap v206: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:37 vm06.local ceph-mon[98962]: pgmap v207: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:37 vm04.local ceph-mon[115743]: pgmap v207: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:39 vm06.local ceph-mon[98962]: pgmap v208: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:29:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:29:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:29:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:29:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:39 vm04.local ceph-mon[115743]: pgmap v208: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:29:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:29:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:29:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:29:41.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:41 vm06.local ceph-mon[98962]: pgmap v209: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:41.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:41 vm04.local ceph-mon[115743]: pgmap v209: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:43.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:43 vm04.local ceph-mon[115743]: pgmap v210: 65 pgs: 65 active+clean; 253 MiB 
data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:43 vm06.local ceph-mon[98962]: pgmap v210: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:46.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:45 vm06.local ceph-mon[98962]: pgmap v211: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:46.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:45 vm04.local ceph-mon[115743]: pgmap v211: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:47.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:46 vm04.local ceph-mon[115743]: pgmap v212: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:47.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:46 vm06.local ceph-mon[98962]: pgmap v212: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:49.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:49 vm06.local ceph-mon[98962]: pgmap v213: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:49.367 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:49.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:49 vm04.local ceph-mon[115743]: pgmap v213: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:49.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:49 vm04.local ceph-mon[115743]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:29:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:51 vm06.local ceph-mon[98962]: pgmap v214: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:51 vm04.local ceph-mon[115743]: pgmap v214: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:29:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:53 vm06.local ceph-mon[98962]: pgmap v215: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:53.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:53 vm04.local ceph-mon[115743]: pgmap v215: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:55 vm04.local ceph-mon[115743]: pgmap v216: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:56.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:55 vm06.local ceph-mon[98962]: pgmap v216: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:29:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:57 vm04.local ceph-mon[115743]: pgmap v217: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:58.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:57 vm06.local ceph-mon[98962]: pgmap v217: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:29:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:29:59 vm04.local ceph-mon[115743]: 
pgmap v218: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:00.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:29:59 vm06.local ceph-mon[98962]: pgmap v218: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:00.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:00 vm04.local ceph-mon[115743]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:30:01.018 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:00 vm06.local ceph-mon[98962]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T06:30:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:01 vm04.local ceph-mon[115743]: pgmap v219: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:02.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:01 vm06.local ceph-mon[98962]: pgmap v219: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:03.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:03 vm04.local ceph-mon[115743]: pgmap v220: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:04.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:03 vm06.local ceph-mon[98962]: pgmap v220: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:05.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:05.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:05 vm04.local ceph-mon[115743]: pgmap v221: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:06.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:05 vm06.local ceph-mon[98962]: pgmap v221: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:07.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:07 vm04.local ceph-mon[115743]: pgmap v222: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:08.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:07 vm06.local ceph-mon[98962]: pgmap v222: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:09.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:09 vm04.local ceph-mon[115743]: pgmap v223: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:10.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:09 vm06.local ceph-mon[98962]: pgmap v223: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:12.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:11 vm06.local ceph-mon[98962]: pgmap v224: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:12.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:11 vm04.local ceph-mon[115743]: pgmap v224: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:14.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:13 vm06.local ceph-mon[98962]: pgmap v225: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 
119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:14.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:13 vm04.local ceph-mon[115743]: pgmap v225: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:16.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:15 vm06.local ceph-mon[98962]: pgmap v226: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:16.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:15 vm04.local ceph-mon[115743]: pgmap v226: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:17.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:16 vm06.local ceph-mon[98962]: pgmap v227: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:17.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:16 vm04.local ceph-mon[115743]: pgmap v227: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:19 vm06.local ceph-mon[98962]: pgmap v228: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:19 vm04.local ceph-mon[115743]: pgmap v228: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' 
entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:21.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:21 vm06.local ceph-mon[98962]: pgmap v229: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:21 vm04.local ceph-mon[115743]: pgmap v229: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:23.512 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:23 vm04.local ceph-mon[115743]: pgmap v230: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:23 vm06.local ceph-mon[98962]: pgmap v230: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:25.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:25 vm06.local ceph-mon[98962]: pgmap v231: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:25 vm04.local ceph-mon[115743]: pgmap v231: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:27 vm06.local ceph-mon[98962]: pgmap v232: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:27 vm04.local ceph-mon[115743]: pgmap v232: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:29.597 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:29 vm06.local ceph-mon[98962]: pgmap v233: 65 pgs: 65 
active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:29.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:29 vm04.local ceph-mon[115743]: pgmap v233: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:31 vm06.local ceph-mon[98962]: pgmap v234: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:31.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:31 vm04.local ceph-mon[115743]: pgmap v234: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:33 vm06.local ceph-mon[98962]: pgmap v235: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:33.620 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:33 vm04.local ceph-mon[115743]: pgmap v235: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:34.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:35 vm06.local ceph-mon[98962]: pgmap v236: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:35 vm04.local 
ceph-mon[115743]: pgmap v236: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:37 vm06.local ceph-mon[98962]: pgmap v237: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:37 vm04.local ceph-mon[115743]: pgmap v237: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:39 vm06.local ceph-mon[98962]: pgmap v238: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:39 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:30:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:39 vm04.local ceph-mon[115743]: pgmap v238: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:30:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:30:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T06:30:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:30:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:30:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:30:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:30:41.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:41 vm06.local ceph-mon[98962]: pgmap v239: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:41.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:41 vm04.local ceph-mon[115743]: pgmap v239: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:43 vm06.local ceph-mon[98962]: pgmap v240: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:43.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:43 vm04.local ceph-mon[115743]: pgmap v240: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:45 vm06.local ceph-mon[98962]: pgmap v241: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:45.927 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:45 vm04.local ceph-mon[115743]: pgmap v241: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:47 vm06.local ceph-mon[98962]: pgmap v242: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:47 vm04.local ceph-mon[115743]: pgmap v242: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:49 vm06.local ceph-mon[98962]: pgmap v243: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:49 vm04.local ceph-mon[115743]: pgmap v243: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:30:51.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:51 vm06.local ceph-mon[98962]: pgmap v244: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:30:51.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:51 vm04.local ceph-mon[115743]: pgmap v244: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s 
rd, 2 op/s 2026-03-10T06:30:53.836 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:53 vm04.local ceph-mon[115743]: pgmap v245: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:53.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:53 vm06.local ceph-mon[98962]: pgmap v245: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:55 vm06.local ceph-mon[98962]: pgmap v246: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:55 vm04.local ceph-mon[115743]: pgmap v246: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:57.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:57 vm06.local ceph-mon[98962]: pgmap v247: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:57 vm04.local ceph-mon[115743]: pgmap v247: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:30:59.859 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:30:59 vm06.local ceph-mon[98962]: pgmap v248: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:30:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:30:59 vm04.local ceph-mon[115743]: pgmap v248: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:01.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:01 vm06.local ceph-mon[98962]: pgmap v249: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 
2026-03-10T06:31:01.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:01 vm04.local ceph-mon[115743]: pgmap v249: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:03.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:03 vm06.local ceph-mon[98962]: pgmap v250: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:03.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:03 vm04.local ceph-mon[115743]: pgmap v250: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:04.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:04.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:05.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:05 vm06.local ceph-mon[98962]: pgmap v251: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:05.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:05 vm04.local ceph-mon[115743]: pgmap v251: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:07.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:07 vm04.local ceph-mon[115743]: pgmap v252: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:08.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:07 vm06.local ceph-mon[98962]: pgmap v252: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB 
/ 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:09.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:09 vm04.local ceph-mon[115743]: pgmap v253: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:09.941 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:09 vm06.local ceph-mon[98962]: pgmap v253: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:11.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:11 vm04.local ceph-mon[115743]: pgmap v254: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:12.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:11 vm06.local ceph-mon[98962]: pgmap v254: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:13.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:13 vm04.local ceph-mon[115743]: pgmap v255: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:14.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:13 vm06.local ceph-mon[98962]: pgmap v255: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:15.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:15 vm04.local ceph-mon[115743]: pgmap v256: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:16.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:15 vm06.local ceph-mon[98962]: pgmap v256: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:17.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:17 vm04.local ceph-mon[115743]: pgmap v257: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 
1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:18.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:17 vm06.local ceph-mon[98962]: pgmap v257: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:19 vm04.local ceph-mon[115743]: pgmap v258: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:19.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:20.028 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:19 vm06.local ceph-mon[98962]: pgmap v258: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:20.028 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:21.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:21 vm04.local ceph-mon[115743]: pgmap v259: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:22.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:21 vm06.local ceph-mon[98962]: pgmap v259: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:23.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:23 vm04.local ceph-mon[115743]: pgmap v260: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:24.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:23 vm06.local ceph-mon[98962]: pgmap v260: 65 pgs: 65 active+clean; 253 MiB data, 1.0 
GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:25.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:25 vm04.local ceph-mon[115743]: pgmap v261: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:26.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:25 vm06.local ceph-mon[98962]: pgmap v261: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:27.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:27 vm04.local ceph-mon[115743]: pgmap v262: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:28.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:27 vm06.local ceph-mon[98962]: pgmap v262: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:29.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:29 vm04.local ceph-mon[115743]: pgmap v263: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:30.116 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:29 vm06.local ceph-mon[98962]: pgmap v263: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:32.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:31 vm06.local ceph-mon[98962]: pgmap v264: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:32.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:31 vm04.local ceph-mon[115743]: pgmap v264: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:34.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:33 vm06.local ceph-mon[98962]: pgmap v265: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB 
/ 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:34.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:33 vm04.local ceph-mon[115743]: pgmap v265: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:35.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:35.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:36.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:35 vm06.local ceph-mon[98962]: pgmap v266: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:36.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:35 vm04.local ceph-mon[115743]: pgmap v266: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:38.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:37 vm06.local ceph-mon[98962]: pgmap v267: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:38.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:37 vm04.local ceph-mon[115743]: pgmap v267: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:39 vm06.local ceph-mon[98962]: pgmap v268: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:40.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:39 vm06.local ceph-mon[98962]: from='mgr.34104 
192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:31:40.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:39 vm04.local ceph-mon[115743]: pgmap v268: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:40.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:39 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:31:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:31:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:31:41.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:31:41.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:31:41.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:31:41.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:31:42.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:41 
vm06.local ceph-mon[98962]: pgmap v269: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:42.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:41 vm04.local ceph-mon[115743]: pgmap v269: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:44.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:43 vm06.local ceph-mon[98962]: pgmap v270: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:44.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:43 vm04.local ceph-mon[115743]: pgmap v270: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:46.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:45 vm06.local ceph-mon[98962]: pgmap v271: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:46.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:45 vm04.local ceph-mon[115743]: pgmap v271: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:48.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:47 vm06.local ceph-mon[98962]: pgmap v272: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:48.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:47 vm04.local ceph-mon[115743]: pgmap v272: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:49.831 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:49 vm04.local ceph-mon[115743]: pgmap v273: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:49.831 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:49 vm04.local 
ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:49 vm06.local ceph-mon[98962]: pgmap v273: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:50.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:31:52.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:51 vm06.local ceph-mon[98962]: pgmap v274: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:52.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:51 vm04.local ceph-mon[115743]: pgmap v274: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:31:54.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:53 vm06.local ceph-mon[98962]: pgmap v275: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:54.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:53 vm04.local ceph-mon[115743]: pgmap v275: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:55.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:54 vm06.local ceph-mon[98962]: pgmap v276: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:55.177 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:54 vm04.local ceph-mon[115743]: pgmap v276: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:57.367 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:57 vm06.local ceph-mon[98962]: pgmap v277: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:57.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:57 vm04.local ceph-mon[115743]: pgmap v277: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:31:59.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:31:59 vm04.local ceph-mon[115743]: pgmap v278: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:31:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:31:59 vm06.local ceph-mon[98962]: pgmap v278: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:01.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:01 vm04.local ceph-mon[115743]: pgmap v279: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:01.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:01 vm06.local ceph-mon[98962]: pgmap v279: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:03.419 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:03 vm06.local ceph-mon[98962]: pgmap v280: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:03.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:03 vm04.local ceph-mon[115743]: pgmap v280: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:04.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T06:32:04.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:05.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:05 vm06.local ceph-mon[98962]: pgmap v281: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:05.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:05 vm04.local ceph-mon[115743]: pgmap v281: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:07.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:07 vm04.local ceph-mon[115743]: pgmap v282: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:07.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:07 vm06.local ceph-mon[98962]: pgmap v282: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:09.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:09 vm04.local ceph-mon[115743]: pgmap v283: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:09.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:09 vm06.local ceph-mon[98962]: pgmap v283: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:11.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:11 vm04.local ceph-mon[115743]: pgmap v284: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:11.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:11 vm06.local ceph-mon[98962]: pgmap v284: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 
2 op/s 2026-03-10T06:32:13.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:13 vm04.local ceph-mon[115743]: pgmap v285: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:13.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:13 vm06.local ceph-mon[98962]: pgmap v285: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:15.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:15 vm04.local ceph-mon[115743]: pgmap v286: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:15.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:15 vm06.local ceph-mon[98962]: pgmap v286: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:17.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:17 vm04.local ceph-mon[115743]: pgmap v287: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:17.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:17 vm06.local ceph-mon[98962]: pgmap v287: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:19.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:19 vm04.local ceph-mon[115743]: pgmap v288: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:19.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:19 vm06.local ceph-mon[98962]: pgmap v288: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 
853 B/s rd, 1 op/s 2026-03-10T06:32:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:21.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:21 vm06.local ceph-mon[98962]: pgmap v289: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:21 vm04.local ceph-mon[115743]: pgmap v289: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:23.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:23 vm06.local ceph-mon[98962]: pgmap v290: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:23 vm04.local ceph-mon[115743]: pgmap v290: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:25.598 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:25 vm06.local ceph-mon[98962]: pgmap v291: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:25.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:25 vm04.local ceph-mon[115743]: pgmap v291: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:27.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:27 vm06.local ceph-mon[98962]: pgmap v292: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:27.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:27 vm04.local ceph-mon[115743]: pgmap v292: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 
GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:29.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:29 vm06.local ceph-mon[98962]: pgmap v293: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:29.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:29 vm04.local ceph-mon[115743]: pgmap v293: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:31.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:31 vm06.local ceph-mon[98962]: pgmap v294: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:31.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:31 vm04.local ceph-mon[115743]: pgmap v294: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:33.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:33 vm06.local ceph-mon[98962]: pgmap v295: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:33.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:33 vm04.local ceph-mon[115743]: pgmap v295: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:34.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:34.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:35 vm06.local ceph-mon[98962]: pgmap v296: 65 pgs: 65 active+clean; 253 
MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:35.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:35 vm04.local ceph-mon[115743]: pgmap v296: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:37.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:37 vm06.local ceph-mon[98962]: pgmap v297: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:37.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:37 vm04.local ceph-mon[115743]: pgmap v297: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:39.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:39 vm06.local ceph-mon[98962]: pgmap v298: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:39.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:39 vm04.local ceph-mon[115743]: pgmap v298: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:32:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:32:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:32:40.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:40 vm06.local 
ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:32:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:32:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:32:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:32:40.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:32:41.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:41 vm06.local ceph-mon[98962]: pgmap v299: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:41.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:41 vm04.local ceph-mon[115743]: pgmap v299: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:43.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:43 vm06.local ceph-mon[98962]: pgmap v300: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:43.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:43 vm04.local ceph-mon[115743]: pgmap v300: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:45.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:45 vm04.local ceph-mon[115743]: pgmap 
v301: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:45.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:45 vm06.local ceph-mon[98962]: pgmap v301: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:47.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:47 vm06.local ceph-mon[98962]: pgmap v302: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:47.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:47 vm04.local ceph-mon[115743]: pgmap v302: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:49.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:49 vm06.local ceph-mon[98962]: pgmap v303: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:49.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:49 vm04.local ceph-mon[115743]: pgmap v303: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:49.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:32:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:51 vm06.local ceph-mon[98962]: pgmap v304: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:51.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:51 
vm04.local ceph-mon[115743]: pgmap v304: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:32:53.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:53 vm06.local ceph-mon[98962]: pgmap v305: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:53.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:53 vm04.local ceph-mon[115743]: pgmap v305: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:55.427 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:55 vm04.local ceph-mon[115743]: pgmap v306: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:55.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:55 vm06.local ceph-mon[98962]: pgmap v306: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:57.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:57 vm06.local ceph-mon[98962]: pgmap v307: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:57.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:57 vm04.local ceph-mon[115743]: pgmap v307: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:32:59.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:32:59 vm06.local ceph-mon[98962]: pgmap v308: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:32:59.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:32:59 vm04.local ceph-mon[115743]: pgmap v308: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:01.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:01 vm04.local 
ceph-mon[115743]: pgmap v309: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:01.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:01 vm06.local ceph-mon[98962]: pgmap v309: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:03.669 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:03 vm06.local ceph-mon[98962]: pgmap v310: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:03.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:03 vm04.local ceph-mon[115743]: pgmap v310: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:04.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:04 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:04.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:04 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:05.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:05 vm04.local ceph-mon[115743]: pgmap v311: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:05.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:05 vm06.local ceph-mon[98962]: pgmap v311: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:07.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:07 vm04.local ceph-mon[115743]: pgmap v312: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:07.867 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:07 vm06.local ceph-mon[98962]: pgmap v312: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:09.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:09 vm04.local ceph-mon[115743]: pgmap v313: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:09.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:09 vm06.local ceph-mon[98962]: pgmap v313: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:11.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:11 vm04.local ceph-mon[115743]: pgmap v314: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:11.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:11 vm06.local ceph-mon[98962]: pgmap v314: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:13.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:13 vm04.local ceph-mon[115743]: pgmap v315: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:13.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:13 vm06.local ceph-mon[98962]: pgmap v315: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:15.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:15 vm04.local ceph-mon[115743]: pgmap v316: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:15.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:15 vm06.local ceph-mon[98962]: pgmap v316: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:17.677 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:17 vm04.local ceph-mon[115743]: pgmap v317: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:17.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:17 vm06.local ceph-mon[98962]: pgmap v317: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:19 vm04.local ceph-mon[115743]: pgmap v318: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:19.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:19 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:19 vm06.local ceph-mon[98962]: pgmap v318: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:19.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:19 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:21.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:21 vm04.local ceph-mon[115743]: pgmap v319: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:21.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:21 vm06.local ceph-mon[98962]: pgmap v319: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:23.677 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:23 vm04.local ceph-mon[115743]: pgmap v320: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 
KiB/s rd, 2 op/s 2026-03-10T06:33:23.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:23 vm06.local ceph-mon[98962]: pgmap v320: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:25.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:25 vm06.local ceph-mon[98962]: pgmap v321: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:25.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:25 vm04.local ceph-mon[115743]: pgmap v321: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:27.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:27 vm06.local ceph-mon[98962]: pgmap v322: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:27.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:27 vm04.local ceph-mon[115743]: pgmap v322: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:29.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:29 vm06.local ceph-mon[98962]: pgmap v323: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:29.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:29 vm04.local ceph-mon[115743]: pgmap v323: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:31.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:31 vm06.local ceph-mon[98962]: pgmap v324: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:31.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:31 vm04.local ceph-mon[115743]: pgmap v324: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 
2026-03-10T06:33:33.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:33 vm06.local ceph-mon[98962]: pgmap v325: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:33.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:33 vm04.local ceph-mon[115743]: pgmap v325: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:34.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:34 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:34.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:34 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:35.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:35 vm06.local ceph-mon[98962]: pgmap v326: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:35.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:35 vm04.local ceph-mon[115743]: pgmap v326: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:37.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:37 vm06.local ceph-mon[98962]: pgmap v327: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:37.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:37 vm04.local ceph-mon[115743]: pgmap v327: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:39.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:39 vm06.local ceph-mon[98962]: pgmap v328: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 
GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:39.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:39 vm04.local ceph-mon[115743]: pgmap v328: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:33:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:33:40.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:40 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:33:40.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T06:33:40.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T06:33:40.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:40 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T06:33:41.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:41 vm06.local ceph-mon[98962]: pgmap v329: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:41.617 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:41 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:33:41.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:41 vm04.local ceph-mon[115743]: pgmap v329: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:41.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:41 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' 2026-03-10T06:33:43.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:43 vm06.local ceph-mon[98962]: pgmap v330: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:43.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:43 vm04.local ceph-mon[115743]: pgmap v330: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:45.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:45 vm06.local ceph-mon[98962]: pgmap v331: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:45.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:45 vm04.local ceph-mon[115743]: pgmap v331: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:47.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:47 vm06.local ceph-mon[98962]: pgmap v332: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:47.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:47 vm04.local ceph-mon[115743]: pgmap v332: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:49 vm06.local ceph-mon[98962]: pgmap 
v333: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:49.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:49 vm06.local ceph-mon[98962]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:49 vm04.local ceph-mon[115743]: pgmap v333: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:49.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:49 vm04.local ceph-mon[115743]: from='mgr.34104 192.168.123.104:0/3325550887' entity='mgr.vm04.exdvdb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T06:33:51.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:51 vm06.local ceph-mon[98962]: pgmap v334: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:51.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:51 vm04.local ceph-mon[115743]: pgmap v334: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s 2026-03-10T06:33:53.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:53 vm06.local ceph-mon[98962]: pgmap v335: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:53.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:53 vm04.local ceph-mon[115743]: pgmap v335: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-10T06:33:55.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:55 vm06.local ceph-mon[98962]: pgmap v336: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-10T06:33:55.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:55 
vm04.local ceph-mon[115743]: pgmap v336: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-10T06:33:57.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:57 vm06.local ceph-mon[98962]: pgmap v337: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-10T06:33:57.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:57 vm04.local ceph-mon[115743]: pgmap v337: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-10T06:33:59.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:33:59 vm06.local ceph-mon[98962]: pgmap v338: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-10T06:33:59.927 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:33:59 vm04.local ceph-mon[115743]: pgmap v338: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-10T06:34:01.539 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-10T06:34:01.539 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T06:34:01.540 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-10T06:34:01.542 INFO:tasks.cephadm:Teardown begin
2026-03-10T06:34:01.542 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T06:34:01.543 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T06:34:01.568 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T06:34:01.593 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-10T06:34:01.593 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 9c59102a-1c48-11f1-b618-035af535377d -- ceph mgr module disable cephadm
2026-03-10T06:34:01.739 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/mon.vm04/config
2026-03-10T06:34:01.789 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:34:01 vm04.local ceph-mon[115743]: pgmap v339: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s
2026-03-10T06:34:01.867 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 10 06:34:01 vm06.local ceph-mon[98962]: pgmap v339: 65 pgs: 65 active+clean; 253 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 2 op/s
2026-03-10T06:34:01.889 INFO:teuthology.orchestra.run.vm04.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-10T06:34:01.905 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-10T06:34:01.905 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-10T06:34:01.906 DEBUG:teuthology.orchestra.run.vm04:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T06:34:01.921 DEBUG:teuthology.orchestra.run.vm06:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T06:34:01.938 INFO:tasks.cephadm:Stopping all daemons...
2026-03-10T06:34:01.938 INFO:tasks.cephadm.mon.vm04:Stopping mon.vm04...
2026-03-10T06:34:01.938 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04
2026-03-10T06:34:02.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 06:34:01 vm04.local systemd[1]: Stopping Ceph mon.vm04 for 9c59102a-1c48-11f1-b618-035af535377d...
2026-03-10T06:34:02.278 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm04.service'
2026-03-10T06:34:02.312 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T06:34:02.312 INFO:tasks.cephadm.mon.vm04:Stopped mon.vm04
2026-03-10T06:34:02.313 INFO:tasks.cephadm.mon.vm06:Stopping mon.vm06...
2026-03-10T06:34:02.313 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm06
2026-03-10T06:34:02.562 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@mon.vm06.service'
2026-03-10T06:34:02.593 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T06:34:02.593 INFO:tasks.cephadm.mon.vm06:Stopped mon.vm06
2026-03-10T06:34:02.593 INFO:tasks.cephadm.osd.0:Stopping osd.0...
2026-03-10T06:34:02.593 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0
2026-03-10T06:34:02.927 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:02 vm04.local systemd[1]: Stopping Ceph osd.0 for 9c59102a-1c48-11f1-b618-035af535377d...
2026-03-10T06:34:02.927 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:02 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:34:02.706+0000 7f669d840640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:02.927 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:02 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:34:02.706+0000 7f669d840640 -1 osd.0 81 *** Got signal Terminated *** 2026-03-10T06:34:02.927 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:02 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0[124612]: 2026-03-10T06:34:02.706+0000 7f669d840640 -1 osd.0 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:08.021 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local podman[155194]: 2026-03-10 06:34:07.745136303 +0000 UTC m=+5.051564886 container died df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:34:08.021 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 
06:34:07 vm04.local podman[155194]: 2026-03-10 06:34:07.778075719 +0000 UTC m=+5.084504313 container remove df697b82ad516dca23fb5aa0e7fc927d02c5efe12395fe6aaec831badbd6c328 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T06:34:08.021 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local bash[155194]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0 2026-03-10T06:34:08.021 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local podman[155263]: 2026-03-10 06:34:07.929141292 +0000 UTC m=+0.017038373 container create 9b8772276dc0f272e3c2a631b14327e9db04bd1a9b2f31dcbe9638dae9001de6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:34:08.022 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local podman[155263]: 2026-03-10 06:34:07.965473439 +0000 UTC m=+0.053370530 container init 9b8772276dc0f272e3c2a631b14327e9db04bd1a9b2f31dcbe9638dae9001de6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T06:34:08.022 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local podman[155263]: 2026-03-10 06:34:07.968183892 +0000 UTC m=+0.056080983 container start 9b8772276dc0f272e3c2a631b14327e9db04bd1a9b2f31dcbe9638dae9001de6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS 
Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:08.022 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:07 vm04.local podman[155263]: 2026-03-10 06:34:07.97403269 +0000 UTC m=+0.061929791 container attach 9b8772276dc0f272e3c2a631b14327e9db04bd1a9b2f31dcbe9638dae9001de6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-0-deactivate, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default) 2026-03-10T06:34:08.022 INFO:journalctl@ceph.osd.0.vm04.stdout:Mar 10 06:34:08 vm04.local podman[155263]: 2026-03-10 06:34:07.922457661 +0000 UTC m=+0.010354753 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:34:08.127 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.0.service' 2026-03-10T06:34:08.161 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:08.161 INFO:tasks.cephadm.osd.0:Stopped osd.0 
2026-03-10T06:34:08.161 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-10T06:34:08.161 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.1 2026-03-10T06:34:08.304 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:08 vm04.local systemd[1]: Stopping Ceph osd.1 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:34:08.677 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:08 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:34:08.303+0000 7f09defee640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:08.677 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:08 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:34:08.303+0000 7f09defee640 -1 osd.1 81 *** Got signal Terminated *** 2026-03-10T06:34:08.677 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:08 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1[129553]: 2026-03-10T06:34:08.303+0000 7f09defee640 -1 osd.1 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155361]: 2026-03-10 06:34:13.345501169 +0000 UTC m=+5.055669018 container died 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, 
org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155361]: 2026-03-10 06:34:13.368979867 +0000 UTC m=+5.079147726 container remove 6bc3525fe6f5f0aca8ab3fea01e7046a18c0106de3e9de8bfdefd0ddcdeed9ea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local bash[155361]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155431]: 2026-03-10 06:34:13.490366487 +0000 UTC m=+0.016428992 container create f7b34570ce4cf3405a734a70515b6dc8cfd09143a0d678754af6b4b5d904b464 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.label-schema.schema-version=1.0, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155431]: 2026-03-10 06:34:13.53173421 +0000 UTC m=+0.057796733 container init f7b34570ce4cf3405a734a70515b6dc8cfd09143a0d678754af6b4b5d904b464 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155431]: 2026-03-10 06:34:13.534102343 +0000 UTC m=+0.060164857 container start f7b34570ce4cf3405a734a70515b6dc8cfd09143a0d678754af6b4b5d904b464 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155431]: 2026-03-10 06:34:13.538677837 +0000 UTC m=+0.064740351 container attach f7b34570ce4cf3405a734a70515b6dc8cfd09143a0d678754af6b4b5d904b464 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-1-deactivate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-10T06:34:13.655 INFO:journalctl@ceph.osd.1.vm04.stdout:Mar 10 06:34:13 vm04.local podman[155431]: 2026-03-10 06:34:13.484038073 +0000 UTC m=+0.010100597 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:34:13.682 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.1.service' 2026-03-10T06:34:13.714 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:13.714 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-10T06:34:13.714 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-10T06:34:13.714 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.2 2026-03-10T06:34:13.927 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:13 vm04.local systemd[1]: Stopping Ceph osd.2 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:34:13.927 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:13 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:34:13.845+0000 7f8c4ccae640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:13.927 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:13 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:34:13.845+0000 7f8c4ccae640 -1 osd.2 81 *** Got signal Terminated *** 2026-03-10T06:34:13.927 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:13 vm04.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2[134170]: 2026-03-10T06:34:13.845+0000 7f8c4ccae640 -1 osd.2 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:19.170 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:18 vm04.local podman[155527]: 2026-03-10 06:34:18.873199409 +0000 UTC m=+5.040356717 container died 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:34:19.170 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:18 vm04.local podman[155527]: 2026-03-10 06:34:18.895156239 +0000 UTC m=+5.062313536 container remove 38220ba83a3f79daa7972be3e79b338c74eb1d3f5566908b7ffce6c2f23de6b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:18 vm04.local bash[155527]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2 
2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.017207954 +0000 UTC m=+0.015167252 container create eafac3a87bf1dbcb240dfc8bc88b2817dfbb679948f1e74f36e47351f4b4e140 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0) 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.053934419 +0000 UTC m=+0.051893736 container init eafac3a87bf1dbcb240dfc8bc88b2817dfbb679948f1e74f36e47351f4b4e140 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.056535708 +0000 UTC m=+0.054495006 container start eafac3a87bf1dbcb240dfc8bc88b2817dfbb679948f1e74f36e47351f4b4e140 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.059739645 +0000 UTC m=+0.057698962 container attach eafac3a87bf1dbcb240dfc8bc88b2817dfbb679948f1e74f36e47351f4b4e140 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.010942235 +0000 UTC m=+0.008901543 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:34:19.171 INFO:journalctl@ceph.osd.2.vm04.stdout:Mar 10 06:34:19 vm04.local podman[155594]: 2026-03-10 06:34:19.169794946 +0000 UTC m=+0.167754254 container died eafac3a87bf1dbcb240dfc8bc88b2817dfbb679948f1e74f36e47351f4b4e140 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:19.197 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.2.service' 2026-03-10T06:34:19.228 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:19.228 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-10T06:34:19.228 INFO:tasks.cephadm.osd.3:Stopping osd.3... 
2026-03-10T06:34:19.228 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3 2026-03-10T06:34:19.617 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:19 vm06.local systemd[1]: Stopping Ceph osd.3 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:34:19.617 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:19 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:34:19.341+0000 7f2f6d526640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:19.617 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:19 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:34:19.341+0000 7f2f6d526640 -1 osd.3 81 *** Got signal Terminated *** 2026-03-10T06:34:19.617 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:19 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3[106757]: 2026-03-10T06:34:19.341+0000 7f2f6d526640 -1 osd.3 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129225]: 2026-03-10 06:34:24.37499679 +0000 UTC m=+5.047972229 container died e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS 
Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129225]: 2026-03-10 06:34:24.394942585 +0000 UTC m=+5.067918014 container remove e91f44e1f6609617ce38c238503f445f67f24335365a3cd6205490cf8ea51f53 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local bash[129225]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129292]: 2026-03-10 06:34:24.518696836 +0000 UTC m=+0.014525350 container create 30fdfdfe1820cd53f6decf41a930bd7c1994681b0f4a386a4e213b4a3b7aa029 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129292]: 2026-03-10 06:34:24.557782527 +0000 UTC m=+0.053611050 container init 30fdfdfe1820cd53f6decf41a930bd7c1994681b0f4a386a4e213b4a3b7aa029 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129292]: 2026-03-10 06:34:24.560618094 +0000 UTC m=+0.056446598 container start 30fdfdfe1820cd53f6decf41a930bd7c1994681b0f4a386a4e213b4a3b7aa029 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129292]: 2026-03-10 06:34:24.56140131 +0000 UTC m=+0.057229824 container attach 30fdfdfe1820cd53f6decf41a930bd7c1994681b0f4a386a4e213b4a3b7aa029 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-10T06:34:24.676 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 10 06:34:24 vm06.local podman[129292]: 2026-03-10 06:34:24.512875159 +0000 UTC m=+0.008703674 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:34:24.705 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.3.service' 2026-03-10T06:34:24.734 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:24.734 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-10T06:34:24.734 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-10T06:34:24.734 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4 2026-03-10T06:34:25.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:24 vm06.local systemd[1]: Stopping Ceph osd.4 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:34:25.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:24 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:34:24.866+0000 7f7197c96640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:25.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:24 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:34:24.867+0000 7f7197c96640 -1 osd.4 81 *** Got signal Terminated *** 2026-03-10T06:34:25.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:24 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:34:24.867+0000 7f7197c96640 -1 osd.4 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:29.302 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:28 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:28.876+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 
2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:29.617 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:29 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4[111179]: 2026-03-10T06:34:29.302+0000 7f719429e640 -1 osd.4 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:06.085041+0000 front 2026-03-10T06:34:06.084894+0000 (oldest deadline 2026-03-10T06:34:28.384662+0000) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:29 vm06.local podman[129388]: 2026-03-10 06:34:29.910783742 +0000 UTC m=+5.055785392 container died fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:29 vm06.local podman[129388]: 2026-03-10 06:34:29.93122296 +0000 UTC m=+5.076224610 container remove fea6c31251ba9ff98c92a9d272ac7dc5581b7280aa562389863b6093ec4880d1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4, org.label-schema.vendor=CentOS, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:29 vm06.local bash[129388]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:30 vm06.local podman[129467]: 2026-03-10 06:34:30.0572467 +0000 UTC m=+0.015175807 container create 888b64c6e03625074f5283eb9b7109af5907ec353e32e1a8c0b548ce1a39eef2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:30 vm06.local podman[129467]: 2026-03-10 06:34:30.096149228 +0000 UTC m=+0.054078344 container init 
888b64c6e03625074f5283eb9b7109af5907ec353e32e1a8c0b548ce1a39eef2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:30 vm06.local podman[129467]: 2026-03-10 06:34:30.099119607 +0000 UTC m=+0.057048714 container start 888b64c6e03625074f5283eb9b7109af5907ec353e32e1a8c0b548ce1a39eef2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 10 06:34:30 
vm06.local podman[129467]: 2026-03-10 06:34:30.099926948 +0000 UTC m=+0.057856055 container attach 888b64c6e03625074f5283eb9b7109af5907ec353e32e1a8c0b548ce1a39eef2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-4-deactivate, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-10T06:34:30.117 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:29 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:29.854+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:30.248 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.4.service' 2026-03-10T06:34:30.277 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:30.278 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-10T06:34:30.278 INFO:tasks.cephadm.osd.5:Stopping osd.5... 
2026-03-10T06:34:30.278 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.5 2026-03-10T06:34:30.410 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:30 vm06.local systemd[1]: Stopping Ceph osd.5 for 9c59102a-1c48-11f1-b618-035af535377d... 2026-03-10T06:34:30.825 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:30 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:30.409+0000 7f9429309640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T06:34:30.825 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:30 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:30.409+0000 7f9429309640 -1 osd.5 81 *** Got signal Terminated *** 2026-03-10T06:34:30.825 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:30 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:30.409+0000 7f9429309640 -1 osd.5 81 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T06:34:31.117 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:30 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:30.824+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:32.117 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:31 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:31.835+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:33.117 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:32 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:32.865+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:34.367 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:33 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:33.895+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:35.117 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:34 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:34.861+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6806 osd.0 since back 2026-03-10T06:34:07.591859+0000 front 2026-03-10T06:34:07.592017+0000 (oldest deadline 2026-03-10T06:34:28.091578+0000) 2026-03-10T06:34:35.117 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:34 vm06.local ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5[115623]: 2026-03-10T06:34:34.861+0000 7f9425911640 -1 osd.5 81 heartbeat_check: no reply from 192.168.123.104:6814 osd.1 since back 2026-03-10T06:34:12.192280+0000 front 2026-03-10T06:34:12.192253+0000 (oldest deadline 2026-03-10T06:34:34.492001+0000) 2026-03-10T06:34:35.739 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129562]: 2026-03-10 06:34:35.435814586 +0000 UTC m=+5.037659206 container died 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, ceph=True, io.buildah.version=1.41.3, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T06:34:35.739 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129562]: 2026-03-10 06:34:35.462834232 +0000 UTC m=+5.064678852 container remove 15b85a82c8ae75f8e3980ed8b0ff7ec0dfc84d818c6b1b707b0a9e0e110936be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3) 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local bash[129562]: ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129629]: 2026-03-10 06:34:35.582627059 +0000 UTC m=+0.014021896 container create 
c5736762417a3d2dee28eb33d812e749c781f9bafb1499c6bc3e2ee5a5058a17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129629]: 2026-03-10 06:34:35.618521306 +0000 UTC m=+0.049916163 container init c5736762417a3d2dee28eb33d812e749c781f9bafb1499c6bc3e2ee5a5058a17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True) 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 
vm06.local podman[129629]: 2026-03-10 06:34:35.621321116 +0000 UTC m=+0.052715963 container start c5736762417a3d2dee28eb33d812e749c781f9bafb1499c6bc3e2ee5a5058a17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129629]: 2026-03-10 06:34:35.627646737 +0000 UTC m=+0.059041584 container attach c5736762417a3d2dee28eb33d812e749c781f9bafb1499c6bc3e2ee5a5058a17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129629]: 2026-03-10 06:34:35.576874592 +0000 UTC m=+0.008269449 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T06:34:35.740 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 10 06:34:35 vm06.local podman[129629]: 2026-03-10 06:34:35.739446392 +0000 UTC m=+0.170841229 container died c5736762417a3d2dee28eb33d812e749c781f9bafb1499c6bc3e2ee5a5058a17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-9c59102a-1c48-11f1-b618-035af535377d-osd-5-deactivate, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T06:34:35.765 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-9c59102a-1c48-11f1-b618-035af535377d@osd.5.service' 2026-03-10T06:34:35.794 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T06:34:35.794 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-10T06:34:35.794 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 9c59102a-1c48-11f1-b618-035af535377d --force --keep-logs 2026-03-10T06:34:35.889 INFO:teuthology.orchestra.run.vm04.stdout:Deleting cluster with 
fsid: 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:34:37.130 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm04.stderr:ceph-fuse[92037]: fuse finished with error 0 and tester_r 0 2026-03-10T06:34:47.306 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 9c59102a-1c48-11f1-b618-035af535377d --force --keep-logs 2026-03-10T06:34:47.401 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:34:52.050 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T06:34:52.075 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T06:34:52.099 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-10T06:34:52.099 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm04/crash 2026-03-10T06:34:52.099 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash -- . 2026-03-10T06:34:52.138 INFO:teuthology.orchestra.run.vm04.stderr:tar: /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash: Cannot open: No such file or directory 2026-03-10T06:34:52.138 INFO:teuthology.orchestra.run.vm04.stderr:tar: Error is not recoverable: exiting now 2026-03-10T06:34:52.139 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm06/crash 2026-03-10T06:34:52.139 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash -- . 
2026-03-10T06:34:52.161 INFO:teuthology.orchestra.run.vm06.stderr:tar: /var/lib/ceph/9c59102a-1c48-11f1-b618-035af535377d/crash: Cannot open: No such file or directory 2026-03-10T06:34:52.161 INFO:teuthology.orchestra.run.vm06.stderr:tar: Error is not recoverable: exiting now 2026-03-10T06:34:52.162 INFO:tasks.cephadm:Checking cluster log for badness... 2026-03-10T06:34:52.162 DEBUG:teuthology.orchestra.run.vm04:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-10T06:34:52.229 INFO:tasks.cephadm:Compressing logs... 
2026-03-10T06:34:52.229 DEBUG:teuthology.orchestra.run.vm04:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T06:34:52.230 DEBUG:teuthology.orchestra.run.vm06:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T06:34:52.253 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T06:34:52.253 INFO:teuthology.orchestra.run.vm06.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T06:34:52.254 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log 2026-03-10T06:34:52.254 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T06:34:52.254 INFO:teuthology.orchestra.run.vm04.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T06:34:52.255 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm06.log 2026-03-10T06:34:52.255 INFO:teuthology.orchestra.run.vm06.stderr: 92.8% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T06:34:52.256 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm04.log 2026-03-10T06:34:52.257 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm06.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm06.wwotdr.log 2026-03-10T06:34:52.257 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose 
-- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log 2026-03-10T06:34:52.257 INFO:teuthology.orchestra.run.vm06.stderr: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm06.log.gz 2026-03-10T06:34:52.258 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm06.log 2026-03-10T06:34:52.260 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm06.wwotdr.log: 94.2% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log.gz 2026-03-10T06:34:52.261 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log 2026-03-10T06:34:52.268 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm04.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm04.exdvdb.log 2026-03-10T06:34:52.268 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm06.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log 2026-03-10T06:34:52.269 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log: 87.6% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log.gz 2026-03-10T06:34:52.269 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log: 87.5% 91.8% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T06:34:52.270 INFO:teuthology.orchestra.run.vm04.stderr: -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.log.gz 2026-03-10T06:34:52.270 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log 2026-03-10T06:34:52.270 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm04.exdvdb.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log 2026-03-10T06:34:52.272 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log 2026-03-10T06:34:52.276 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log: 91.4% 89.2% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm06.wwotdr.log.gz 2026-03-10T06:34:52.276 INFO:teuthology.orchestra.run.vm06.stderr: -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log.gz 2026-03-10T06:34:52.277 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.3.log 2026-03-10T06:34:52.277 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log: 85.1% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log.gz 2026-03-10T06:34:52.278 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.4.log 2026-03-10T06:34:52.280 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log: 91.2% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.audit.log.gz 2026-03-10T06:34:52.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log 2026-03-10T06:34:52.283 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.5.log 2026-03-10T06:34:52.284 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log: 85.1% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph.cephadm.log.gz 2026-03-10T06:34:52.287 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.wzhqon.log 2026-03-10T06:34:52.288 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm04.log 2026-03-10T06:34:52.297 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.0.log 2026-03-10T06:34:52.297 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.afscws.log 2026-03-10T06:34:52.301 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm04.log: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-client.ceph-exporter.vm04.log.gz 2026-03-10T06:34:52.301 INFO:teuthology.orchestra.run.vm04.stderr: 94.2% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-volume.log.gz 2026-03-10T06:34:52.304 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.1.log 2026-03-10T06:34:52.305 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.wzhqon.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log 2026-03-10T06:34:52.314 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.0.log: gzip -5 
--verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.2.log 2026-03-10T06:34:52.319 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hdxbzv.log 2026-03-10T06:34:52.329 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hsrsig.log 2026-03-10T06:34:52.340 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hdxbzv.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log 2026-03-10T06:34:52.811 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.afscws.log: /var/log/ceph/ceph-client.1.log: 92.3% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm06.log.gz 2026-03-10T06:34:52.832 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hsrsig.log: /var/log/ceph/ceph-client.0.log: 89.2% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mgr.vm04.exdvdb.log.gz 2026-03-10T06:34:53.677 INFO:teuthology.orchestra.run.vm04.stderr: 90.6% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mon.vm04.log.gz 2026-03-10T06:34:58.695 INFO:teuthology.orchestra.run.vm06.stderr: 93.8% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.4.log.gz 2026-03-10T06:34:59.762 INFO:teuthology.orchestra.run.vm04.stderr: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.2.log.gz 2026-03-10T06:35:00.189 INFO:teuthology.orchestra.run.vm04.stderr: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.0.log.gz 2026-03-10T06:35:00.452 
INFO:teuthology.orchestra.run.vm06.stderr: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.5.log.gz 2026-03-10T06:35:00.846 INFO:teuthology.orchestra.run.vm04.stderr: 94.0% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.1.log.gz 2026-03-10T06:35:01.593 INFO:teuthology.orchestra.run.vm06.stderr: 93.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-osd.3.log.gz 2026-03-10T06:35:02.299 INFO:teuthology.orchestra.run.vm06.stderr: 94.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.afscws.log.gz 2026-03-10T06:35:02.989 INFO:teuthology.orchestra.run.vm06.stderr: 95.0% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm06.wzhqon.log.gz 2026-03-10T06:35:06.241 INFO:teuthology.orchestra.run.vm06.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping 2026-03-10T06:35:06.241 INFO:teuthology.orchestra.run.vm06.stderr: 93.6% -- replaced with /var/log/ceph/ceph-client.1.log.gz 2026-03-10T06:35:06.243 INFO:teuthology.orchestra.run.vm06.stderr: 2026-03-10T06:35:06.243 INFO:teuthology.orchestra.run.vm06.stderr:real 0m13.999s 2026-03-10T06:35:06.243 INFO:teuthology.orchestra.run.vm06.stderr:user 0m23.669s 2026-03-10T06:35:06.243 INFO:teuthology.orchestra.run.vm06.stderr:sys 0m1.024s 2026-03-10T06:35:07.915 INFO:teuthology.orchestra.run.vm04.stderr: 94.9% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hsrsig.log.gz 2026-03-10T06:35:09.154 INFO:teuthology.orchestra.run.vm04.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping 2026-03-10T06:35:09.283 INFO:teuthology.orchestra.run.vm04.stderr: 93.5% -- replaced with /var/log/ceph/ceph-client.0.log.gz 2026-03-10T06:36:03.084 INFO:teuthology.orchestra.run.vm04.stderr: 93.1% -- replaced with /var/log/ceph/9c59102a-1c48-11f1-b618-035af535377d/ceph-mds.cephfs.vm04.hdxbzv.log.gz 
2026-03-10T06:36:03.614 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-10T06:36:03.614 INFO:teuthology.orchestra.run.vm04.stderr:real 1m11.370s 2026-03-10T06:36:03.614 INFO:teuthology.orchestra.run.vm04.stderr:user 1m20.754s 2026-03-10T06:36:03.614 INFO:teuthology.orchestra.run.vm04.stderr:sys 0m5.172s 2026-03-10T06:36:03.614 INFO:tasks.cephadm:Archiving logs... 2026-03-10T06:36:03.614 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm04/log 2026-03-10T06:36:03.614 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T06:36:07.628 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm06/log 2026-03-10T06:36:07.628 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T06:36:08.763 INFO:tasks.cephadm:Removing cluster... 2026-03-10T06:36:08.764 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 9c59102a-1c48-11f1-b618-035af535377d --force 2026-03-10T06:36:08.935 INFO:teuthology.orchestra.run.vm04.stdout:Deleting cluster with fsid: 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:36:09.263 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 9c59102a-1c48-11f1-b618-035af535377d --force 2026-03-10T06:36:09.364 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: 9c59102a-1c48-11f1-b618-035af535377d 2026-03-10T06:36:09.619 INFO:tasks.cephadm:Removing cephadm ... 
2026-03-10T06:36:09.619 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T06:36:09.636 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T06:36:09.649 INFO:tasks.cephadm:Teardown complete 2026-03-10T06:36:09.649 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-10T06:36:09.651 ERROR:teuthology.contextutil:Saw exception from nested tasks Traceback (most recent call last): File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested yield vars File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task yield File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks suppress = manager.__exit__(*exc_info) File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__ next(self.gen) File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task mount.umount_wait() File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait run.wait([self.fuse_daemon], timeout) File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait check_time() File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__ raise MaxWhileTries(error_msg) teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds 2026-03-10T06:36:09.652 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 
2026-03-10T06:36:09.652 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T06:36:09.678 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T06:36:09.718 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-10T06:36:09.718 DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T06:36:09.718 DEBUG:teuthology.orchestra.run.vm04:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T06:36:09.718 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y remove $d || true 2026-03-10T06:36:09.718 DEBUG:teuthology.orchestra.run.vm04:> done 2026-03-10T06:36:09.722 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-10T06:36:09.722 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-10T06:36:09.722 DEBUG:teuthology.orchestra.run.vm06:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T06:36:09.722 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y remove $d || true 2026-03-10T06:36:09.722 DEBUG:teuthology.orchestra.run.vm06:> done 2026-03-10T06:36:09.952 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:09.952 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:09.952 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:09.952 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:09.952 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:09.953 
INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 31 M 2026-03-10T06:36:09.953 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:09.957 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:09.957 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:09.971 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-10T06:36:09.971 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:10.003 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-10T06:36:10.023 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.024 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.033 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.044 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.045 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 31 M 2026-03-10T06:36:10.046 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:10.047 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.050 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:10.050 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:10.066 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-10T06:36:10.066 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:10.100 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:10.112 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.112 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 
2026-03-10T06:36:10.120 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.121 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.129 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.144 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.163 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:10.213 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.213 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.261 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout:Remove 4 Packages 2026-03-10T06:36:10.359 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.360 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 166 M 2026-03-10T06:36:10.360 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:10.362 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:10.362 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:10.386 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-10T06:36:10.386 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:10.437 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:10.444 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-10T06:36:10.447 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-10T06:36:10.449 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:Remove 4 Packages 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.450 
INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-10T06:36:10.450 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 166 M 2026-03-10T06:36:10.451 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:10.453 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:10.453 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:10.465 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-10T06:36:10.478 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T06:36:10.479 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:10.528 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-10T06:36:10.528 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-10T06:36:10.528 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-10T06:36:10.528 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-10T06:36:10.529 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:10.536 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-10T06:36:10.538 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-10T06:36:10.541 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-10T06:36:10.557 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:10.574 
INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.574 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:10.617 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-10T06:36:10.618 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-10T06:36:10.618 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-10T06:36:10.618 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.661 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:10.781 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout:Remove 8 Packages 2026-03-10T06:36:10.782 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.783 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 89 
M 2026-03-10T06:36:10.783 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:10.786 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:10.786 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:10.807 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-10T06:36:10.807 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:10.846 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:10.848 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-10T06:36:10.861 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:10.861 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.861 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-10T06:36:10.861 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.861 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-10T06:36:10.862 
INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:Remove 8 Packages 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 89 M 2026-03-10T06:36:10.862 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:10.865 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:10.865 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 
2026-03-10T06:36:10.867 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.869 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T06:36:10.876 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T06:36:10.889 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T06:36:10.889 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T06:36:10.889 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-10T06:36:10.889 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:10.889 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:10.890 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T06:36:10.908 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T06:36:10.911 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-10T06:36:10.913 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-10T06:36:10.922 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-10T06:36:10.928 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:10.929 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 
2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:10.946 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T06:36:10.950 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:10.952 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T06:36:10.953 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T06:36:10.961 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T06:36:10.974 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:10.975 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:10.976 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T06:36:10.976 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T06:36:10.976 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:10.977 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T06:36:11.000 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T06:36:11.003 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T06:36:11.005 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T06:36:11.006 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T06:36:11.029 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T06:36:11.029 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T06:36:11.029 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T06:36:11.030 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T06:36:11.030 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T06:36:11.030 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.030 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T06:36:11.037 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T06:36:11.059 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T06:36:11.060 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.061 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T06:36:11.111 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T06:36:11.112 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:11.112 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T06:36:11.151 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.208 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:11.320 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-10T06:36:11.325 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T06:36:11.326 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:Remove 84 Packages
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 433 M
2026-03-10T06:36:11.327 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-10T06:36:11.350 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-10T06:36:11.350 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-10T06:36:11.424 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-10T06:36:11.429 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T06:36:11.430 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:Remove 84 Packages
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 433 M
2026-03-10T06:36:11.431 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T06:36:11.452 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-10T06:36:11.452 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-10T06:36:11.452 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T06:36:11.452 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T06:36:11.560 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T06:36:11.560 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:11.581 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:11.581 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84 2026-03-10T06:36:11.589 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84 2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 
2026-03-10T06:36:11.607 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:11.608 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.622 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.629 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84 2026-03-10T06:36:11.629 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84 2026-03-10T06:36:11.686 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84 2026-03-10T06:36:11.690 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:11.690 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84 2026-03-10T06:36:11.695 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84 2026-03-10T06:36:11.699 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84 2026-03-10T06:36:11.700 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84 2026-03-10T06:36:11.700 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:11.713 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 
2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target". 2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target". 2026-03-10T06:36:11.718 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:11.719 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.720 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84 2026-03-10T06:36:11.723 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84 2026-03-10T06:36:11.726 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84 2026-03-10T06:36:11.731 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84 2026-03-10T06:36:11.733 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:11.735 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84 2026-03-10T06:36:11.741 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84 2026-03-10T06:36:11.741 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84 2026-03-10T06:36:11.744 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84 2026-03-10T06:36:11.758 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84 2026-03-10T06:36:11.764 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84 2026-03-10T06:36:11.774 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84 2026-03-10T06:36:11.781 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : 
python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84 2026-03-10T06:36:11.799 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84 2026-03-10T06:36:11.809 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84 2026-03-10T06:36:11.811 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84 2026-03-10T06:36:11.813 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84 2026-03-10T06:36:11.813 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:11.818 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84 2026-03-10T06:36:11.820 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84 2026-03-10T06:36:11.825 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:11.829 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84 2026-03-10T06:36:11.832 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84 2026-03-10T06:36:11.835 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84 2026-03-10T06:36:11.837 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84 2026-03-10T06:36:11.837 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84 2026-03-10T06:36:11.838 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84 2026-03-10T06:36:11.843 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84 2026-03-10T06:36:11.845 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: 
ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84 2026-03-10T06:36:11.848 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84 2026-03-10T06:36:11.857 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84 2026-03-10T06:36:11.870 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84 2026-03-10T06:36:11.876 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84 2026-03-10T06:36:11.886 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84 2026-03-10T06:36:11.892 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84 2026-03-10T06:36:11.923 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84 2026-03-10T06:36:11.930 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84 2026-03-10T06:36:11.933 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84 2026-03-10T06:36:11.940 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-10T06:36:11.942 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84 2026-03-10T06:36:11.949 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84 2026-03-10T06:36:11.949 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84 2026-03-10T06:36:11.957 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84 2026-03-10T06:36:11.969 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-10T06:36:11.974 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : 
python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-10T06:36:11.981 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84 2026-03-10T06:36:11.985 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84 2026-03-10T06:36:11.988 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84 2026-03-10T06:36:11.990 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84 2026-03-10T06:36:11.993 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84 2026-03-10T06:36:11.996 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84 2026-03-10T06:36:11.998 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84 2026-03-10T06:36:12.002 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84 2026-03-10T06:36:12.015 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-10T06:36:12.023 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T06:36:12.028 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84 2026-03-10T06:36:12.057 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-10T06:36:12.076 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84 2026-03-10T06:36:12.084 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-10T06:36:12.088 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84 2026-03-10T06:36:12.089 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-10T06:36:12.090 
INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84 2026-03-10T06:36:12.094 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84 2026-03-10T06:36:12.096 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84 2026-03-10T06:36:12.096 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84 2026-03-10T06:36:12.099 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84 2026-03-10T06:36:12.101 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84 2026-03-10T06:36:12.104 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84 2026-03-10T06:36:12.107 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84 2026-03-10T06:36:12.111 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84 2026-03-10T06:36:12.114 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84 2026-03-10T06:36:12.116 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84 2026-03-10T06:36:12.119 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84 2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 
2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-10T06:36:12.120 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:12.121 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.131 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.142 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-10T06:36:12.148 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T06:36:12.150 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.150 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:12.150 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-10T06:36:12.150 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:12.150 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.153 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84 2026-03-10T06:36:12.159 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.161 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84 2026-03-10T06:36:12.164 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84 2026-03-10T06:36:12.166 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84 2026-03-10T06:36:12.169 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84 2026-03-10T06:36:12.171 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84 2026-03-10T06:36:12.174 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84 2026-03-10T06:36:12.177 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84 2026-03-10T06:36:12.180 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84 2026-03-10T06:36:12.188 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-10T06:36:12.192 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-10T06:36:12.195 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-10T06:36:12.197 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-10T06:36:12.200 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 
58/84 2026-03-10T06:36:12.204 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84 2026-03-10T06:36:12.206 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-10T06:36:12.210 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 2026-03-10T06:36:12.216 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-10T06:36:12.216 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84 2026-03-10T06:36:12.219 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84 2026-03-10T06:36:12.220 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-10T06:36:12.222 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84 2026-03-10T06:36:12.224 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84 2026-03-10T06:36:12.226 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84 2026-03-10T06:36:12.226 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T06:36:12.230 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T06:36:12.232 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T06:36:12.236 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T06:36:12.244 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T06:36:12.245 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.245 INFO:teuthology.orchestra.run.vm04.stdout:Glob 
pattern passed to enable, but globs are not supported for this. 2026-03-10T06:36:12.245 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T06:36:12.245 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-10T06:36:12.245 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-10T06:36:12.246 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:12.246 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.250 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T06:36:12.253 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T06:36:12.255 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84 2026-03-10T06:36:12.256 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T06:36:12.258 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84 2026-03-10T06:36:12.264 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84 2026-03-10T06:36:12.268 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T06:36:12.274 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.274 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T06:36:12.274 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T06:36:12.274 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:12.274 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.283 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84 2026-03-10T06:36:12.284 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84 2026-03-10T06:36:12.286 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84 2026-03-10T06:36:12.287 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.287 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-10T06:36:12.287 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:12.289 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84 2026-03-10T06:36:12.291 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84 2026-03-10T06:36:12.293 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.294 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84 2026-03-10T06:36:12.296 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84 2026-03-10T06:36:12.299 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84 2026-03-10T06:36:12.302 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84 2026-03-10T06:36:12.311 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-10T06:36:12.312 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.312 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T06:36:12.315 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-10T06:36:12.317 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-10T06:36:12.320 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-10T06:36:12.323 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84 2026-03-10T06:36:12.329 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-10T06:36:12.332 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 
2026-03-10T06:36:12.338 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-10T06:36:12.342 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-10T06:36:12.348 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T06:36:12.351 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T06:36:12.353 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T06:36:12.357 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T06:36:12.366 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T06:36:12.371 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T06:36:12.374 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T06:36:12.376 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T06:36:12.378 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84 2026-03-10T06:36:12.384 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84 2026-03-10T06:36:12.387 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T06:36:12.405 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.405 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-10T06:36:12.405 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:12.412 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.430 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T06:36:12.430 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T06:36:17.613 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T06:36:17.613 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys 2026-03-10T06:36:17.613 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp 2026-03-10T06:36:17.614 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:17.623 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T06:36:17.646 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T06:36:17.650 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84 2026-03-10T06:36:17.652 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T06:36:17.654 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T06:36:17.654 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : 
libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T06:36:17.668 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T06:36:17.671 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T06:36:17.674 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T06:36:17.677 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T06:36:17.677 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-10T06:36:17.776 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T06:36:17.777 
INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T06:36:17.777 
INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T06:36:17.777 
INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T06:36:17.777 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T06:36:17.778 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T06:36:17.885 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: 
ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: 
python3-babel-2.9.1-2.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T06:36:17.886 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T06:36:17.887 
INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T06:36:17.887 
INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T06:36:17.887 
INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:17.887 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-10T06:36:17.903 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:17.914 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T06:36:17.939 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T06:36:17.942 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84 2026-03-10T06:36:17.944 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T06:36:17.946 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T06:36:17.946 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T06:36:17.959 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: 
libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T06:36:17.962 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T06:36:17.964 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T06:36:17.966 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T06:36:17.966 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T06:36:18.058 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T06:36:18.059 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T06:36:18.059 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T06:36:18.060 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T06:36:18.060 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T06:36:18.060 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T06:36:18.060 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T06:36:18.083 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:18.083 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:18.083 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:18.083 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:18.083 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package 2026-03-10T06:36:18.084 
INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 200 k 2026-03-10T06:36:18.084 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:18.085 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:18.085 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:18.086 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-10T06:36:18.087 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:18.102 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:18.102 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: 
ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.129 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 
2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: 
python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: 
python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T06:36:18.130 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:18.131 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.131 
INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:18.205 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:18.243 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:18.325 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 200 k 2026-03-10T06:36:18.326 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 
2026-03-10T06:36:18.327 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:18.327 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:18.329 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T06:36:18.329 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:18.344 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:18.344 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.417 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T06:36:18.417 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:18.420 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:18.420 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:18.420 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:18.452 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:18.491 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:18.583 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr 2026-03-10T06:36:18.583 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:18.585 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-10T06:36:18.586 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:18.586 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:18.664 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T06:36:18.664 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:18.667 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:18.668 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:18.668 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:18.744 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T06:36:18.744 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:18.746 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:18.747 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:18.747 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:18.832 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr 2026-03-10T06:36:18.832 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:18.835 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:18.836 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:18.836 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:18.902 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T06:36:18.903 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:18.905 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:18.906 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:18.906 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 
2026-03-10T06:36:18.991 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T06:36:18.992 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:18.994 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:18.995 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:18.995 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:19.059 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-rook 2026-03-10T06:36:19.059 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:19.062 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:19.062 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:19.063 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:19.150 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T06:36:19.150 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:19.153 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:19.154 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:19.154 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:19.225 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T06:36:19.225 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:19.227 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:19.228 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:19.228 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 
2026-03-10T06:36:19.310 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-rook 2026-03-10T06:36:19.310 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:19.313 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:19.313 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:19.313 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:19.390 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:19.390 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.390 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:19.390 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.390 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.4 M 2026-03-10T06:36:19.391 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:19.392 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 
2026-03-10T06:36:19.392 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:19.405 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-10T06:36:19.405 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:19.434 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:19.448 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.469 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T06:36:19.470 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T06:36:19.472 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T06:36:19.473 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T06:36:19.473 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:19.505 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.545 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M 2026-03-10T06:36:19.643 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 2.4 M 2026-03-10T06:36:19.644 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:19.645 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:19.645 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:19.659 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T06:36:19.659 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:19.686 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:19.701 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages: 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 593 k 2026-03-10T06:36:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:19.725 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:19.725 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:19.735 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-10T06:36:19.735 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:19.760 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:19.762 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:19.762 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.775 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.802 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T06:36:19.835 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:19.836 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:19.872 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages: 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 593 k 2026-03-10T06:36:19.984 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:19.986 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:19.986 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:19.996 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-10T06:36:19.996 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:20.020 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:20.022 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:20.035 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:20.046 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages: 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Remove 3 Packages 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:20.047 
INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.5 M 2026-03-10T06:36:20.047 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-10T06:36:20.049 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-10T06:36:20.049 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-10T06:36:20.060 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-10T06:36:20.060 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-10T06:36:20.085 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-10T06:36:20.087 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3 2026-03-10T06:36:20.089 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3 2026-03-10T06:36:20.089 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.092 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:20.092 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:20.128 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 
2026-03-10T06:36:20.150 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.150 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3 2026-03-10T06:36:20.150 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-10T06:36:20.189 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:20.312 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages: 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Remove 3 Packages 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 2.5 M 2026-03-10T06:36:20.313 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T06:36:20.315 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T06:36:20.315 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T06:36:20.326 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-10T06:36:20.326 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T06:36:20.349 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: libcephfs-devel 2026-03-10T06:36:20.349 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-10T06:36:20.351 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T06:36:20.352 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-10T06:36:20.352 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-10T06:36:20.352 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-10T06:36:20.353 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3 2026-03-10T06:36:20.355 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3 2026-03-10T06:36:20.355 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.415 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.415 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3 2026-03-10T06:36:20.415 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.454 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T06:36:20.455 INFO:teuthology.orchestra.run.vm04.stdout: 
2026-03-10T06:36:20.455 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:20.525 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:20.526 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:20.526 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-10T06:36:20.526 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Remove 21 Packages
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 74 M
2026-03-10T06:36:20.527 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-10T06:36:20.530 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-10T06:36:20.530 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-10T06:36:20.551 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-10T06:36:20.551 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-10T06:36:20.593 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-10T06:36:20.595 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T06:36:20.597 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T06:36:20.600 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T06:36:20.600 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T06:36:20.612 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: libcephfs-devel
2026-03-10T06:36:20.613 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:20.613 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T06:36:20.615 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T06:36:20.615 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:20.616 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:20.616 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:20.617 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T06:36:20.619 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T06:36:20.621 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T06:36:20.621 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T06:36:20.634 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T06:36:20.634 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:20.634 INFO:teuthology.orchestra.run.vm06.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T06:36:20.634 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:20.646 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:20.648 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T06:36:20.650 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T06:36:20.652 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T06:36:20.654 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T06:36:20.657 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T06:36:20.660 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T06:36:20.663 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T06:36:20.666 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T06:36:20.668 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T06:36:20.671 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T06:36:20.684 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T06:36:20.740 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T06:36:20.784 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T06:36:20.786 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:Remove 21 Packages
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 74 M
2026-03-10T06:36:20.787 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:20.790 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T06:36:20.791 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T06:36:20.813 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T06:36:20.813 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T06:36:20.854 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T06:36:20.856 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T06:36:20.858 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T06:36:20.861 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T06:36:20.861 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T06:36:20.873 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T06:36:20.875 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T06:36:20.877 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T06:36:20.878 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T06:36:20.881 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T06:36:20.881 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T06:36:20.893 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T06:36:20.894 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:20.894 INFO:teuthology.orchestra.run.vm04.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T06:36:20.894 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:20.905 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:20.907 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T06:36:20.909 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T06:36:20.911 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T06:36:20.913 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T06:36:20.917 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T06:36:20.920 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T06:36:20.923 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T06:36:20.925 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T06:36:20.927 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T06:36:20.929 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T06:36:20.942 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T06:36:20.978 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: librbd1
2026-03-10T06:36:20.978 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:20.981 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:20.982 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:20.982 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T06:36:21.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T06:36:21.047 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.155 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rados
2026-03-10T06:36:21.155 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.158 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.158 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.159 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.238 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: librbd1
2026-03-10T06:36:21.238 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:21.241 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:21.242 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:21.242 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.321 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rgw
2026-03-10T06:36:21.322 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.324 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.325 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.325 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.405 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rados
2026-03-10T06:36:21.405 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:21.408 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:21.409 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:21.409 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.480 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-cephfs
2026-03-10T06:36:21.480 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.483 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.483 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.483 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.569 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rgw
2026-03-10T06:36:21.569 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:21.572 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:21.572 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:21.572 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.640 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rbd
2026-03-10T06:36:21.640 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.643 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.643 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.643 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.728 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-cephfs
2026-03-10T06:36:21.728 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:21.730 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:21.731 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:21.731 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.797 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-fuse
2026-03-10T06:36:21.798 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.800 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.801 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.801 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:21.885 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rbd
2026-03-10T06:36:21.885 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:21.888 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:21.888 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:21.888 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:21.953 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-mirror
2026-03-10T06:36:21.953 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:21.955 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:21.956 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:21.956 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:22.043 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-fuse
2026-03-10T06:36:22.043 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:22.045 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:22.046 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:22.046 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:22.108 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-nbd
2026-03-10T06:36:22.108 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-10T06:36:22.111 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-10T06:36:22.111 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-10T06:36:22.111 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-10T06:36:22.131 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all
2026-03-10T06:36:22.204 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-mirror
2026-03-10T06:36:22.204 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:22.207 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:22.207 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:22.207 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:22.251 INFO:teuthology.orchestra.run.vm06.stdout:56 files removed
2026-03-10T06:36:22.270 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T06:36:22.293 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean expire-cache
2026-03-10T06:36:22.360 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-nbd
2026-03-10T06:36:22.360 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T06:36:22.363 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T06:36:22.364 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T06:36:22.364 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T06:36:22.384 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-10T06:36:22.439 INFO:teuthology.orchestra.run.vm06.stdout:Cache was expired
2026-03-10T06:36:22.439 INFO:teuthology.orchestra.run.vm06.stdout:0 files removed
2026-03-10T06:36:22.457 DEBUG:teuthology.parallel:result is None
2026-03-10T06:36:22.494 INFO:teuthology.orchestra.run.vm04.stdout:56 files removed
2026-03-10T06:36:22.511 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T06:36:22.533 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean expire-cache
2026-03-10T06:36:22.676 INFO:teuthology.orchestra.run.vm04.stdout:Cache was expired
2026-03-10T06:36:22.676 INFO:teuthology.orchestra.run.vm04.stdout:0 files removed
2026-03-10T06:36:22.692 DEBUG:teuthology.parallel:result is None
2026-03-10T06:36:22.693 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm04.local
2026-03-10T06:36:22.693 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm06.local
2026-03-10T06:36:22.693 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T06:36:22.693 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T06:36:22.715 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T06:36:22.716 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T06:36:22.780 DEBUG:teuthology.parallel:result is None
2026-03-10T06:36:22.780 DEBUG:teuthology.parallel:result is None
2026-03-10T06:36:22.780 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-10T06:36:22.783 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-10T06:36:22.783 DEBUG:teuthology.orchestra.run.vm04:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T06:36:22.822 DEBUG:teuthology.orchestra.run.vm06:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T06:36:22.834 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-10T06:36:22.836 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:===============================================================================
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:^+ mail.gunnarhofmann.de         2   7   377   115   -110us[ -108us] +/-   18ms
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:^+ vps-fra1.orleans.ddnss.de     2   6   377    51   -112us[ -110us] +/-   12ms
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:^+ 47.ip-51-75-67.eu             4   6   377    52   -869us[ -867us] +/-   16ms
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm06.stdout:^* ntp2.rrze.uni-erlangen.de     1   6   377    51   +981us[ +983us] +/-   13ms
2026-03-10T06:36:22.840 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-10T06:36:22.841 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-10T06:36:22.841 INFO:teuthology.orchestra.run.vm04.stdout:^* ntp2.rrze.uni-erlangen.de     1   6   377    49   +953us[ +933us] +/-   13ms
2026-03-10T06:36:22.841 INFO:teuthology.orchestra.run.vm04.stdout:^+ mail.gunnarhofmann.de         2   7   377   116    -94us[ -106us] +/-   18ms
2026-03-10T06:36:22.841 INFO:teuthology.orchestra.run.vm04.stdout:^+ vps-fra1.orleans.ddnss.de     2   6   377    49   -160us[ -160us] +/-   12ms
2026-03-10T06:36:22.841 INFO:teuthology.orchestra.run.vm04.stdout:^+ 47.ip-51-75-67.eu             4   6   377    50   -892us[ -912us] +/-   16ms
2026-03-10T06:36:22.842 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-10T06:36:22.845 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-10T06:36:22.845 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-10T06:36:22.848 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-10T06:36:22.851 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-10T06:36:22.853 INFO:teuthology.task.internal:Duration was 1402.957412 seconds
2026-03-10T06:36:22.853 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-10T06:36:22.856 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-10T06:36:22.856 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T06:36:22.884 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T06:36:22.923 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T06:36:22.924 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T06:36:23.215 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-10T06:36:23.215 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm04.local
2026-03-10T06:36:23.215 DEBUG:teuthology.orchestra.run.vm04:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T06:36:23.280 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm06.local
2026-03-10T06:36:23.280 DEBUG:teuthology.orchestra.run.vm06:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T06:36:23.302 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-10T06:36:23.302 DEBUG:teuthology.orchestra.run.vm04:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T06:36:23.322 DEBUG:teuthology.orchestra.run.vm06:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T06:36:24.041 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-10T06:36:24.041 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T06:36:24.042 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T06:36:24.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T06:36:24.065 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T06:36:24.066 INFO:teuthology.orchestra.run.vm04.stderr:gzip/home/ubuntu/cephtest/archive/syslog/kern.log: -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T06:36:24.066 INFO:teuthology.orchestra.run.vm04.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T06:36:24.066 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T06:36:24.066 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T06:36:24.067 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T06:36:24.067 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T06:36:24.067 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T06:36:24.068 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T06:36:24.211 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T06:36:24.249 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.1% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T06:36:24.251 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-10T06:36:24.254 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-10T06:36:24.254 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T06:36:24.315 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T06:36:24.343 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-10T06:36:24.346 DEBUG:teuthology.orchestra.run.vm04:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T06:36:24.357 DEBUG:teuthology.orchestra.run.vm06:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T06:36:24.380 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = core
2026-03-10T06:36:24.411 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = core
2026-03-10T06:36:24.423 DEBUG:teuthology.orchestra.run.vm04:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T06:36:24.448 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:36:24.448 DEBUG:teuthology.orchestra.run.vm06:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T06:36:24.479 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:36:24.479 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-10T06:36:24.482 INFO:teuthology.task.internal:Transferring archived files...
2026-03-10T06:36:24.483 DEBUG:teuthology.misc:Transferring archived files from vm04:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm04
2026-03-10T06:36:24.483 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T06:36:24.521 DEBUG:teuthology.misc:Transferring archived files from vm06:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/926/remote/vm06
2026-03-10T06:36:24.521 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T06:36:24.549 INFO:teuthology.task.internal:Removing archive directory...
2026-03-10T06:36:24.549 DEBUG:teuthology.orchestra.run.vm04:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T06:36:24.561 DEBUG:teuthology.orchestra.run.vm06:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T06:36:24.603 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-10T06:36:24.606 INFO:teuthology.task.internal:Not uploading archives.
2026-03-10T06:36:24.606 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-10T06:36:24.609 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-10T06:36:24.610 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T06:36:24.619 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T06:36:24.633 INFO:teuthology.orchestra.run.vm04.stdout:  8532144    0 drwxr-xr-x   3 ubuntu   ubuntu         19 Mar 10 06:36 /home/ubuntu/cephtest
2026-03-10T06:36:24.633 INFO:teuthology.orchestra.run.vm04.stdout: 79867997    0 d---------   2 ubuntu   ubuntu          6 Mar 10 06:19 /home/ubuntu/cephtest/mnt.0
2026-03-10T06:36:24.634 INFO:teuthology.orchestra.run.vm04.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-10T06:36:24.634 INFO:teuthology.orchestra.run.vm04.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-10T06:36:24.645 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T06:36:24.645 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm04 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-10T06:36:24.645 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-10T06:36:24.648 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T06:36:24.649 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1402.957412481308
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false
2026-03-10T06:36:24.649 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T06:36:24.674 INFO:teuthology.run:FAIL